Drone-based 3D Reconstruction of Plants in Field Conditions Using Neural Radiance Fields (NeRFs)
Abstract
The performance of 3D reconstruction using Neural Radiance Fields (NeRFs) for outdoor plant phenotyping is strongly influenced by the imaging modality used for data collection. We compare drone, handheld, and 360° ground-robot datasets collected over soybean and mungbean plots, and evaluate reconstruction quality using the 2D image-quality metrics PSNR, SSIM, and LPIPS and the 3D geometric metrics precision, recall, and F1 score. Drone imagery produced the highest geometric fidelity, handheld captures achieved the strongest 2D appearance quality, and the 360° captures lagged in both categories due to spherical distortion and motion artifacts. The consistency of the drone-based reconstructions highlights the suitability of aerial imaging for field-scale 3D modeling and positions it as a practical foundation for future phenotyping applications.
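The 3D geometric metrics above follow the standard point-cloud formulation: precision is the fraction of reconstructed points within a distance threshold τ of a reference cloud, recall is the fraction of reference points within τ of the reconstruction, and F1 is their harmonic mean. The sketch below illustrates this with Open3D; the threshold, voxel size, and file names are illustrative assumptions, not values from this project.

```python
# Minimal sketch of the 3D geometric metrics (precision / recall / F1 at threshold tau).
# Assumptions (not from the project): Open3D point clouds, a reference cloud in the same
# metric units as the reconstruction, and hypothetical file names.
import numpy as np
import open3d as o3d

def geometric_f1(recon_path, reference_path, tau=0.01, voxel=0.005):
    """Precision, recall, and F1 between a reconstruction and a reference point cloud."""
    recon = o3d.io.read_point_cloud(recon_path).voxel_down_sample(voxel)
    ref = o3d.io.read_point_cloud(reference_path).voxel_down_sample(voxel)

    # Nearest-neighbor distance from each reconstructed point to the reference, and vice versa.
    d_recon_to_ref = np.asarray(recon.compute_point_cloud_distance(ref))
    d_ref_to_recon = np.asarray(ref.compute_point_cloud_distance(recon))

    precision = float(np.mean(d_recon_to_ref < tau))  # accuracy of the reconstructed points
    recall = float(np.mean(d_ref_to_recon < tau))      # completeness w.r.t. the reference
    f1 = 0.0 if precision + recall == 0 else 2 * precision * recall / (precision + recall)
    return precision, recall, f1

if __name__ == "__main__":
    # Hypothetical file names for illustration only.
    p, r, f1 = geometric_f1("drone_soybean.ply", "reference_soybean.ply", tau=0.01)
    print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f}")
```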
Sensors and Data Acquisition
Data were collected using multiple sensing platforms to capture complementary viewpoints and spatial scales for field-scale 3D reconstruction.
| Platform | Sensor | Resolution | #Images / Frames | Viewpoint Geometry | Capture Distance |
|---|---|---|---|---|---|
| Drone | DJI Inspire 2 + Zenmuse X5S (45 mm lens) | 4K | ~300 images | Circular orbit, oblique | ~6 m |
| Handheld | iPhone 16 Pro | 12 MP | ~100 images | Multi-angle, close-range | ~0.5–1 m |
| 360° Robot | Insta360 X4 on TerraSentia robot | 5.7K video | ~300 frames | Circular orbit, equirectangular (360°) | ~1 m |
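Because the 360° robot capture is video rather than still images, frames must be sampled before reconstruction. The sketch below shows one way to do this with OpenCV; the frame-count target, sampling stride, and file names are assumptions for illustration, and the reprojection of equirectangular frames to perspective views that most NeRF pipelines require is omitted here.

```python
# Hedged sketch: sample roughly evenly spaced frames from the 360-degree robot video.
# Assumptions (not stated in the table): OpenCV for decoding, a target of ~300 frames,
# and hypothetical file/directory names.
import cv2
from pathlib import Path

def sample_frames(video_path, out_dir, target_frames=300):
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    stride = max(1, total // target_frames)  # keep every stride-th frame
    idx = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % stride == 0:
            cv2.imwrite(str(out / f"frame_{saved:04d}.jpg"), frame)
            saved += 1
        idx += 1
    cap.release()
    return saved

if __name__ == "__main__":
    n = sample_frames("mungbean_insta360.mp4", "frames/mungbean_360")
    print(f"saved {n} frames")
```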
3D Reconstruction Results
Representative 3D point cloud reconstructions generated from aerial drone imagery and ground-level 360° video for soybean and mungbean field plots.
Soybean 3D point cloud reconstructed from a DJI drone capture.
Mungbean 3D point cloud reconstructed from a DJI drone capture.
Soybean field reconstruction using 360° ground-level video.
Mungbean field reconstruction using 360° ground-level video.
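Reconstructions from different platforms live in different coordinate frames, so any cross-platform comparison of these point clouds first requires alignment. Below is a minimal Open3D ICP sketch assuming the clouds are already at roughly the same scale; the correspondence distance and file names are illustrative assumptions, not values used in this project.

```python
# Hedged sketch: align a ground-level 360-degree reconstruction to the drone reconstruction
# of the same plot with point-to-point ICP before any geometric comparison.
# Assumptions: Open3D, clouds at roughly matching scale, hypothetical file names.
import numpy as np
import open3d as o3d

def align_icp(source_path, target_path, max_corr_dist=0.05):
    source = o3d.io.read_point_cloud(source_path)
    target = o3d.io.read_point_cloud(target_path)
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint(),
    )
    source.transform(result.transformation)  # bring the 360-degree cloud into the drone frame
    return source, result.transformation

if __name__ == "__main__":
    aligned, T = align_icp("soybean_360.ply", "soybean_drone.ply")
    o3d.io.write_point_cloud("soybean_360_aligned.ply", aligned)
```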
Handheld Data Capture and Reconstructions
Handheld imagery was collected at close range to capture fine-scale plant structure and provide dense visual coverage for individual plant reconstructions.
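Like the other captures, the handheld image sets need camera poses from structure-from-motion before NeRF training. The following is a hedged sketch using pycolmap as the SfM backend; the toolchain and directory names are assumptions, as the project's actual pose-estimation pipeline is not described here.

```python
# Hedged sketch: recover camera poses for the close-range handheld images with SfM.
# Assumptions (not stated above): pycolmap as the backend, hypothetical directory names.
from pathlib import Path
import pycolmap

def run_sfm(image_dir, work_dir):
    work = Path(work_dir)
    work.mkdir(parents=True, exist_ok=True)
    database = work / "database.db"
    sparse = work / "sparse"
    sparse.mkdir(exist_ok=True)

    pycolmap.extract_features(str(database), str(image_dir))  # per-image feature extraction
    pycolmap.match_exhaustive(str(database))                  # pairwise feature matching
    # Incremental mapping returns a dict of reconstructed models with camera poses.
    return pycolmap.incremental_mapping(str(database), str(image_dir), str(sparse))

if __name__ == "__main__":
    models = run_sfm("images/handheld_soybean", "sfm/handheld_soybean")
    for idx, rec in models.items():
        print(idx, rec.summary())
```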