Neural 3D Reconstruction and Immersive VR Visualization of Row Crops Across Phenological Growth Stages

Iowa State University
Smart Agriculture Technology
Pipeline

Real capture → NeRF / GSplat reconstruction → Unreal Engine visualization.

Abstract

Plant phenotyping in precision agriculture increasingly requires high-fidelity three-dimensional reconstruction and accessible visualization methods. This study presents an integrated pipeline combining Neural Radiance Fields (NeRF), 3D Gaussian Splatting (G-Splat), and Virtual Reality (VR) visualization for comprehensive plant analysis across developmental stages. We collected multi-view imagery of finger millet, proso millet, mungbean, and field pea under controlled greenhouse conditions, aligning data acquisition with standardized BBCH phenological scales. Camera pose estimation was performed using GLOMAP, followed by reconstruction via both Nerfacto and G-Splat implementations. Quantitative evaluation using PSNR, SSIM, and LPIPS metrics revealed complementary strengths of the two approaches: G-Splat achieved superior structural fidelity, while NeRF provided enhanced perceptual realism. Both reconstruction methods were successfully integrated into an immersive VR greenhouse environment deployed on Meta Quest headsets, maintaining consistently high framerates. This framework establishes a practical foundation for incorporating neural reconstruction and immersive technologies into agricultural phenotyping workflows, supporting both research applications and educational engagement.

3D Reconstruction Across Growth Stages

NeRF-based 3D point cloud reconstructions capturing the full growth cycle of finger millet, proso millet, mungbean, and field pea, from early development to maturity. Each model represents a distinct growth stage, providing a consistent and detailed 3D view of plant structural and morphological evolution over time.

COLMAP to Unreal Engine Pose Conversion for Quantitative Evaluation

Reconstruction quality is assessed with PSNR, SSIM, and LPIPS against the original input images. To render evaluation views from matching viewpoints, camera poses are transformed from COLMAP to Unreal Engine (UE): the coordinate axes are remapped to Unreal's left-handed, Z-up convention, followed by a change-of-basis transformation and a scaling adjustment for the difference in units (COLMAP's reconstruction units versus UE centimeters). The resulting poses place cameras accurately in Unreal Engine.
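The conversion steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact implementation: COLMAP stores a world-to-camera rotation as a quaternion `(qw, qx, qy, qz)` plus a translation, so the camera center must first be recovered before remapping axes; the particular permutation matrix and the metres-to-centimeters scale below are assumptions that may need adjusting per scene.

```python
import numpy as np

def qvec_to_rotmat(q):
    """COLMAP quaternion (qw, qx, qy, qz) -> 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*y*y - 2*z*z, 2*x*y - 2*w*z,     2*x*z + 2*w*y],
        [2*x*y + 2*w*z,     1 - 2*x*x - 2*z*z, 2*y*z - 2*w*x],
        [2*x*z - 2*w*y,     2*y*z + 2*w*x,     1 - 2*x*x - 2*y*y],
    ])

def colmap_to_ue(qvec, tvec, scale=100.0):
    """Convert one COLMAP world-to-camera pose to a UE camera position.

    `scale` converts COLMAP units (here assumed metres) to UE centimeters.
    The axis permutation is one common choice for UE's left-handed frame
    (X forward, Y right, Z up) and is an assumption, not a fixed standard.
    """
    R = qvec_to_rotmat(qvec)
    t = np.asarray(tvec, dtype=float)
    # Camera center in COLMAP world coordinates: C = -R^T t
    C = -R.T @ t
    # Change of basis: COLMAP (x right, y down, z forward) -> UE (x fwd, y right, z up)
    M = np.array([[0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0],
                  [0.0, -1.0, 0.0]])
    return scale * (M @ C)
```

The camera orientation would be converted analogously by conjugating the rotation with the same change-of-basis matrix before extracting UE's pitch/yaw/roll.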

Qualitative Evaluation of NeRF and G-Splat Reconstructions

We compare reconstruction results from NeRF and Gaussian Splatting on both training (seen) and novel (unseen) viewpoints. This highlights each method’s ability to preserve fine details, maintain geometric structure, and generalize beyond observed views.
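For concreteness, the PSNR term of this comparison can be computed directly from image arrays; a minimal NumPy sketch follows (SSIM and LPIPS are typically taken from scikit-image's `structural_similarity` and the `lpips` package, respectively, and are omitted here for brevity). The function name and `data_range` default are illustrative choices, not the paper's code.

```python
import numpy as np

def psnr(reference, rendered, data_range=1.0):
    """Peak signal-to-noise ratio (dB) between a ground-truth photograph
    and a rendered view; higher is better. Assumes images are float arrays
    of identical shape with values in [0, data_range]."""
    mse = np.mean((reference.astype(np.float64) -
                   rendered.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)
```

Evaluating on held-out (novel) viewpoints rather than training views is what distinguishes generalization quality from simple memorization of the input images.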

Trained (Seen) Viewpoints

Trained viewpoints comparison for NeRF and G-Splat

Unseen (Novel) Viewpoints

Unseen viewpoints comparison for NeRF and G-Splat

Virtual Greenhouse Walkthrough

The integrated virtual greenhouse platform combines optimized visualization, interactive features, and immersive exploration to enhance both data interpretability and user engagement. By balancing the high-fidelity detail of NeRF with the efficiency of Gaussian Splatting, the system enables effective visualization of plant structure and growth for research and analysis.

The video below presents a walkthrough of the Finger Millet reconstruction within the virtual environment.

BibTeX

@article{JOSHI2026102024,
title = {Neural 3D Reconstruction and Immersive VR Visualization of Row Crops Across Phenological Growth Stages},
journal = {Smart Agricultural Technology},
pages = {102024},
year = {2026},
issn = {2772-3755},
doi = {10.1016/j.atech.2026.102024},
url = {https://www.sciencedirect.com/science/article/pii/S2772375526002479},
author = {Shambhavi Joshi and Juan Di Salvo and Yanben Shen and Mozhgan Hadadi and Venkata Naresh Boddepalli and Zaki Jubery and Soumik Sarkar and Arti Singh and Baskar Ganapathysubramanian and Asheesh K. Singh and Adarsh Krishnamurthy},
}