Agricultural Engineering (Moscow)

Automated sugar beet seedling detection and mapping using a UAV-borne RGB camera and deep learning techniques

https://doi.org/10.26897/2687-1149-2025-6-4-16

Abstract

Accurate and timely assessment of plant stand density is crucial for modern crop production, directly impacting sugar beet yield and profitability. This study aims to develop and validate a highly accurate automated method for counting sugar beet seedlings using unmanned aerial vehicles (UAVs) and deep learning algorithms, optimizing both precision and processing speed. Field experiments were conducted in 2025 on commercial sugar beet fields in the Buzdyak district of the Republic of Bashkortostan. A DJI Phantom 4 Pro UAV equipped with an RGB camera captured aerial imagery from a 20-meter altitude. Initial vegetation segmentation employed the Excess Green (ExG) index, followed by binarization and morphological filtering. The YOLOv8n and YOLOv5m deep learning architectures, trained on a manually annotated dataset of aerial images, were then applied for seedling detection and classification. Algorithm performance was rigorously evaluated against manual seedling counts on control plots. The YOLOv8n model demonstrated superior performance (Precision: 0.80; Recall: 0.70; AP50: 0.75; R²: 0.99), achieving a minimum relative error of 1.11% and a root mean squared error (RMSE) of 3.0. While YOLOv5m exhibited comparable correlation (R²: 0.98), it displayed lower recall and precision. The developed algorithm enables the generation of spatial distribution maps of seedlings, which can be readily integrated into precision agriculture systems. This technology reduces labor costs for seedling counting by orders of magnitude compared to manual methods, while also eliminating subjective errors. The obtained results demonstrate the feasibility of industrial implementation, enabling rapid crop condition assessment, informed replanting decisions, and targeted site-specific agro-technological interventions. Future research will focus on expanding the algorithm to incorporate simultaneous weed mapping and adapting it for use with other crops.
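The pre-detection stage described in the abstract (ExG-based vegetation segmentation, binarization, and morphological filtering) can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the ExG threshold, the minimum blob area, and the use of a connected-component area filter in place of the unspecified morphological operations are all assumptions.

```python
import numpy as np

def exg_segment(rgb, exg_threshold=0.1, min_area=4):
    """Segment vegetation via the Excess Green index: ExG = 2g - r - b
    on channel-normalized values, fixed-threshold binarization, then a
    connected-component area filter standing in for morphological
    noise removal. Threshold and min_area are illustrative values."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0                       # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    mask = (2.0 * g - r - b) > exg_threshold      # ExG binarization

    # Drop small blobs (noise) with a simple 4-connected flood fill.
    seen = np.zeros(mask.shape, dtype=bool)
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                stack, blob = [(i, j)], []
                seen[i, j] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(blob) < min_area:          # noise blob: erase it
                    for y, x in blob:
                        mask[y, x] = False
    return mask

# Synthetic 12x12 image: one 3x3 green "seedling" patch plus one
# isolated green noise pixel on a gray background.
img = np.full((12, 12, 3), 100, dtype=np.uint8)
img[2:5, 2:5] = (40, 180, 40)                     # seedling-like patch
img[9, 9] = (40, 180, 40)                         # noise pixel
mask = exg_segment(img)
```

On this toy input the 3x3 patch survives segmentation while the single-pixel blob is filtered out; in the published pipeline the resulting mask would then feed the YOLO detection stage.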

About the Authors

S. G. Mudarisov
Bashkir State Agrarian University
Russian Federation

Salavat G. Mudarisov, DSc, Professor

St. 50 years of October, 34, Ufa, 450001, Republic of Bashkortostan



I. R. Miftakhov
Bashkir State Agrarian University
Russian Federation

Ilnur R. Miftakhov, CSc

St. 50 years of October, 34, Ufa, 450001, Republic of Bashkortostan



I. M. Farkhutdinov
Bashkir State Agrarian University
Russian Federation

Ildar M. Farkhutdinov, DSc, Associate Professor

St. 50 years of October, 34, Ufa, 450001, Republic of Bashkortostan



References

1. Alt V.V., Pestunov I.A., Melnikov P.V., Elkin O.V. Automated detection of weeds and evaluation of crop sprouts quality based on RGB images. Siberian Herald of Agricultural Science. 2018;48(5):52-60. (In Russ.) https://doi.org/10.26898/0370-8799-2018-5-7

2. Bastaubaeva N.L., Bekbatyrov M.B., Tabynbaeva L.K., Burakhoja A.M. Formation of a programmed sugar beet harvest by influencing the basic elements of regulation. Sakhar. 2021;10:32-38. (In Russ.)

3. Vasilenko V.V., Vasilenko S.V. Inversion of seeds and its effect on the quality of single-grain sowing. Vestnik of Voronezh State Agrarian University. 2019;12(2):102-108. (In Russ.) EDN: WMRPRL

4. Barreto A.A., Lottes P., Yamati F.R.I. et al. Automatic UAV-based counting of seedlings in sugar beet fields and extension to maize and strawberry. Computers and Electronics in Agriculture. 2021;191:106493. https://doi.org/10.1016/j.compag.2021.106493

5. Goltyapin V.Ya., Golubev I.G. Areas and experience of using unmanned aerial vehicles for surveying agricultural lands. Aktualnye voprosy razvitiya agrarnogo sektora ekonomiki baykalskogo regiona: Proceedings of the All-Russian (National) scientific and practical conference dedicated to the Day of Russian Science, Ulan-Ude, February 04-10, 2021. Ulan-Ude: Buryat State Agricultural Academy named after V.R. Filippov, 2021. Pp. 81-85.

6. Zhang S., Yang Y., Tu L. et al. Comparison of YOLO-based sorghum spike identification detection models and monitoring at the flowering stage. Plant Methods. 2025;21(20). https://doi.org/10.1186/s13007-025-01338-z

7. Casado-García A., Heras J., Milella A., Marani R. Semi-supervised deep learning and low-cost cameras for the semantic segmentation of natural images in viticulture. Precision Agriculture. 2022;23(6):2001-2026. https://doi.org/10.1007/s11119-022-09929-9

8. Mudarisov S.G., Miftakhov I.R. Deep learning methods and UAV technologies for crop disease detection. Agricultural Machinery and Technologies. 2024;18(4):24-33. (In Russ.) https://doi.org/10.22314/2073-7599-2024-18-4-24-33

9. Lottes P., Behley J., Milioto A., Stachniss C. Fully convolutional networks with sequential information for robust crop and weed detection in precision farming. IEEE Robotics and Automation Letters. 2018;3(4):2870-2877. https://doi.org/10.48550/arXiv.1806.03412

10. Logeshwaran J., Srivastava D., Kumar K.S. et al. Improving crop production using an agro-deep learning framework in precision agriculture. BMC Bioinformatics. 2024;25:341. https://doi.org/10.1186/s12859-024-05970-9

11. Oh S., Chang A., Ashapure A. et al. Plant counting of cotton from UAS imagery using deep learning-based object detection framework. Remote Sensing. 2020;12(18):2981. https://doi.org/10.3390/rs12182981

12. Minniakhmetov I.S., Murzabulatov B.S., Shafeeva E.I., Lykasov O.N. Development of agriculture in the Buzdyaksky district of the Republic of Bashkortostan. Vestnik Bashkir State Agrarian University. 2021;1:27-34. (In Russ.)

13. Bushnev A.S., Orekhov G.I., Kotlyarova I.A. et al. Efficiency of technological methods of cultivation of sunflower maternal line. Agricultural Science Euro-North-East. 2025;26(1):115-128. (In Russ.) https://doi.org/10.30766/2072-9081.2025.26.1.115-128

14. Sycheva I.V., Sychev S.M., Osipov A.A. Evaluation of disease prevalence on sugar beet hybrids. Vestnik Bryanskoy GSKhA. 2024;2:31-36. (In Russ.)

15. Han X., Wang H., Yuan T. et al. A rapid segmentation method for weed based on CDM and ExG index. Crop Protection. 2023;172:106321. https://doi.org/10.1016/j.cropro.2023.106321

16. Vaghefi S.A., Ibrahim M.F., Mohd M.H. et al. Optimized weed image classification via parallel convolutional neural networks integrating an excess green index channel. International Journal of Electrical and Computer Engineering Systems. 2025;16(3):205-216. https://doi.org/10.32985/ijeces.16.3.2

17. Ueno T., Nagano Sh., Moriyuki Sh. et al. Optimized excess-green image binarization for accurate estimation of lettuce seedling leaf-area in a plant factory. Environmental Control in Biology. 2022;60(3):153-159. https://doi.org/10.2525/ecb.60.153

18. Sukhobok Yu.A., Ten Ye.E., Ponomarchuk Yu.V., Shoberg K.A. Railway gap detection based on image processing and deep learning techniques. Aktualnye teoretiko-metodologicheskie i prikladnye problemy virtualnoy realnosti i iskusstvennogo intellekta: Proceedings of the International Scientific Conference. 2021. Khabarovsk: Far Eastern State University of Railway Transport. Pp. 56-63.

19. Kutyrev A.I. Convolutional neural network for segmentation of apple blossoms in images. Agricultural Science Euro-North-East. 2024;25(5):949-961. (In Russ.) https://doi.org/10.30766/2072-9081.2024.25.5.949-961

20. Ronkin M.V., Akimova E.N., Misilov V.E., Reshetnikov K.I. Review on application of deep neural networks and parallel architectures for rock fragmentation problems. Bulletin of the South Ural State University. Series: Computational Mathematics and Software Engineering. 2023;12(4):5-54. (In Russ.) https://doi.org/10.14529/cmse230401


For citations:


Mudarisov S.G., Miftakhov I.R., Farkhutdinov I.M. Automated sugar beet seedling detection and mapping using a UAV-borne RGB camera and deep learning techniques. Agricultural Engineering (Moscow). 2025;27(6):4-16. (In Russ.) https://doi.org/10.26897/2687-1149-2025-6-4-16


This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2687-1149 (Print)
ISSN 2687-1130 (Online)