Comparison of deep learning-based object detection methods for automatic plant phenology recognition: A case study of Rhododendron hypoglaucum in Shennongjia
Jia Yuan, Zhang Lin, Song Chuangye, Zhao Changming, Guo Xiao, Zhu Xiaoguang, Wu Dongxiu
2026, 50 (Special Issue: Ecological Statistical Methods)
doi: 10.17521/cjpe.2025.0334
Abstract
Plant phenology is a key indicator of how ecosystems respond to global climate change. Although automated imaging technologies can generate vast amounts of time-series phenological data, accurate automated recognition of discrete phenological periods remains a significant methodological challenge, limiting the scalability of phenological monitoring. In this study, we focused on Rhododendron hypoglaucum, one of the dominant species in the evergreen broad-leaved mixed forests of Shennongjia, China. Using 4624 time-lapse images captured automatically and continuously from 2022 to 2025, we implemented and evaluated three representative object detection algorithms (Faster R-CNN, YOLOv11, and RT-DETR) to develop automated recognition models for key phenophases. The performance of the models was compared to identify the most effective algorithm for establishing a high-frequency, long-term phenology recognition method. All three algorithms successfully detected six phenological traits: floral buds, leaf buds, flowers, new leaves, fruits, and senescent leaves. The YOLOv11 model performed best, achieving a precision of 0.785, a recall of 0.745, an mAP50 of 0.788, and an mAP50-95 of 0.501. Based on the results of the optimal model, the durations of the phenological periods of R. hypoglaucum (floral bud growth, leaf bud growth, flowering, new leaf growth, fruit growth, and leaf senescence) were automatically determined and showed a high level of consistency with manual visual interpretation. This study demonstrates that deep learning-based object detection can effectively automate the extraction of phenological traits and quantitative information from long-term, in-situ observations. It provides a novel approach for high-frequency, precise, and automated phenological recognition at the individual-plant level.
In future work, multi-target collaborative monitoring and further model optimization are expected to enhance the applicability and robustness of this method.
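The post-processing step described above, deriving phenophase durations from per-image detections, can be sketched as follows. This is a minimal illustration, not the authors' code: the data format (a mapping from capture date to the set of trait labels the detector reported) and the rule that a phenophase spans from the first to the last image in which its trait appears are assumptions for the example.

```python
from datetime import date

def phenophase_durations(detections):
    """Derive phenophase spans from per-image detection results.

    detections: {capture_date: set of detected trait labels}
    Returns {trait: (first_date, last_date, duration_in_days)}.
    Assumption: a phenophase runs from the first to the last image
    in which the corresponding trait was detected.
    """
    spans = {}
    for day, traits in sorted(detections.items()):
        for trait in traits:
            first, last = spans.get(trait, (day, day))
            spans[trait] = (min(first, day), max(last, day))
    return {t: (a, b, (b - a).days + 1) for t, (a, b) in spans.items()}

# Hypothetical detector output for four capture dates
obs = {
    date(2024, 4, 1): {"floral_bud"},
    date(2024, 4, 20): {"floral_bud", "flower"},
    date(2024, 5, 10): {"flower", "new_leaf"},
    date(2024, 5, 25): {"new_leaf"},
}
print(phenophase_durations(obs)["flower"])
```

In practice the detector's per-image labels would first be smoothed (e.g. ignoring isolated single-image detections) before spans are computed, since false positives in one frame would otherwise extend a phenophase.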