Proceedings of the Southwest State University

Algorithm for automatic counting of fish in an image and tracking their movement based on the YOLOv9t neural model

https://doi.org/10.21869/2223-1560-2025-29-4-187-203

Abstract

Purpose of research. Traditional methods based on visual observation and manual counting not only have obvious limitations in terms of time and human resource costs but also yield insufficiently accurate results due to the subjective human factor involved in the process. These inaccuracies, even minor ones, can lead to erroneous management decisions, which negatively impact production efficiency in aquaculture. 

Methods. To eliminate these shortcomings, this paper presents an automated solution that utilizes the YOLOv9t neural network model for detecting and counting fish in underwater images. Thanks to its optimized architecture, which contains only 2 million parameters, the YOLOv9t model demonstrated high performance in identifying fish in images from the DeepFish dataset, with the following evaluation metrics: Precision 0.928, Recall 0.91, mAP50 0.961, and mAP50-95 0.584. The Non-Maximum Suppression method was used to eliminate duplicate detections of fish in the same area, while the application of the DeepSORT algorithm enabled the continuous tracking of each individual across video frame sequences by assigning unique identifiers.
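The page itself contains no code for the pipeline it describes. As an illustration of the Non-Maximum Suppression step that discards duplicate detections of the same fish before counting, a minimal pure-Python sketch might look as follows; the box coordinates, confidence scores, and the 0.45 IoU threshold are illustrative assumptions, not values taken from the paper:

```python
def iou(a, b):
    """Intersection-over-Union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_threshold=0.45):
    """Keep the highest-scoring box from each cluster of overlapping detections."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        # Drop remaining boxes that overlap the kept one too strongly.
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep

# Two overlapping detections of the same fish plus one distinct fish:
boxes = [(10, 10, 50, 40), (12, 11, 52, 42), (100, 80, 140, 110)]
scores = [0.92, 0.85, 0.88]
kept = nms(boxes, scores)
print(len(kept))  # prints 2: the duplicate detection is suppressed
```

The per-frame count is then simply the number of kept boxes; in the full system described above, DeepSORT would additionally associate these boxes across frames to avoid counting the same fish twice over time.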

Results. The research results confirmed that the YOLOv9t neural model is suitable for creating automated video analytics systems in aquaculture for monitoring fish behavior and managing activation devices. This enables the transition of key control processes to a fully automated basis, thereby optimizing resource utilization. The proposed architecture provided high accuracy and reliability across various environmental conditions, from clear to murky water, opening prospects for application in real-world production environments.

Conclusion. This operational stability makes the system ready for industrial-scale implementation with the aim of enhancing farm management efficiency.

About the Author

V. N. Le
St. Petersburg Federal Research Center of the Russian Academy of Sciences
Russian Federation

Le Van Nghia, Post-Graduate Student

14th Line V.O., 39, St. Petersburg 199178


Competing Interests:

The author declares the absence of obvious and potential conflicts of interest related to the publication of this article.



References

1. Vo T. T. E., Ko H., Huh J. H., Kim Y. Overview of smart aquaculture system: Focusing on applications of machine learning and computer vision. Electronics. 2021; 10(22): 2882.

2. Ji Y., et al. Design and realization of a novel hybrid-drive robotic fish for aquaculture water quality monitoring. Journal of Bionic Engineering. 2023; 20(2): 543-557.

3. Kruusmaa M., et al. Salmon behavioural response to robots in an aquaculture sea cage. Royal Society Open Science. 2020; 7(3): 191220.

4. Rastegari H., et al. Internet of Things in aquaculture: A review of the challenges and potential solutions based on current and future trends. Smart Agricultural Technology. 2023; 4: 100187.

5. Mandal A., Ghosh A. R. Role of artificial intelligence (AI) in fish growth and health status monitoring: A review on sustainable aquaculture. Aquaculture International. 2024; 32(3): 2791-2820.

6. Le V. N., Ronzhin A. L. A Review of Intelligent Control Systems and Robotics Applications in Aquaculture Production. Morskiye intellektual'nyye tekhnologii = Marine intelligent technologies. 2024; 63: 171–180. (In Russ.).

7. Chiu M.-C., et al. Development of smart aquaculture farm management system using IoT and AI-based surrogate models. Journal of Agriculture and Food Research. 2022; 9: 100357.

8. Ronzhin A. L., Le V. N., Shuvalov N. Optimization of the process map of admissible system-technical solutions for the problem of video analytics in aquaculture. Vestnik Yuzhno-Ural'skogo gosudarstvennogo universiteta = Bulletin of the South Ural State University. 2024; 16: 50-58. (In Russ.).

9. Yang L., et al. Computer vision models in intelligent aquaculture with emphasis on fish detection and behavior analysis: a review. Archives of Computational Methods in Engineering. 2021; 28(4).

10. Quaade S., et al. Remote sensing and computer vision for marine aquaculture. Science Advances. 2024; 10(42).

11. Le V. N., Tran V. T., Ronzhin A. Fish image classification based on MobileNetV2 with transfer learning technique for robotic application in aquaculture. In: International Conference on Interactive Collaborative Robotics. 2024. P. 201-212.

12. Tran T., Duong B., Vu Q., Le V., Glibko O., Ronzhin A. L. Methods and Technical Means of Nonintrusive Assessment of Fish Biomass and Robotic Maintenance of Cage Aquaculture. In: International Conference on Agriculture Digitalization and Organic Production. 2024. P. 207-215.

13. Terven J., Córdova-Esparza D.-M., Romero-González J.-A. A comprehensive review of YOLO architectures in computer vision: From YOLOv1 to YOLOv8 and YOLO-NAS. Machine Learning and Knowledge Extraction. 2023; 5(4): 1680-1716.

14. Zhang Z., et al. A method for counting fish based on improved YOLOv8. Aquacultural Engineering. 2024; 107: 102450.

15. Yu H., et al. An automatic detection and counting method for fish lateral line scales of underwater fish based on improved YOLOv5. IEEE, 2023. P. 143616-143627.

16. Le V. N., Ronzhin A. L. Methods and technical means of positioning and navigation of robots in the aquatic environment. Izvestiya Kabardino-Balkarskogo nauchnogo tsentra RAN = Bulletin of the Kabardino-Balkarian Scientific Center of the Russian Academy of Sciences. 2023; 6(116): 167–178 (In Russ.).

17. Wu J. Introduction to convolutional neural networks. National Key Lab for Novel Software Technology. 2017. P. 495.

18. Khanam R., Hussain M. A Review of YOLOv12: Attention-Based Enhancements vs. Previous Versions. arXiv preprint arXiv:2504.11995. 2025.

19. Shorten C., Khoshgoftaar T. M. A survey on image data augmentation for deep learning. Journal of Big Data. 2019; 6(1): 1-48.

20. Zhuang Z., et al. Understanding AdamW through proximal methods and scale-freeness. arXiv preprint arXiv:2202.00089. 2022.

21. Prechelt L. Early stopping - but when? In: Neural Networks: Tricks of the Trade. 2022. P. 55-69.

22. Hosang J., Benenson R., Schiele B. Learning non-maximum suppression. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2017.

23. Wojke N., Bewley A., Paulus D. Simple online and realtime tracking with a deep association metric. In: 2017 IEEE International Conference on Image Processing (ICIP). 2017.

24. Wang C.-Y., Yeh I-H., Liao H.-Y. M. YOLOv9: Learning what you want to learn using programmable gradient information. In: European Conference on Computer Vision (ECCV). 2024.

25. Saleh A., et al. A realistic fish-habitat dataset to evaluate algorithms for underwater visual analysis. Scientific Reports. 2020: 14671.


For citations:


Le V.N. Algorithm for automatic counting of fish in an image and tracking their movement based on the YOLOv9t neural model. Proceedings of the Southwest State University. 2025;29(4):187-203. (In Russ.) https://doi.org/10.21869/2223-1560-2025-29-4-187-203


This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2223-1560 (Print)
ISSN 2686-6757 (Online)