Article title [English]
Creating a map of a greenhouse environment and determining on that map the positions of the pots, which are the main obstacles in agricultural environments and especially in greenhouses, is an essential step toward automating agricultural operations. In this research, a map of the greenhouse environment was extracted using stereo vision, and the pots in this map were detected and segmented. To this end, the ROS framework, together with its nodes and network connections, was used. To evaluate the designed algorithm, the error was calculated as the Euclidean distance between the estimated and actual locations of the pots. The results of this study showed that 100% of the pots were identified and positioned. The evaluation showed that the mean error in estimating the position of the pots was 0.056 and the root mean squared error (RMSE) was 0.0006. The maximum error in estimating the position of the pots was 0.137 m and the minimum error was 0.005 m. These results indicate that the designed algorithm estimates the position of the pots with high accuracy.
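The evaluation described above can be sketched as a short routine: for each pot, the Euclidean distance between its estimated and actual map coordinates is taken as the error, and the mean error and RMSE are aggregated over all pots. This is a minimal illustration only, assuming pots are represented as 2-D (x, y) coordinates in metres; the function and variable names are illustrative, not from the paper's code.

```python
import math

def euclidean(p, q):
    """Euclidean distance between two 2-D points (x, y)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def evaluate_positions(estimated, actual):
    """Mean error and RMSE of estimated pot positions against ground truth.

    estimated, actual: equal-length lists of (x, y) tuples, one per pot.
    """
    errors = [euclidean(e, a) for e, a in zip(estimated, actual)]
    mean_err = sum(errors) / len(errors)
    rmse = math.sqrt(sum(err * err for err in errors) / len(errors))
    return mean_err, rmse
```

For example, with one pot estimated 5 m from its true location and another estimated exactly, the routine returns a mean error of 2.5 m and an RMSE of about 3.54 m.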