Abstract: With the wide application of micro Unmanned Aerial Vehicles (UAVs) in civil fields such as aerial photography, mapping, environmental monitoring, and courier delivery, higher availability and reliability are demanded of micro UAVs. To enable a micro UAV to complete an accurate autonomous landing, we build on computer vision, which offers low cost, strong autonomy, and rich information. We present a relative localization algorithm that identifies and matches a multi-level nested marker, and describe the corresponding autonomous landing system for UAVs based on multi-level marker detection and orientation. Compared with other systems, the proposed marker encoding achieves a high recognition rate and a large coding space. The system can therefore support configurations with a single apron platform or multiple apron platforms simultaneously, at low cost and without additional airborne equipment. Finally, simulation and real flight tests show that the proposed algorithm achieves autonomous landing of UAVs.

Key words: Unmanned Aerial Vehicle (UAV), multi-level landing markers, camera pose estimation, autonomous precision landing, landing system
[19] 李高宇. 一种四旋翼无人机自动降落控制方法研究与设计[D]. 桂林:广西师范大学, 2017:28-39. LI G Y. Research and design of automatic landing method for four rotor UAV[D]. Guilin:Guangxi Normal University, 2017:28-39(in Chinese).
[20] SHREINER D. OpenGL编程指南[M]. 李军, 徐波, 译. 北京:机械工业出版社, 2010:19-422. SHREINER D. OpenGL programming guide[M]. LI J, XU B, translated. Beijing:China Machine Press, 2010:19-422(in Chinese). 郑高杰, 何小明, 李东坡, 谭慧俊, 汪昆, 吴祯龙, 王德鹏. 尾部推进无人机双90°偏折进气道/蜗壳耦合流动特性 [J]. 航空学报, 2024, 45(4): 128782-128782. 李伟, 郭艳, 李宁, 刘存涛, 袁昊. 智能反射面辅助无人机移动边缘计算任务数据最大化方法 [J]. 航空学报, 2023, 44(19): 328486-328486. 李辉, 龙腾, 孙景亮, 徐广通. 基于自适应视线法的无人机三维航迹跟踪方法 [J]. 航空学报, 2022, 43(9): 326105-326105. 高明, 余伟臣, 王杉杉, 王荣闯, 石健将. 太阳能无人机能源系统的多维耦合建模 [J]. 航空学报, 2021, 42(7): 224461-224461. 胡莘婷, 吴宇. 面向城市飞行安全的无人机离散型多路径规划方法 [J]. 航空学报, 2021, 42(6): 324383-324383. 张宏伟, 达新宇, 胡航, 倪磊, 潘钰. 基于多机协作的认知无人机网络能效联合优化 [J]. 航空学报, 2021, 42(6): 324548-324548. 王荣, 闫溟, 白鹏, 杨云军, 徐国武. 飞翼无人机平面外形气动隐身优化设计 [J]. 航空学报, 2017, 38(S1): 721532-721532. 马培蓓, 雷明, 纪军. 均等通信时滞下多UAV协同编队控制 [J]. 航空学报, 2017, 38(S1): 721551-721551. 张民, 田鹏飞, 陈欣. 一种无人机定距盘旋跟踪制导律及稳定性证明 [J]. 航空学报, 2016, 37(11): 3425-3434.