
An On-Orbit Calibration Method for Event Cameras Based on Spacecraft Line Features

ON-ORBIT CALIBRATION OF EVENT CAMERAS USING GEOMETRIC LINES OF SPACECRAFT

  • Abstract: To address the on-orbit calibration problem faced by event cameras in space missions such as space situational awareness, on-orbit servicing, and visual navigation, this paper proposes a calibration method based on line features that fully exploits the inherent structural characteristics of spacecraft. Unlike traditional calibration methods that require dedicated calibration patterns, the proposed method removes the dependence on specific calibration boards or pre-deployed calibration infrastructure, making it practical and feasible in real on-orbit deployment scenarios where such auxiliary equipment is unavailable or impractical. Taking full advantage of the imaging characteristics of event cameras, we first design a robust line extraction algorithm for asynchronous, sparse event streams; even under the challenging illumination conditions and dynamic motion scenarios common in the space environment, the algorithm can extract 2D line features directly from spacecraft event streams. Then, based on the known 3D model of the spacecraft, geometric projection constraints between the observed 2D lines and the 3D model lines are established, and the Direct Linear Transformation is used to quickly solve for initial values of the camera intrinsic and extrinsic parameters, providing a robust starting point for the subsequent optimization. To further improve accuracy, a cost function measuring the Euclidean distance from valid events to the reprojected lines is constructed, and a nonlinear optimization algorithm is used to jointly refine the camera parameters, including the focal length, principal point, distortion coefficients, and extrinsic parameters. Simulation and ground-based event-camera calibration experiments show that the proposed method achieves higher accuracy than existing calibration methods in every calibration parameter, verifying its effectiveness and robustness in complex dynamic environments. This work provides an efficient, flexible, and environment-adaptive solution for on-orbit calibration of event cameras, and supplies reliable calibration parameters for subsequent vision applications such as pose measurement and satellite tracking.
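The DLT initialization described above can be illustrated with a minimal sketch under simplifying assumptions: each spacecraft model line is represented by two 3D endpoints, each extracted image line by homogeneous coefficients l = (a, b, c), and every correspondence contributes two linear constraints l^T P X = 0 on the 3x4 projection matrix P. The function names and data layout below are illustrative, not taken from the paper, and lens distortion is ignored at this stage.

```python
import numpy as np
from scipy.linalg import rq

def dlt_from_line_correspondences(lines_2d, endpoints_3d):
    """Estimate the 3x4 projection matrix P from 2D/3D line correspondences.

    lines_2d     : (N, 3) homogeneous coefficients (a, b, c) of the extracted image lines
    endpoints_3d : (N, 2, 3) two 3D endpoints of the matching spacecraft model lines

    The projection of any point on a 3D model line must lie on the observed 2D line,
    so each endpoint X yields one linear equation l^T P X = 0 in the 12 entries of P.
    At least 6 non-degenerate line correspondences (12 equations) are required.
    """
    rows = []
    for l, (X1, X2) in zip(lines_2d, endpoints_3d):
        for X in (X1, X2):
            Xh = np.append(X, 1.0)           # homogeneous 3D endpoint
            rows.append(np.kron(l, Xh))      # coefficients of row-major vec(P)
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)              # null-space solution, defined up to scale

def decompose_projection(P):
    """Recover K, R, t from P = K [R | t] via RQ decomposition of the left 3x3 block."""
    K, R = rq(P[:, :3])
    S = np.diag(np.sign(np.diag(K)))         # force a positive diagonal on K
    K, R = K @ S, S @ R
    t = np.linalg.solve(K, P[:, 3])
    if np.linalg.det(R) < 0:                 # the overall sign of the DLT solution is ambiguous
        R, t = -R, -t
    return K / K[2, 2], R, t
```

The K, R, t recovered this way serve only as the starting point handed to the joint nonlinear refinement; distortion is left entirely to that later stage.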

     

Abstract: To address the critical challenge of on-orbit calibration for event cameras in demanding space applications, such as space situational awareness, on-orbit servicing, and visual navigation, this paper proposes a line-based calibration method that leverages the inherent structural features of spacecraft. Unlike conventional calibration approaches that necessitate specialized calibration patterns or artificial targets, our method eliminates the dependence on specific calibration boards or pre-deployed calibration infrastructure, making it practical for real-world on-orbit deployment scenarios where such auxiliary equipment is unavailable or impractical. The proposed framework exploits the unique characteristics of event cameras by first developing a robust line extraction algorithm specifically designed to handle the asynchronous and sparse nature of event streams. This algorithm can directly extract 2D lines from spacecraft event streams with high reliability, even under the challenging lighting conditions and dynamic motion scenarios commonly encountered in space environments. Subsequently, given the known 3D geometric model of the spacecraft, we establish rigorous geometric projection constraints between the extracted 2D observation lines and their corresponding 3D model lines. Initial estimates of both the intrinsic and extrinsic camera parameters are then efficiently derived using the Direct Linear Transformation method, providing a robust foundation for subsequent optimization. To further improve calibration precision, we formulate a nonlinear optimization objective function that minimizes the Euclidean distance between valid events and their reprojected lines. This enables joint refinement of the camera parameters, including focal length, principal point, distortion coefficients, and extrinsic parameters, using nonlinear optimization algorithms. Comprehensive calibration experiments, encompassing both simulation studies and ground-based testing, demonstrate that the proposed method achieves higher accuracy in critical parameters than state-of-the-art approaches, validating its effectiveness and robustness under complex dynamic conditions. This work provides an efficient, flexible, and environment-adaptive solution for on-orbit event camera calibration, delivering reliable calibration parameters for subsequent pose estimation, satellite tracking, and other vision-based space applications.
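As a rough illustration of the refinement stage, the sketch below parameterizes the camera by focal lengths, principal point, and an axis-angle pose (distortion coefficients are omitted for brevity) and minimizes the point-to-line distance between valid events and the reprojected model lines with a robust nonlinear least-squares solver. The variable names, the event-to-line assignment, and the reduced parameter set are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reproject_lines(params, model_lines):
    """Project the 3D model lines with the current parameters; return homogeneous 2D line coefficients."""
    fx, fy, cx, cy = params[:4]
    K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])
    R = Rotation.from_rotvec(params[4:7]).as_matrix()
    P = K @ np.hstack([R, params[7:10].reshape(3, 1)])
    lines = []
    for X1, X2 in model_lines:                   # each model line given by two 3D endpoints
        x1 = P @ np.append(X1, 1.0)
        x2 = P @ np.append(X2, 1.0)
        lines.append(np.cross(x1, x2))           # image line through the two projected endpoints
    return np.asarray(lines)

def event_to_line_residuals(params, events, line_ids, model_lines):
    """Euclidean distance from each valid event to the reprojected line it was assigned to."""
    lines = reproject_lines(params, model_lines)[line_ids]   # (M, 3) line per event
    pix = np.hstack([events, np.ones((len(events), 1))])     # homogeneous pixel coordinates
    return np.einsum('ij,ij->i', lines, pix) / np.linalg.norm(lines[:, :2], axis=1)

# x0 comes from the DLT initialization; a robust loss down-weights noisy events:
# result = least_squares(event_to_line_residuals, x0,
#                        args=(events, line_ids, model_lines), loss='huber')
```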

     
