The design of an embedded vision system to enable visual servoing for a low-inertia nanosatellite capture arm is described. To minimise inertial effects on the satellite base, the arm utilises a novel joint design with remote actuation, where actuators are housed in the base and torque is transmitted via a cable-sheath mechanism. This allows for low-profile links but prevents the integration of standard angle encoders. Consequently, the arm’s kinematic state must be determined via external sensing.
We propose a vision-based kinematic estimation pipeline in which each link is instrumented with active LED markers detected by a base-mounted CMOS sensor. The pipeline, implemented on a high-performance microcontroller, utilises colour-based marker segmentation and efficient Perspective-n-Point (PnP) solvers to determine the transform of each link relative to the base. The system is verified through hardware-in-the-loop laboratory experiments, evaluating the feasibility of high-frequency arm pose reconstruction within the power and compute constraints of a nanosatellite platform. The results indicate that vision-based kinematic feedback is a viable substitute for joint encoders, enabling closed-loop control of low-inertia nanosatellite arms.
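To make the first stage of the pipeline concrete, the sketch below illustrates colour-based marker segmentation: pixels are thresholded against per-channel colour bounds, connected blobs are grouped, and each blob's centroid is reported as a candidate LED marker for the subsequent PnP stage. This is a minimal illustrative implementation in NumPy; the function name, the colour bounds, and the simple stack-based labelling are assumptions for exposition, not the flight code, which would use a more hardware-friendly connected-components pass.

```python
import numpy as np

def segment_markers(img, lo, hi):
    """Return centroids (row, col) of 4-connected blobs whose pixels
    fall inside the per-channel colour bounds [lo, hi]."""
    # Boolean mask of pixels matching the marker colour on all channels.
    mask = np.all((img >= lo) & (img <= hi), axis=-1)
    labels = np.zeros(mask.shape, dtype=int)
    centroids = []
    next_label = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue  # pixel already assigned to a blob
        next_label += 1
        labels[seed] = next_label
        stack, pixels = [seed], []
        # Stack-based flood fill collecting this blob's pixels.
        while stack:
            r, c = stack.pop()
            pixels.append((r, c))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                        and mask[rr, cc] and not labels[rr, cc]):
                    labels[rr, cc] = next_label
                    stack.append((rr, cc))
        # Sub-pixel marker position = mean of the blob's pixel coordinates.
        centroids.append(tuple(np.mean(pixels, axis=0)))
    return centroids

# Usage: a synthetic frame with two red LED blobs.
img = np.zeros((6, 8, 3), dtype=np.uint8)
img[1:3, 1:3] = (255, 30, 30)   # 2x2 blob, centroid (1.5, 1.5)
img[4, 6] = (250, 20, 20)       # single-pixel blob, centroid (4.0, 6.0)
cents = segment_markers(img, lo=(200, 0, 0), hi=(255, 60, 60))
```

The recovered centroids, paired with the known 3D positions of the LEDs on each link, form the 2D-3D correspondences consumed by the PnP solver.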