A coordinate frame is a set of orthogonal axes attached to a body, used to describe the position of points relative to that body.
In robotics, we use right-handed coordinate systems.
Robot (Canonical) Frame
According to REP 103, the standard for a frame attached to a body is:
- x forward, y left, and z up
In contrast, the Optical Frame uses:
- x right, y down, z forward
This is because an image is drawn starting from its top-left corner, with x increasing to the right and y increasing downward; z then points forward, out of the camera and into the scene.
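The fixed rotation between the two conventions can be written down directly. A minimal NumPy sketch (the matrix name is illustrative, not a ROS API):

```python
import numpy as np

# Rotation taking a vector expressed in the ROS body frame
# (x forward, y left, z up) into the optical frame (x right, y down, z forward).
R_OPTICAL_FROM_BODY = np.array([
    [0.0, -1.0,  0.0],   # optical x (right)   = -body y (left)
    [0.0,  0.0, -1.0],   # optical y (down)    = -body z (up)
    [1.0,  0.0,  0.0],   # optical z (forward) =  body x (forward)
])

p_body = np.array([1.0, 0.0, 0.0])        # one metre straight ahead of the robot
p_optical = R_OPTICAL_FROM_BODY @ p_body
print(p_optical)  # [0. 0. 1.] -> straight ahead along the optical z axis
```

Since this is a pure rotation, the matrix is orthogonal with determinant +1, so the inverse (optical to body) is just its transpose.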
Reference Frame Conventions:
In many robotics problems, the first step is to assign a coordinate frame to all objects of interest.
Why can't we just use a single coordinate frame (i.e. one global world frame) for everything?
Whether you like it or not, the car is equipped with a LiDAR, and obstacles are measured from the reference frame of the laser, NOT from the reference frame of the car.
Coordinate frames thus serve to describe the origin of a measurement. A coordinate on its own doesn't mean anything unless you also say which frame it is expressed in. The standard frames (REP 105) are:
base_link: rigidly attached to the mobile robot base
map: world-fixed frame
odom: Also a world-fixed frame. See Odometry
earth: designed to allow multiple robots to interact across different map frames
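The LiDAR point above can be sketched as a frame transform. A hypothetical mounting is assumed here (laser 0.5 m ahead of and 0.2 m above base_link, with no rotation), purely for illustration:

```python
import numpy as np

# Hypothetical mounting: the laser sits 0.5 m ahead of and 0.2 m above
# base_link, with no rotation between the two frames.
T_base_from_laser = np.eye(4)
T_base_from_laser[:3, 3] = [0.5, 0.0, 0.2]

# An obstacle the LiDAR reports 2 m straight ahead, in the laser frame.
p_laser = np.array([2.0, 0.0, 0.0, 1.0])  # homogeneous coordinates

# The same obstacle expressed in the car's base_link frame.
p_base = T_base_from_laser @ p_laser
print(p_base[:3])  # [2.5 0.  0.2]
```

The raw numbers (2, 0, 0) are meaningless until the transform tells us which frame they live in; in base_link the obstacle is actually 2.5 m ahead.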
Use odom for local sensing (like velocity calculations), and map for global localization estimates.
"The odom frame is useful as an accurate, short-term local reference, but drift makes it a poor frame for long-term reference."
"The map frame is useful as a long-term global reference, but discrete jumps in position estimators make it a poor reference frame for local sensing and acting." (REP 105)
The orientation of this coordinate frame is X-forward, Y-left, Z-up (REP 103 - Standard Units of Measure and Coordinate Conventions)
odom is the origin of the global axes; it is a fixed frame on the ground.
base_link is the local frame; it moves with the robot and is fixed to the robot's base.
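Chaining these frames, map → odom → base_link, is just matrix composition. A minimal planar sketch with hypothetical poses (the numbers are made up for illustration):

```python
import numpy as np

def transform(x, y, yaw):
    """Homogeneous transform for a planar pose (x, y, yaw)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0, x],
                     [s,  c, 0.0, y],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

# Hypothetical poses: the localizer says odom sits at (1, 2) in map,
# and odometry says base_link sits at (3, 0) in odom, rotated 90 degrees.
T_map_from_odom  = transform(1.0, 2.0, 0.0)
T_odom_from_base = transform(3.0, 0.0, np.pi / 2)

# Chaining map -> odom -> base_link gives the robot's pose in the map frame.
T_map_from_base = T_map_from_odom @ T_odom_from_base
print(T_map_from_base[:2, 3])  # [4. 2.]
```

In ROS this composition is exactly what tf does for you: the localizer publishes map → odom, the odometry source publishes odom → base_link, and tf chains them on demand.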
What coordinate system?
Most 3D libraries use right-handed coordinates (such as OpenGL, 3DS Max, etc.), while some use left-handed coordinates (such as Unity, Direct3D, etc.).
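A quick way to see the difference: in a right-handed frame the cross product of the x and y axes gives z, while in a left-handed frame it gives -z. A small NumPy check:

```python
import numpy as np

# Unit axes of a right-handed frame: x, y, z as the rows of the identity.
x, y, z = np.eye(3)
print(np.cross(x, y))  # [0. 0. 1.] -> x cross y equals z: right-handed

# Flipping one axis (here z, as left-handed conventions effectively do)
# changes the handedness: x cross y now points opposite to the new z.
z_left = -z
print(np.allclose(np.cross(x, y), -z_left))  # True
```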