# Using Vision or Motion Capture Systems for Position Estimation

Both VIO and MoCap determine a vehicle's pose (position and attitude) from "visual" information. The main difference between them is the frame perspective:

• VIO uses onboard sensors to get pose data from the vehicle's perspective (see egomotion).
• MoCap uses an offboard camera system to get vehicle pose data in 3D space (i.e. it is an external system that tells the vehicle its pose).

Pose data from either type of system can be used to update a PX4-based autopilot's local position estimate (relative to the local origin), and can optionally also be fused into the vehicle attitude estimation. Additionally, if the external pose system provides linear velocity measurements, these can be used to improve the state estimate (fusion of linear velocity measurements is only supported by EKF2).

PX4 uses the following MAVLink messages for getting external position information, and maps them to the following uORB topics:

| MAVLink message | uORB topic |
| --- | --- |
| VISION_POSITION_ESTIMATE | vehicle_visual_odometry |
| ODOMETRY (frame_id = MAV_FRAME_LOCAL_FRD) | vehicle_visual_odometry |
| ATT_POS_MOCAP | vehicle_mocap_odometry |
| ODOMETRY (frame_id = MAV_FRAME_MOCAP_NED) | vehicle_mocap_odometry |

EKF2 only subscribes to the vehicle_visual_odometry topic, so it can only process the first two messages (a MoCap system must generate these messages to work with EKF2). The ODOMETRY message is the only one that can also send linear velocities to PX4. The LPE estimator subscribes to both topics, and can hence process all the messages above.

PX4 uses the EKF2 estimator by default. It is better tested and supported than LPE, and is therefore recommended.

## Reference Frames

PX4 uses FRD (X Forward, Y Right, Z Down) for the local body frame as well as for the reference frame. When using the magnetometer heading, the PX4 reference frame x axis is aligned with north, and the frame is therefore called NED (X North, Y East, Z Down). In most cases the heading of the PX4 estimator's reference frame and that of the external pose estimate will not match. The reference frame of the external pose estimate is therefore given a different name: MAV_FRAME_LOCAL_FRD.

Depending on the source of your reference frame, you will need to apply a custom transformation to the pose estimate before sending the MAVLink Vision/MoCap message. This is necessary to change the orientation of the parent and child frame of the pose estimate, such that it fits the PX4 convention. Have a look at the MAVROS odom plugin for the necessary transformations.

ROS users can find more detailed instructions below in Reference Frames and ROS.

Let x_{mocap}, y_{mocap}, z_{mocap} be the position reported by the MoCap system, and x_{mav}, y_{mav}, z_{mav} the position we will send over MAVLink; then:

x_{mav} = x_{mocap}
y_{mav} = z_{mocap}
z_{mav} = - y_{mocap}
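As a minimal sketch, the mapping above can be written as follows (the function name is illustrative, and the mapping assumes a MoCap system that reports positions with the y axis pointing up):

```python
def mocap_to_mav(x_mocap, y_mocap, z_mocap):
    """Map a MoCap position (y-up convention assumed) to the
    coordinates that will be sent over MAVLink."""
    return (x_mocap, z_mocap, -y_mocap)

print(mocap_to_mav(1.0, 2.0, 3.0))  # (1.0, 3.0, -2.0)
```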


## EKF2 Tuning/Configuration

| Parameter | Setting |
| --- | --- |
| EKF2_AID_MASK | Set vision position fusion, vision velocity fusion, vision yaw fusion and external vision rotation according to your desired fusion model. |
| EKF2_HGT_MODE | Set to Vision to use vision as the primary source for altitude estimation. |
| EKF2_EV_DELAY | Set to the difference between the measurement timestamp and the "actual" capture time. For more information see below. |
| EKF2_EV_POS_X, EKF2_EV_POS_Y, EKF2_EV_POS_Z | Set the position of the vision sensor (or MoCap markers) with respect to the robot's body frame. |
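As an illustration, the parameters above could be set from the MAVLink shell. The bitmask and mode values below are assumptions based on common PX4 releases; check the parameter reference for your firmware version before using them.

```sh
# Fuse vision position (bit 3 = 8) and vision yaw (bit 4 = 16): 8 + 16 = 24
param set EKF2_AID_MASK 24
# Use vision as the primary height source (3 = Vision)
param set EKF2_HGT_MODE 3
# Vision measurement delay in milliseconds (tune for your system, see below)
param set EKF2_EV_DELAY 50.0
```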

#### Tuning EKF2_EV_DELAY

EKF2_EV_DELAY is the delay of the vision position estimate relative to the IMU measurements.

## LPE Tuning/Configuration

### Enabling External Pose Input

| Parameter | Setting |
| --- | --- |
| LPE_FUSION | Vision integration is enabled if the fuse vision position bit is checked (it is enabled by default). |
| ATT_EXT_HDG_M | Set to 1 or 2 to enable external heading integration. Setting it to 1 enables vision, while 2 enables use of the MoCap heading. |

## Working with ROS

ROS is not required for supplying external pose information, but it is strongly recommended as it already integrates well with VIO and MoCap systems. PX4 must already be set up as described above.

### Getting Pose Data Into ROS

VIO and MoCap systems have different ways of obtaining pose data, and each has its own setup and topics.

The setup for specific systems is covered below. For other systems, consult the vendor's setup documentation.

### Relaying Pose Data to PX4

MAVROS has plugins to relay a visual estimation from a VIO or MoCap system using the following pipelines:

| ROS topic | MAVLink message | uORB topic |
| --- | --- | --- |
| /mavros/vision_pose/pose | VISION_POSITION_ESTIMATE | vehicle_visual_odometry |
| /mavros/odometry/odom | ODOMETRY (frame_id = MAV_FRAME_LOCAL_FRD) | vehicle_visual_odometry |
| /mavros/mocap/pose | ATT_POS_MOCAP | vehicle_mocap_odometry |
| /mavros/odometry/odom | ODOMETRY (frame_id = MAV_FRAME_MOCAP_NED) | vehicle_mocap_odometry |

• MoCap ROS topics of type geometry_msgs/PoseStamped or geometry_msgs/PoseWithCovarianceStamped must be remapped to /mavros/vision_pose/pose. The geometry_msgs/PoseStamped topic is the most common, as MoCap usually does not have covariances associated with the data.
• If you get data through a nav_msgs/Odometry ROS message, you will need to remap it to /mavros/odometry/odom.
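For example, a topic_tools relay in the launch file can forward an external pose topic to the one the MAVROS vision_pose plugin subscribes to (the source topic name /my_vio/pose is a placeholder for your system's topic):

```xml
<!-- Forward the external pose topic to MAVROS's vision pose input.
     /my_vio/pose is an illustrative source topic name. -->
<node pkg="topic_tools" type="relay" name="vision_pose_relay"
      args="/my_vio/pose /mavros/vision_pose/pose" />
```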

### Reference Frames and ROS

The local/world and global frames used by ROS and PX4 are different.

Both frames are shown in the image below (FLU on left/FRD on right).

With EKF2, when using external heading estimation, magnetic north can either be ignored, or the heading offset to magnetic north can be calculated and compensated. Depending on your choice, the yaw angle is given with respect to either magnetic north or local x.

When creating the rigid body in the MoCap software, remember to first align the robot's local x axis with the world x axis, otherwise the yaw estimate will have an offset. This can stop the external pose estimate fusion from working properly. The yaw angle should be zero when the body and reference frames are aligned.

The MAVROS odometry plugin makes it easy to handle the coordinate frames. It uses ROS's tf package. Your external pose system might have a completely different frame convention that does not match the one of PX4. The body frame of the external pose estimate can depend on how you set the body frame in the MOCAP software or on how you mount the VIO sensor on the drone. The MAVROS odometry plugin needs to know how the external pose's child frame is oriented with respect to either the airframe's FRD or FLU body frame known by MAVROS. You therefore have to add the external pose's body frame to the tf tree. This can be done by including an adapted version of the following line into your ROS launch file.

```xml
<node pkg="tf" type="static_transform_publisher" name="tf_baseLink_externalPoseChildFrame"
      args="0 0 0 <yaw> <pitch> <roll> base_link <external_pose_child_frame> 1000"/>
```


Make sure that you change the values of yaw, pitch and roll so that the external pose's body frame is properly attached to the base_link or base_link_frd frame. Have a look at the tf package for further help on how to specify the transformation between frames. You can use rviz to check whether you attached the frame correctly. The name of the external_pose_child_frame has to match the child_frame_id of your nav_msgs/Odometry message. The same applies for the reference frame of the external pose: you have to attach it as a child to either the odom or odom_frd frame. Therefore, adapt the following line accordingly.

```xml
<node pkg="tf" type="static_transform_publisher" name="tf_odom_externalPoseParentFrame"
      args="0 0 0 <yaw> <pitch> <roll> odom <external_pose_parent_frame> 1000"/>
```


If the reference frame has its z axis pointing upwards, you can attach it without any rotation (yaw=0, pitch=0, roll=0) to the odom frame. The name of the external_pose_parent_frame has to match the frame_id of the odometry message.
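For instance, with a z-up reference frame whose frame_id is mocap (an illustrative name), the line reduces to:

```xml
<node pkg="tf" type="static_transform_publisher" name="tf_odom_externalPoseParentFrame"
      args="0 0 0 0 0 0 odom mocap 1000"/>
```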

When using the MAVROS odom plugin, it is important that no other node is publishing a transform between the external pose's reference and child frame. This might break the tf tree.

## Specific System Setups

### OptiTrack MoCap

The following steps explain how to feed position estimates from an OptiTrack system to PX4. It is assumed that the MoCap system is calibrated. See this video for a tutorial on the calibration process.

#### Getting Pose Data Into ROS

• Install vrpn_client_ros.
• You can get each rigid body's pose on an individual topic by running:
```bash
roslaunch vrpn_client_ros sample.launch server:=<mocap machine ip>
```


If you named the rigid body robot1, you will get a topic like /vrpn_client_node/robot1/pose.

#### Remapping/Relaying Pose Data

MAVROS provides a plugin to relay pose data published on /mavros/vision_pose/pose to PX4. Assuming that MAVROS is running, you just need to remap the pose topic that you get from MoCap /vrpn_client_node/<rigid_body_name>/pose directly to /mavros/vision_pose/pose. Note that there is also a mocap topic that MAVROS provides to feed ATT_POS_MOCAP to PX4, but it is not applicable for EKF2. However, it is applicable with LPE.

Remapping pose topics is covered above in Relaying Pose Data to PX4 (/vrpn_client_node/<rigid_body_name>/pose is of type geometry_msgs/PoseStamped).

Assuming that you have configured the EKF2 parameters as described above, PX4 is now set up and fusing MoCap data.

You are now set to proceed to the first flight.

## First Flight

After setting up one of the (specific) systems described above, you should now be ready to test. The instructions below show how to do so for MoCap and VIO systems.

### Check external estimate

Be sure to perform the following checks before your first flight:

• Set the PX4 parameter MAV_ODOM_LP to 1. PX4 will therefore stream back the received external pose as MAVLink ODOMETRY messages.
• It is recommended to check these MAVLink messages with e.g. the Analyze widget of QGroundControl. To do this, yaw the vehicle until the quaternion of the ODOMETRY message is very close to a unit quaternion (w=1, x=y=z=0).
• At this point the body frame is aligned with the reference frame of the external pose system. If you do not manage to get a quaternion close to the unit quaternion without rolling or pitching your vehicle, your frame probably still has a pitch or roll offset. Do not proceed if this is the case; check your coordinate frames again.
• Once aligned, you can pick the vehicle up from the ground and you should see the position's z coordinate decrease. Moving the vehicle forward should increase the position's x coordinate, while moving it to the right should increase the y coordinate. If you also send linear velocities from the external pose system, check them as well: verify that they are expressed in the FRD body frame.
• Set the PX4 parameter MAV_ODOM_LP back to 0. PX4 will stop streaming this message back.
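The alignment check above can be sketched in a few lines of Python. The helper name is illustrative, and the quaternion order (w, x, y, z) is assumed to match the MAVLink convention:

```python
import math

def quat_alignment_deg(q):
    """Return the rotation angle (in degrees) between the body frame and
    the external reference frame, given a unit quaternion q = (w, x, y, z).
    A well-aligned frame gives an angle close to zero."""
    w = max(-1.0, min(1.0, q[0]))  # clamp to guard against numeric noise
    return math.degrees(2.0 * math.acos(abs(w)))

# Identity quaternion: frames perfectly aligned.
print(quat_alignment_deg((1.0, 0.0, 0.0, 0.0)))  # 0.0
# A 90 degree yaw offset: w = cos(45 deg) ~ 0.7071
print(round(quat_alignment_deg((0.7071, 0.0, 0.0, 0.7071)), 1))  # 90.0
```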

If those steps are consistent, you can try your first flight.

Put the robot on the ground and start streaming MoCap feedback. Lower your left (throttle) stick and arm the motors.

At this point, with the left stick at the lowest position, switch to position control. You should have a green light. The green light tells you that position feedback is available and position control is now activated.

Put your left stick at the middle; this is the dead zone. With this stick value, the robot maintains its altitude; raising the stick will increase the reference altitude, while lowering it will decrease it. The same applies to the right stick for x and y.

Increase the value of the left stick and the robot will take off; put it back to the middle right after. Check if it is able to hold its position.

If it works, you may want to set up an offboard experiment by sending position setpoints from a remote ground station.