Android Client and ORB-SLAM2 in a ROS Environment
Building ORB-SLAM2 under ROS
Installing ROS (Ubuntu 18.04) and preparing the ROS environment
Setting up the ORB-SLAM2 environment
- Create a ROS workspace:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/
catkin_make
mkdir ORB-SLAM2
- Add the environment variable:
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
- Install dependency libraries:
sudo apt-get install libblas-dev liblapack-dev
Build Pangolin:
sudo apt-get install libglew-dev
sudo apt-get install libboost-dev libboost-thread-dev libboost-filesystem-dev
sudo apt-get install libpython2.7-dev
sudo apt-get install build-essential
cd ~/catkin_ws/ORB-SLAM2
git clone https://github.com/stevenlovegrove/Pangolin.git
cd Pangolin
mkdir build
cd build
cmake -DCPP11_NO_BOOST=1 ..
make
Download Eigen (note: use version 3.2.10) and build it:
mkdir build
cd build
cmake ..
make
sudo make install
Install OpenCV (built from source; after installing the dependencies, run the following from the OpenCV source directory):
sudo apt-get install cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev
mkdir release
cd release
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local ..
make
sudo make install
- Build and install ORB_SLAM2 (clone the ORB_SLAM2 source and build the core library with ./build.sh first, then build the ROS examples):
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:/home/(user)/catkin_ws/ORB-SLAM2/Examples/ROS
cd ORB_SLAM2
chmod +x build_ros.sh
./build_ros.sh
Possible error: usleep() is not declared. Fix: add the missing include to each source file that reports the error, as shown below.
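usleep() is declared in the POSIX header unistd.h, so the usual fix is simply:

#include <unistd.h>   // declares usleep()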
- Using a USB camera in ROS:
cd catkin_ws/src
git clone https://github.com/bosch-ros-pkg/usb_cam.git
cd ..
catkin_make
cd src/usb_cam
mkdir build
cd build
cmake ..
make
Find the camera device and point the launch file at it:
ls /dev/video*
cd /home/pan/catkin_ws/src/usb_cam
cd launch
gedit usb_cam-test.launch
Change /dev/videoX to the camera device found above.
- Run the pipeline:
roscore
roslaunch usb_cam usb_cam-test.launch
rosrun ORB_SLAM2 Mono /home/(user)/catkin_ws/src/ORB_SLAM2/Vocabulary/ORBvoc.txt /home/(user)/catkin_ws/src/ORB_SLAM2/Examples/Monocular/MyCam.yaml
Getting the Android Client Running
Open-source Android client code
Android client source code: GitHub. Adjust the build address in the project for your setup.
PC-side receiver code: GitHub. Make sure the client and the PC are on the same LAN, and change the IP address the client connects to.
Receiving IMU and Image Data in RViz
- Receiving IMU data:
On Ubuntu, install the imu-tools package matching your ROS version:
sudo apt-get install ros-melodic-imu-tools
or
sudo apt-get install ros-kinetic-imu-tools
Launch RViz, then Add -> By topic -> add the imu display.
Set the topic to /android/imu.
In Global Options -> Fixed Frame, change map to imu.
Reference: debugging rviz and fixing the error "For frame [laser]: Fixed Frame [map] does not exist".
- Receiving image data
First install the image-view packages matching your ROS version:
sudo apt-get install ros-indigo-image-view ros-indigo-rqt-image-view ros-indigo-image-transport-plugins
or
sudo apt-get install ros-melodic-image-view ros-melodic-rqt-image-view ros-melodic-image-transport-plugins
1. In a terminal, start roscore, then use rostopic list to see the available topics.
2. By default the image is transmitted as compressed, so it has to be converted to raw format.
3. cd into the Android_Camera-IMU-master folder, find android_cam-imu.launch, and modify the subscriber/republish node in it accordingly (a sketch of the conversion follows below).
4. In a terminal, run: roslaunch android_cam-imu.launch
5. In RViz, Add -> Image and set Image Topic to /camera/image_raw.
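The compressed-to-raw conversion can also be done with a small standalone node. Below is a minimal C++ sketch (not taken from the Android_Camera-IMU project) that subscribes to the compressed Android camera stream and publishes it again as raw images; the topic names /camera/image (compressed source) and /camera/image_raw are assumptions and should be matched against the output of rostopic list.

#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <sensor_msgs/Image.h>

image_transport::Publisher raw_pub;

// The "compressed" transport decodes the JPEG stream before invoking this
// callback, so the message received here is already a raw sensor_msgs/Image.
void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
    raw_pub.publish(msg);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "compressed_to_raw");
    ros::NodeHandle nh;
    image_transport::ImageTransport it(nh);
    raw_pub = it.advertise("/camera/image_raw", 1);
    // Subscribe to the compressed stream of the assumed base topic /camera/image.
    image_transport::Subscriber sub =
        it.subscribe("/camera/image", 1, imageCallback,
                     image_transport::TransportHints("compressed"));
    ros::spin();
    return 0;
}

In practice this is the same job that the image_transport republish node configured in the launch file performs.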
Camera Calibration for the Android Camera
Source code on GitHub. Usage of the camera calibration module:
1. Type cheese in the Dash to open the camera and take 10–20 shots of the calibration chessboard from different angles; be sure to vary the camera pose to reduce error.
2. Copy the photos taken with cheese into the camera_calibration directory. Avoid naming them <n>.jpg (n a natural number), otherwise they may be overwritten by the renaming step.
3. In a terminal, run ./rename.sh to rename the images into a consecutive natural-number sequence.
4. In a terminal, run ./camera_calibration boardWidth boardHeight squareSize frameNumber, where:
boardWidth: number of chessboard corners in the horizontal direction, e.g. 9
boardHeight: number of chessboard corners in the vertical direction, e.g. 6
squareSize: actual side length of each (square) chessboard cell, in mm, e.g. 25
frameNumber: number of images to use, e.g. 17
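For reference, the core of what such a calibration tool computes can be sketched with OpenCV in a few dozen lines. This is only an illustrative sketch, not the code of the linked camera_calibration module; the hard-coded 9x6 board, 25 mm squares, 17 frames and <n>.jpg file names simply mirror the example values above.

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main()
{
    const cv::Size boardSize(9, 6);      // inner corners: boardWidth x boardHeight
    const float squareSize = 25.0f;      // side length of one square, in mm
    const int frameNumber = 17;          // number of calibration images

    std::vector<std::vector<cv::Point3f>> objectPoints;  // 3D corners on the board plane
    std::vector<std::vector<cv::Point2f>> imagePoints;   // detected 2D corners
    cv::Size imageSize;

    // Ideal corner layout of the planar board, reused for every accepted view.
    std::vector<cv::Point3f> boardCorners;
    for (int y = 0; y < boardSize.height; ++y)
        for (int x = 0; x < boardSize.width; ++x)
            boardCorners.emplace_back(x * squareSize, y * squareSize, 0.0f);

    for (int i = 1; i <= frameNumber; ++i) {
        cv::Mat img = cv::imread(cv::format("%d.jpg", i), cv::IMREAD_GRAYSCALE);
        if (img.empty()) continue;
        imageSize = img.size();

        std::vector<cv::Point2f> corners;
        if (!cv::findChessboardCorners(img, boardSize, corners)) continue;
        cv::cornerSubPix(img, corners, cv::Size(11, 11), cv::Size(-1, -1),
                         cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.01));

        imagePoints.push_back(corners);
        objectPoints.push_back(boardCorners);
    }
    if (imagePoints.empty()) { std::cout << "no chessboard views found\n"; return 1; }

    // K holds fx, fy, cx, cy; distCoeffs holds k1, k2, p1, p2 (, k3) -- the values
    // that go into the ORB-SLAM2 camera .yaml settings file.
    cv::Mat K, distCoeffs;
    std::vector<cv::Mat> rvecs, tvecs;
    double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                     K, distCoeffs, rvecs, tvecs);
    std::cout << "RMS reprojection error: " << rms << "\nK = " << K
              << "\ndist = " << distCoeffs << std::endl;
    return 0;
}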
Sending and Receiving Data over the ROS Protocol
Reference article: Zhihu. Add Talker and Listener classes to the android_tutorial_camera_imu project. Listener.java subscribes to rmytopic:
package org.ros.android.android_tutorial_camera_imu;
import android.util.Log;
import org.ros.message.MessageListener;
import org.ros.namespace.GraphName;
import org.ros.node.AbstractNodeMain;
import org.ros.node.ConnectedNode;
import org.ros.node.NodeMain;
import org.ros.node.topic.Subscriber;
public class Listener extends AbstractNodeMain {
@Override
public GraphName getDefaultNodeName() {
return GraphName.of("rosjava_tutorial_pubsub/listener");
}
public void onStart(ConnectedNode connectedNode)
{
Log.d("listener", "Node Log "+ connectedNode.getLog() + " ");
// Subscribe to "rmytopic"; any message published on /rmytopic reaches this callback.
Subscriber<std_msgs.String> subscriber = connectedNode.newSubscriber("rmytopic", std_msgs.String._TYPE);
subscriber.addMessageListener(new MessageListener<std_msgs.String>() {
@Override
public void onNewMessage(std_msgs.String message) {
Log.d("listener", "I heard: \"" + message.getData() + "\"");
}
});
}
}
Talker.java publishes to chatter:
package org.ros.android.android_tutorial_camera_imu;
import org.ros.concurrent.CancellableLoop;
import org.ros.namespace.GraphName;
import org.ros.node.AbstractNodeMain;
import org.ros.node.ConnectedNode;
import org.ros.node.NodeMain;
import org.ros.node.topic.Publisher;
public class Talker extends AbstractNodeMain {
@Override
public GraphName getDefaultNodeName() {
return GraphName.of("rosjava_tutorial_pubsub/talker");
}
@Override
public void onStart(final ConnectedNode connectedNode) {
final Publisher<std_msgs.String> publisher =
connectedNode.newPublisher("chatter", std_msgs.String._TYPE);
connectedNode.executeCancellableLoop(new CancellableLoop() {
@Override
protected void setup() {
}
@Override
public void loop() throws InterruptedException {
std_msgs.String str = publisher.newMessage();
str.setData("Hello ROS from client");
publisher.publish(str);
// Publish one message every 25 seconds.
Thread.sleep(25000);
}
});
}
}
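On the PC side, the strings that Talker publishes on /chatter can be received with a minimal roscpp listener. The sketch below is only an illustration (it is not the linked PC receiver project) and can equally be replaced by rostopic echo /chatter.

#include <ros/ros.h>
#include <std_msgs/String.h>

// Print every string received on /chatter (published by the Android Talker).
void chatterCallback(const std_msgs::String::ConstPtr& msg)
{
    ROS_INFO("I heard: [%s]", msg->data.c_str());
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "pc_listener");
    ros::NodeHandle nh;
    ros::Subscriber sub = nh.subscribe("chatter", 10, chatterCallback);
    ros::spin();
    return 0;
}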
Modify MainActivity.java:
package org.ros.android.android_tutorial_camera_imu;
import android.Manifest;
import android.annotation.TargetApi;
import android.content.Context;
import android.content.pm.PackageManager;
import android.hardware.Camera;
import android.hardware.SensorManager;
import android.location.LocationManager;
import android.os.Build;
import android.os.Bundle;
import android.support.v4.app.ActivityCompat;
import android.util.Log;
import android.view.MotionEvent;
import android.view.Window;
import android.view.WindowManager;
import android.widget.Toast;
import org.ros.address.InetAddressFactory;
import org.ros.android.RosActivity;
import org.ros.android.view.camera.RosCameraPreviewView;
import org.ros.node.NodeConfiguration;
import org.ros.node.NodeMainExecutor;
import java.util.List;
import org.ros.android.MessageCallable;
import org.ros.android.view.RosTextView;
import org.ros.android.android_tutorial_camera_imu.Talker;
public class MainActivity extends RosActivity {
private int cameraId = 0;
private RosCameraPreviewView rosCameraPreviewView;
private NavSatFixPublisher fix_pub;
private ImuPublisher imu_pub;
private NodeMainExecutor nodeMainExecutor;
private LocationManager mLocationManager;
private SensorManager mSensorManager;
private RosTextView<std_msgs.String> rosTextView;
private Talker talker;
private Listener listener;
public MainActivity() {
super("ROS", "Camera & Imu");
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_main);
rosCameraPreviewView = findViewById(R.id.ros_camera_preview_view);
mLocationManager = (LocationManager) this.getSystemService(Context.LOCATION_SERVICE);
mSensorManager = (SensorManager) this.getSystemService(SENSOR_SERVICE);
rosTextView = (RosTextView<std_msgs.String>)findViewById(R.id.text);
rosTextView.setTopicName("rmytopic");
rosTextView.setMessageType(std_msgs.String._TYPE);
rosTextView.setMessageToStringCallable(new MessageCallable<String, std_msgs.String>() {
@Override
public String call(std_msgs.String message) {
return message.getData();
}
});
}
@Override
public boolean onTouchEvent(MotionEvent event) {
if (Build.VERSION.SDK_INT < Build.VERSION_CODES.P) {
if (event.getAction() == MotionEvent.ACTION_UP) {
int numberOfCameras = Camera.getNumberOfCameras();
final Toast toast;
if (numberOfCameras > 1) {
cameraId = (cameraId + 1) % numberOfCameras;
rosCameraPreviewView.releaseCamera();
rosCameraPreviewView.setCamera(getCamera());
toast = Toast.makeText(this, "Switching cameras.", Toast.LENGTH_SHORT);
} else {
toast = Toast.makeText(this, "No alternative cameras to switch to.", Toast.LENGTH_SHORT);
}
runOnUiThread(new Runnable() {
@Override
public void run() {
toast.show();
}
});
}
}
return true;
}
@Override @TargetApi(Build.VERSION_CODES.ICE_CREAM_SANDWICH_MR1)
protected void init(NodeMainExecutor nodeMainExecutor) {
this.nodeMainExecutor = nodeMainExecutor;
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
String[] PERMISSIONS = {"", "", "", ""};
PERMISSIONS[0] = Manifest.permission.ACCESS_FINE_LOCATION;
PERMISSIONS[1] = Manifest.permission.CAMERA;
PERMISSIONS[2] = Manifest.permission.READ_EXTERNAL_STORAGE;
PERMISSIONS[3] = Manifest.permission.WRITE_EXTERNAL_STORAGE;
ActivityCompat.requestPermissions(this, PERMISSIONS, 0);
}else {
NodeConfiguration nodeConfiguration1 = NodeConfiguration.newPublic(InetAddressFactory.newNonLoopback().getHostAddress());
nodeConfiguration1.setMasterUri(getMasterUri());
nodeConfiguration1.setNodeName("android_sensors_driver_nav_sat_fix");
this.fix_pub = new NavSatFixPublisher(mLocationManager);
nodeMainExecutor.execute(this.fix_pub, nodeConfiguration1);
rosCameraPreviewView.setCamera(getCamera());
NodeConfiguration nodeConfiguration2 = NodeConfiguration.newPublic(InetAddressFactory.newNonLoopback().getHostAddress());
nodeConfiguration2.setMasterUri(getMasterUri());
nodeConfiguration2.setNodeName("android_sensors_driver_camera");
nodeMainExecutor.execute(this.rosCameraPreviewView, nodeConfiguration2);
}
NodeConfiguration nodeConfiguration3 = NodeConfiguration.newPublic(InetAddressFactory.newNonLoopback().getHostAddress());
nodeConfiguration3.setMasterUri(getMasterUri());
nodeConfiguration3.setNodeName("android_sensors_driver_imu");
this.imu_pub = new ImuPublisher(mSensorManager);
nodeMainExecutor.execute(this.imu_pub, nodeConfiguration3);
talker = new Talker();
listener = new Listener();
NodeConfiguration nodeConfiguration4 = NodeConfiguration.newPublic(InetAddressFactory.newNonLoopback().getHostAddress());
nodeConfiguration4.setMasterUri(getMasterUri());
NodeConfiguration nodeConfiguration5 = NodeConfiguration.newPublic(InetAddressFactory.newNonLoopback().getHostAddress());
nodeConfiguration5.setMasterUri(getMasterUri());
nodeMainExecutor.execute(talker, nodeConfiguration4);
nodeMainExecutor.execute(listener,nodeConfiguration5);
nodeMainExecutor.execute(rosTextView,nodeConfiguration4);
}
private void executeGPS() {
NodeConfiguration nodeConfiguration1 = NodeConfiguration.newPublic(InetAddressFactory.newNonLoopback().getHostAddress());
nodeConfiguration1.setMasterUri(getMasterUri());
nodeConfiguration1.setNodeName("android_sensors_driver_nav_sat_fix");
this.fix_pub = new NavSatFixPublisher(mLocationManager);
nodeMainExecutor.execute(this.fix_pub, nodeConfiguration1);
}
private void executeCamera() {
rosCameraPreviewView.setCamera(getCamera());
NodeConfiguration nodeConfiguration2 = NodeConfiguration.newPublic(InetAddressFactory.newNonLoopback().getHostAddress());
nodeConfiguration2.setMasterUri(getMasterUri());
nodeConfiguration2.setNodeName("android_sensors_driver_camera");
nodeMainExecutor.execute(this.rosCameraPreviewView, nodeConfiguration2);
}
private Camera getCamera() {
Camera cam = Camera.open(cameraId);
cam.stopPreview();
Camera.Parameters camParams = cam.getParameters();
camParams.setPreviewSize(720,480);
camParams.setPictureSize(720,480);
List<Camera.Size> supportedPreviewSizes = cam.getParameters().getSupportedPreviewSizes();
List<Camera.Size> pictureSizes = cam.getParameters().getSupportedPictureSizes();
int length = pictureSizes.size();
for (int i = 0; i < length; i++) {
Log.d("SupportedSizes","SupportedPictureSizes : " + pictureSizes.get(i).width + "x" + pictureSizes.get(i).height);
}
length = supportedPreviewSizes.size();
for (int i = 0; i < length; i++) {
Log.d("SupportedSizes","SupportedPreviewSizes : " + supportedPreviewSizes.get(i).width + "x" + supportedPreviewSizes.get(i).height);
}
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
if (camParams.getSupportedFocusModes().contains(
Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
camParams.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
} else {
camParams.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
}
}
cam.setParameters(camParams);
cam.startPreview();
return cam;
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
if (requestCode == 0) {
if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
executeGPS();
}
if (grantResults.length > 1 && grantResults[1] == PackageManager.PERMISSION_GRANTED) {
executeCamera();
}
if (grantResults.length > 3 && grantResults[2] == PackageManager.PERMISSION_GRANTED &&
grantResults[3] == PackageManager.PERMISSION_GRANTED) {
// Storage permissions granted; no ROS node needs to be started for these.
}
}
}
}
PC terminal side (command reference: ROS wiki).
Inspect the messages on a topic (here /chatter; substitute any topic name):
rostopic echo /chatter
Check the publishing rate:
rostopic hz /topic_name
Publish data:
rostopic pub my_topic std_msgs/String "hello there"
Problem encountered: on the PC, the ROS graph shows every connection as normal and the topics published by the Android ROS node also show messages, but when the PC uses rostopic pub to publish to a topic the Android ROS node subscribes to, the Android side never receives the message. The cause is that the ROS_IP environment variable was not set on the PC.
That is, in every PC terminal that runs a node, first set ROS_IP to the PC's IP address:
export ROS_IP=[your pc ip]
Resizing the Received Images
- Adjust the image node in ORB_SLAM2:
In the workspace, ros_mono.cc under ORB_SLAM2/src/ORB_SLAM2/Examples/ROS/ORB_SLAM2/src subscribes to the camera topic:
ros::Subscriber sub = nodeHandler.subscribe("/camera/image_raw", 1, &ImageGrabber::GrabImage,&igb);
The images arrive at the camera's original resolution, so they need to be resized with resize(). Update the .yaml file with the Android camera intrinsics, edit ros_mono.cc as follows, then rebuild ORB-SLAM2 and rerun the command. In ros_mono.cc:
- Add the OpenCV namespace: using namespace cv;
- In ImageGrabber::GrabImage(), first convert the ROS image to OpenCV format, then resize it with OpenCV's resize():
void ImageGrabber::GrabImage(const sensor_msgs::ImageConstPtr& msg)
{
cv_bridge::CvImageConstPtr cv_ptr;
cv::Mat img;
cv::Mat img_resize;
try
{
// Convert the ROS image message to OpenCV format (shared, no copy).
cv_ptr = cv_bridge::toCvShare(msg);
img = cv_ptr->image;
// Resize to the resolution the camera .yaml intrinsics were computed for.
resize(img, img_resize, cv::Size(720, 480));
}
catch (cv_bridge::Exception& e)
{
ROS_ERROR("cv_bridge exception: %s", e.what());
return;
}
mpSLAM->TrackMonocular(img_resize,cv_ptr->header.stamp.toSec());
}
Finally, rebuild and run the command again:
rosrun ORB_SLAM2 Mono '/home/xujiayi/Desktop/ORBvoc.bin' '/home/xujiayi/catkin_ws/ORB_SLAM2/ORB_SLAM2-master/Examples/ROS/ORB_SLAM2/Asus.yaml'