
Notes on Visual SLAM: Fourteen Lectures, 7-1

Visual Odometry: The Feature-Point Method

Main objectives:
1. Understand the meaning of image feature points and master the methods for extracting feature points from a single image and matching them across multiple images;
2. Understand the principles of epipolar geometry and use its constraints to recover the 3D motion of the camera between images;
3. Understand the PnP problem and how to solve for the camera's 3D motion from a known 3D structure and its image correspondences;
4. Understand the ICP problem and how to solve for the camera's 3D motion from matched point clouds;
5. Understand how triangulation recovers the 3D structure of corresponding points in 2D images.
The previous lectures introduced the concrete forms of the motion and observation equations and explained solution methods based mainly on nonlinear optimization. From this lecture on we turn to the main subject, covering four modules in turn: visual odometry, back-end optimization, loop closure detection, and mapping.
This lecture introduces one commonly used approach to visual odometry: the feature-point method.

7.1 The Feature-Point Method

A SLAM system is divided into a front end and a back end; the front end is also called the visual odometry (VO). The VO estimates a rough camera motion from adjacent images and provides a good initial value for the back end. VO algorithms fall into two broad categories: feature-point methods and direct methods.
This lecture shows how to extract and match image feature points, and then estimate the camera motion and scene structure between two frames, thereby implementing a two-frame visual odometry. This class of algorithms is also known as two-view geometry.

7.1.1 Feature Points

The core problem of visual odometry is how to estimate the camera motion from images. However, an image is just a matrix of brightness and color values, and reasoning about motion estimation directly at the matrix level would be very difficult. A more convenient approach is: first, pick representative points from the image — points that remain recognizable after small changes of the camera viewpoint — so that the same points can be found across images; then discuss the camera pose estimation problem, as well as the localization of those points, on top of them. In classical SLAM these points are called landmarks; in visual SLAM, the landmarks are image features.
Features are another digital representation of the image information. A good set of features is crucial to the final performance of a given task. In visual odometry we want feature points to remain stable while the camera moves. Feature points are distinctive locations in an image, for example corners (which can be extracted with Harris, FAST, GFTT, and so on). In most applications, however, plain corners are not enough, so researchers in computer vision have designed more stable local image features over the years, such as the well-known SIFT, SURF, and ORB. Compared with naive corners, these hand-crafted feature points have the following properties:
1. Repeatability; 2. Distinctiveness; 3. Efficiency; 4. Locality.
A feature point consists of two parts: a key-point and a descriptor. When we say that SIFT features are computed in an image, we mean two things: extracting the SIFT key-points and computing their descriptors. The key-point is the position of the feature in the image; some key-points also carry orientation and scale information. The descriptor is usually a vector that encodes, in some hand-designed way, information about the pixels around the key-point. Descriptors are designed on the principle that "features with similar appearance should have similar descriptors". Therefore, whenever the descriptors of two feature points are close in the vector space, they are considered to be the same feature.
Among current SLAM solutions, ORB offers a good trade-off between quality and efficiency, so the feature extraction pipeline is introduced below with ORB as the representative.

7.1.2 ORB Features

ORB features consist of a key-point and a descriptor. The key-point, called "Oriented FAST", is an improved FAST corner; the descriptor is called BRIEF. Extracting ORB features therefore takes two steps:
1. FAST corner extraction: find the "corners" in the image. Compared with the original FAST, ORB also computes the dominant orientation of each feature point, which adds rotation invariance to the subsequent BRIEF descriptor.
2. BRIEF descriptor: describe the image region around each feature point found in the previous step. ORB improves BRIEF slightly, mainly by using the orientation computed in the previous step.

FAST Key-points

FAST is a corner detector that looks for places where the local pixel intensity changes sharply, and it is known for its speed. Its idea is: if a pixel differs markedly from its neighbors (much brighter or much darker), it is more likely to be a corner. Compared with other corner detectors, FAST only compares pixel intensities, which makes it very fast.
The detection procedure is as follows:
1. Select a pixel $p$ in the image and denote its intensity by $I_p$.
2. Set a threshold $T$ (for example, $20\%$ of $I_p$).
3. Take the 16 pixels lying on a circle of radius 3 centered at $p$.
4. If the circle contains $N$ contiguous pixels whose intensity is greater than $I_p + T$ or less than $I_p - T$, then $p$ is taken to be a feature point ($N$ is usually 12, giving FAST-12; other common choices are 9 and 11, called FAST-9 and FAST-11).
Repeat the four steps above for every pixel.
(Figure: the FAST segment test — the 16 pixels on a circle of radius 3 around the candidate pixel p. Image source: link)

In FAST-12, a pre-test can be added for extra efficiency to quickly reject the vast majority of pixels that are not corners. Concretely, for each pixel, directly check the intensities of pixels 1, 5, 9, and 13 on the neighborhood circle. Only when at least 3 of these 4 pixels are simultaneously brighter than $I_p + T$ or darker than $I_p - T$ can the current pixel possibly be a corner; otherwise it is rejected immediately. This pre-test greatly accelerates corner detection. In addition, raw FAST corners tend to "cluster", so after the first detection pass non-maximal suppression is applied, keeping only the corner with the locally maximal response within each neighborhood to avoid corner clustering.
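To make the pre-test concrete, here is a minimal sketch (not from the book; the helper name PassesPretest and the circle offsets for positions 1, 5, 9, 13 are illustrative assumptions) of the 4-pixel check in C++:

//fast_pretest_sketch.cpp (hypothetical)
#include <opencv2/core/core.hpp>

// Returns true if pixel (x, y) of a grayscale image *might* be a FAST-12 corner:
// at least 3 of the 4 pixels at positions 1, 5, 9, 13 on the radius-3 circle
// must be brighter than Ip + T or darker than Ip - T.
// The caller must ensure (x, y) lies at least 3 pixels away from the image border.
bool PassesPretest(const cv::Mat &img, int x, int y, int T)
{
    const int Ip = img.at<uchar>(y, x);
    const int dx[4] = {0, 3, 0, -3}; // circle positions 1, 5, 9, 13 (top, right, bottom, left)
    const int dy[4] = {-3, 0, 3, 0};
    int brighter = 0, darker = 0;
    for (int i = 0; i < 4; ++i)
    {
        const int I = img.at<uchar>(y + dy[i], x + dx[i]);
        if (I > Ip + T) ++brighter;
        else if (I < Ip - T) ++darker;
    }
    // if fewer than 3 of these 4 agree, there cannot be 12 contiguous qualifying pixels
    return brighter >= 3 || darker >= 3;
}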
FAST corner detection only compares intensity differences between pixels, so it is very fast, but it also suffers from poor repeatability and an uneven spatial distribution. In addition, FAST corners carry no orientation information, and a corner seen from far away may no longer be a corner up close (no scale invariance). To remedy FAST's lack of orientation and scale, ORB adds descriptions of both: scale invariance is obtained by building an image pyramid and detecting corners on every level of the pyramid, while feature orientation is obtained by the intensity centroid method.
The pyramid is a common construct in computer vision. The bottom level is the original image; each level above it is the image scaled down by a fixed factor, so that images at different scales are available. A smaller image can be seen as the scene viewed from farther away. During feature matching, images from different levels can be matched against each other, which provides scale invariance.
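As a rough illustration of this idea (a sketch only, not the book's code — the scale factor of 1.2, the 8 levels, and the FAST threshold of 40 are assumptions mirroring common ORB defaults), one could detect FAST corners on every level of a pyramid like this:

//pyramid_fast_sketch.cpp (hypothetical)
#include <opencv2/opencv.hpp>
#include <vector>

// Detect FAST corners on each level of an image pyramid and map them back
// to the coordinates of the original image.
void DetectFastOnPyramid(const cv::Mat &img, std::vector<cv::KeyPoint> &keypoints)
{
    const int num_levels = 8;        // assumed number of pyramid levels
    const double scale_factor = 1.2; // assumed scale between adjacent levels
    double scale = 1.0;
    for (int level = 0; level < num_levels; ++level)
    {
        cv::Mat scaled;
        cv::resize(img, scaled, cv::Size(), 1.0 / scale, 1.0 / scale); // downsample by the current scale
        std::vector<cv::KeyPoint> kps;
        cv::FAST(scaled, kps, 40); // same threshold as the handwritten example below
        for (auto &kp : kps)
        {
            kp.pt *= scale;    // map the key-point back to the original resolution
            kp.octave = level; // remember which pyramid level it came from
            keypoints.push_back(kp);
        }
        scale *= scale_factor;
    }
}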
(Figure: an image pyramid — each level is a downsampled copy of the original image. Image source: link)
For rotation, the intensity centroid of the image region around the feature point is computed. The centroid is the center of the image block weighted by its gray values. The concrete steps are as follows:
1. In a small image block $B$, define the moments of the block as
$m_{pq} = \sum_{x, y \in B} x^{p} y^{q} I(x, y), \quad p, q = \{0, 1\}$
2. From the moments, the centroid of the block can be found:
$C = \left(\frac{m_{10}}{m_{00}}, \frac{m_{01}}{m_{00}}\right) = \left(\frac{\sum x I(x, y)}{\sum I(x, y)}, \frac{\sum y I(x, y)}{\sum I(x, y)}\right)$
3. Connecting the geometric center $O$ of the block with the centroid $C$ gives the direction vector $\overrightarrow{OC}$, and the orientation of the feature point is defined as
$\theta = \arctan\left(\frac{m_{01}}{m_{10}}\right) = \arctan\left(\frac{\sum y I(x, y)}{\sum x I(x, y)}\right)$
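As a quick numeric check of the orientation formula (values chosen purely for illustration): if a patch has moments $m_{10} = 100$ and $m_{01} = 100$, then
$\theta = \arctan\left(\frac{m_{01}}{m_{10}}\right) = \arctan(1) = 45^{\circ},$
i.e., the intensity centroid lies along the diagonal of the patch.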
With the above, FAST corners acquire descriptions of scale and rotation, which greatly improves the robustness of their representation across images. In ORB, this improved FAST is therefore called Oriented FAST.

BRIEF Descriptor

After the Oriented FAST key-points have been extracted, a descriptor is computed for each of them. ORB uses an improved version of the BRIEF descriptor.
BRIEF is a binary descriptor: its description vector consists of many 0s and 1s, where each bit encodes the intensity relation between two randomly chosen pixels near the key-point (say $p$ and $q$): if $p > q$ the bit is 1, otherwise 0. Taking 128 such pairs of $p$ and $q$ yields a 128-dimensional vector of 0s and 1s. Because BRIEF compares randomly chosen points it is very fast, and because it is a binary representation it is compact to store, which makes it well suited to real-time image matching. The original BRIEF descriptor is not rotation invariant, so it is easily lost when the image rotates. Since ORB computes the key-point orientation during the FAST stage, this orientation can be used to compute the rotated "Steered BRIEF" feature, which gives the ORB descriptor good rotation invariance.
Because rotation and scale are both accounted for, ORB still performs well under translation, rotation, and scaling, and the combination of FAST and BRIEF is very efficient, which makes ORB features very popular in real-time SLAM. The figure below shows ORB features extracted with OpenCV:
(Figure: ORB features extracted with OpenCV. Image source: link)
The next question is how to match these features between different images.

7.1.3 Feature Matching

Feature matching is a critical step in visual SLAM: it solves the data association problem, i.e., determining the correspondence between the landmarks seen now and those seen earlier. Accurately matching descriptors between images, or between an image and the map, removes a large burden from the subsequent pose estimation and optimization. However, because image features are purely local, mismatches are widespread and have long resisted an effective solution; they remain a major bottleneck for performance in visual SLAM. The main reason is that scenes contain large amounts of repeated texture, which makes the feature descriptors very similar.
Consider two images taken at consecutive times. Suppose feature points $x_t^m, m = 1, 2, \dots, M$ are extracted from image $I_t$ and feature points $x_{t+1}^n, n = 1, 2, \dots, N$ from image $I_{t+1}$. How do we find the correspondences between these two sets? The simplest approach is brute-force matching: for each feature point $x_t^m$, measure the descriptor distance to every $x_{t+1}^n$, sort the distances, and take the nearest one as the match. The descriptor distance expresses how similar two features are. In practice different distance norms can be used: for floating-point descriptors the Euclidean distance is common, while for binary descriptors such as BRIEF the Hamming distance is used — the Hamming distance between two binary strings is the number of bit positions in which they differ.
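As a minimal illustration of the Hamming distance between two binary descriptors (a sketch only; the 8 × 32-bit layout matches the 256-bit descriptor used in the handwritten ORB example below), one can XOR the words and count the set bits:

//hamming_sketch.cpp (hypothetical)
#include <bitset>
#include <cstdint>
#include <vector>

// Hamming distance between two 256-bit descriptors stored as 8 x 32-bit words.
int HammingDistance(const std::vector<uint32_t> &d1, const std::vector<uint32_t> &d2)
{
    int distance = 0;
    for (int k = 0; k < 8; ++k)
    {
        // XOR leaves a 1 exactly where the descriptors differ;
        // std::bitset::count tallies them (a portable alternative to _mm_popcnt_u32).
        distance += static_cast<int>(std::bitset<32>(d1[k] ^ d2[k]).count());
    }
    return distance;
}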
When the number of feature points is large, brute-force matching becomes very expensive — especially when a frame must be matched against a whole map — and this violates the real-time requirement of SLAM. In that case the Fast Library for Approximate Nearest Neighbors (FLANN) is better suited to matching very large numbers of features. These matching algorithms are mature and already integrated into OpenCV.
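FLANN-based matching is also wrapped by OpenCV. The following sketch (an assumed usage, not part of the book's example) matches binary ORB descriptors with cv::FlannBasedMatcher; note that binary descriptors need an LSH index rather than the default KD-tree:

//flann_match_sketch.cpp (hypothetical)
#include <opencv2/opencv.hpp>
#include <vector>

// Match two sets of binary (ORB/BRIEF) descriptors with FLANN's LSH index.
void FlannMatch(const cv::Mat &descriptors_1, const cv::Mat &descriptors_2,
                std::vector<cv::DMatch> &matches)
{
    // LSH parameters: 12 hash tables, key size 20, multi-probe level 2 (typical values)
    cv::FlannBasedMatcher matcher(cv::makePtr<cv::flann::LshIndexParams>(12, 20, 2));
    matcher.match(descriptors_1, descriptors_2, matches);
}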

7.2 Practice: Feature Extraction and Matching

OpenCV already integrates most of the mainstream image features.

7.2.1 ORB Features with OpenCV

Create a new folder and open VS Code inside it.

mkdir orb_cv
cd orb_cv
code .
//launch.json
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "g++ - 生成和调试活动文件",
            "type": "cppdbg",
            "request":"launch",
            "program":"${workspaceFolder}/build/orb_cv",
            "args": [],
            "stopAtEntry": false,
            "cwd": "${workspaceFolder}",
            "environment": [],
            "externalConsole": false,
            "MIMode": "gdb",
            "setupCommands": [
                {
                    "description": "为 gdb 启动整齐打印",
                    "text": "-enable-pretty-printing",
                    "ignoreFailures": true
                }
            ],
            "preLaunchTask": "Build",
            "miDebuggerPath": "/usr/bin/gdb"
        }
    ]
}
//tasks.json
{
	"version": "2.0.0",
	"options":{
		"cwd": "${workspaceFolder}/build"   //指明在哪个文件夹下做下面这些指令
	},
	"tasks": [
		{
			"type": "shell",
			"label": "cmake",   //label就是这个task的名字,这个task的名字叫cmake
			"command": "cmake", //command就是要执行什么命令,这个task要执行的任务是cmake
			"args":[
				".."
			]
		},
		{
			"label": "make",  //这个task的名字叫make
			"group": {
				"kind": "build",
				"isDefault": true
			},
			"command": "make",  //这个task要执行的任务是make
			"args": [

			]
		},
		{
			"label": "Build",
			"dependsOrder": "sequence", //按列出的顺序执行任务依赖项
			"dependsOn":[				//这个label依赖于上面两个label
				"cmake",
				"make"
			]
		}
	]
}
#CMakeLists.txt
cmake_minimum_required(VERSION 3.0)

project(ORBCV)

#Add compiler flags for g++, e.g. -Wall enables warning output
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")

#This line is required: only with a Debug build does the executable contain debug information; without it (or with Release) the program cannot be stepped through
set(CMAKE_BUILD_TYPE Debug)

#This project uses the OpenCV library, so the OpenCV headers and link libraries must be added
#Find the OpenCV package
find_package(OpenCV REQUIRED)

#Add the header directories
include_directories(${OpenCV_INCLUDE_DIRS})

add_executable(orb_cv orb_cv.cpp)

#Link the OpenCV libraries
target_link_libraries(orb_cv ${OpenCV_LIBS})
//orb_cv.cpp
#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <algorithm>
#include <chrono>

using namespace std;
using namespace cv;

int main(int argc, char **argv)
{
    // 
    // if(argc != 3)
    // {
    //     cout << "usage: feature_extraction img1 img2" << endl;
    //     return 1;
    // }
    // //--- load the images (command-line version)
    // Mat img_1 = imread(argv[1], CV_LOAD_IMAGE_COLOR);
    // Mat img_2 = imread(argv[2], CV_LOAD_IMAGE_COLOR);
    // assert(img_1.data != nullptr && img_2.data != nullptr);
    // 

    //--- load the images
    Mat img_1 = imread("./1.png", CV_LOAD_IMAGE_COLOR);
    Mat img_2 = imread("./2.png", CV_LOAD_IMAGE_COLOR);

    // initialization
    std::vector<KeyPoint> keypoints_1, keypoints_2; // key-points of img_1 and img_2
    Mat descriptors_1, descriptors_2; // descriptors of img_1 and img_2
    Ptr<FeatureDetector> detector = ORB::create(); // create() accepts arguments, e.g. create(2000) sets the number of features to extract
    Ptr<DescriptorExtractor> descriptor = ORB::create(); // descriptor extractor
    Ptr<DescriptorMatcher> matcher = DescriptorMatcher::create("BruteForce-Hamming"); // feature matcher

    //--- Step 1: detect the positions of the Oriented FAST corners
    // Use the Ptr<FeatureDetector> object to detect both images; the detected key-points are stored in the std::vector<KeyPoint> containers defined above.
    chrono::steady_clock::time_point t1 = chrono::steady_clock::now();
    detector->detect(img_1, keypoints_1); // detect Oriented FAST corners in img_1
    detector->detect(img_2, keypoints_2); // detect Oriented FAST corners in img_2

    //--- Step 2: compute the BRIEF descriptors from the corner positions
    // Use the Ptr<DescriptorExtractor> object to compute a BRIEF descriptor for each detected key-point.
    descriptor->compute(img_1, keypoints_1, descriptors_1);
    descriptor->compute(img_2, keypoints_2, descriptors_2);
    chrono::steady_clock::time_point t2 = chrono::steady_clock::now();
    chrono::duration<double> time_used = chrono::duration_cast<chrono::duration<double>> (t2 - t1);
    cout << "extract ORB cost = " << time_used.count() << "second." << endl;

    Mat outimg1;
    drawKeypoints(img_1, keypoints_1, outimg1, Scalar::all(-1), DrawMatchesFlags::DEFAULT);
    imshow("ORB features", outimg1);
    //waitKey(0);

    //--- Step 3: match the BRIEF descriptors of the two images using the Hamming distance
    vector<DMatch> matches;
    t1 = chrono::steady_clock::now();
    // Use the Ptr<DescriptorMatcher> object's match function on descriptors_1 and descriptors_2; the matches are returned in the vector<DMatch> container.
    matcher->match(descriptors_1, descriptors_2, matches);
    t2 = chrono::steady_clock::now();
    time_used = chrono::duration_cast<chrono::duration<double>> (t2 - t1);
    cout << "match ORB cost = " << time_used.count() << "second." << endl;

    //--- Step 4: filter the matched pairs
    // compute the minimum and maximum descriptor distances
    auto min_max = minmax_element(matches.begin(), matches.end(),
    [](const DMatch &m1, const DMatch &m2) {return m1.distance < m2.distance;});
    double min_dist = min_max.first->distance;
    double max_dist = min_max.second->distance;

    printf("-- Max dist : %f \n", max_dist);
    printf("-- Min dist : %f \n", min_dist);

    // A match is considered wrong when the descriptor distance exceeds twice the minimum distance. Since the minimum distance can be very small, an empirical lower bound of 30 is used.
    std::vector<DMatch> good_matches;
    for(int i=0; i<descriptors_1.rows; ++i)
    {
        if(matches[i].distance <= max(2 * min_dist, 30.0))
        {
            good_matches.push_back(matches[i]);
        }
    }
    //--- Step 5: draw the results
    Mat img_match;
    Mat img_goodmatch;
    drawMatches(img_1, keypoints_1, img_2, keypoints_2, matches, img_match);
    drawMatches(img_1, keypoints_1, img_2, keypoints_2, good_matches, img_goodmatch);
    imshow("all matches", img_match);
    imshow("good matches", img_goodmatch);
    waitKey(0);
    return 0;
}

Result:
(Screenshots: terminal timings, the extracted ORB features, all matches, and the filtered good matches.)
As can be seen, the image showing all matches contains a large number of mismatches. After one round of filtering, the number of matches drops considerably, but most of the remaining ones are correct.
The timings show that ORB extraction takes far longer than matching, so most of the time is spent on feature extraction.

7.2.2 Handwritten ORB Features

//launch.json
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "g++ - 生成和调试活动文件",
            "type": "cppdbg",
            "request":"launch",
            "program":"${workspaceFolder}/build/orb_self",
            "args": [],
            "stopAtEntry": false,
            "cwd": "${workspaceFolder}",
            "environment": [],
            "externalConsole": false,
            "MIMode": "gdb",
            "setupCommands": [
                {
                    "description": "为 gdb 启动整齐打印",
                    "text": "-enable-pretty-printing",
                    "ignoreFailures": true
                }
            ],
            "preLaunchTask": "Build",
            "miDebuggerPath": "/usr/bin/gdb"
        }
    ]
}
//tasks.json
{
	"version": "2.0.0",
	"options":{
		"cwd": "${workspaceFolder}/build"   //指明在哪个文件夹下做下面这些指令
	},
	"tasks": [
		{
			"type": "shell",
			"label": "cmake",   //label就是这个task的名字,这个task的名字叫cmake
			"command": "cmake", //command就是要执行什么命令,这个task要执行的任务是cmake
			"args":[
				".."
			]
		},
		{
			"label": "make",  //这个task的名字叫make
			"group": {
				"kind": "build",
				"isDefault": true
			},
			"command": "make",  //这个task要执行的任务是make
			"args": [

			]
		},
		{
			"label": "Build",
			"dependsOrder": "sequence", //按列出的顺序执行任务依赖项
			"dependsOn":[				//这个label依赖于上面两个label
				"cmake",
				"make"
			]
		}
	]
}
#CMakeLists.txt
cmake_minimum_required(VERSION 3.0)

project(ORBCV)

#Add compiler flags for g++, e.g. -Wall enables warning output
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++14 -mfma")

#This line is required: only with a Debug build does the executable contain debug information; without it (or with Release) the program cannot be stepped through
set(CMAKE_BUILD_TYPE Debug)

#This project uses the OpenCV library, so the OpenCV headers and link libraries must be added
#Find the OpenCV package
find_package(OpenCV REQUIRED)

#Add the header directories
include_directories(${OpenCV_INCLUDE_DIRS})

add_executable(orb_self orb_self.cpp)

#Link the OpenCV libraries
target_link_libraries(orb_self ${OpenCV_LIBS})
//orb_self.cpp
#include <iostream>
#include <opencv2/opencv.hpp>
#include <string>
#include <nmmintrin.h>
#include <algorithm>
#include <chrono>

using namespace std;

//global variables
string first_file = "./1.png";
string second_file = "./2.png";

// 32-bit unsigned ints, 8 of them: 8 * 32 = 256 bits
// The 256-bit binary descriptor is stored as 8 unsigned 32-bit integers; typedef it as DescType
typedef vector<uint32_t> DescType; // Descriptor type

/**
 * @brief compute descriptor of orb keypoints
 * 
 * @param img input image 
 * @param keypoints detected fast keypoints
 * @param descriptors descriptors
 * NOTE: if a keypoint goes outside the image boundary (16 pixels), descriptors will not be computed and will be left as
 * empty
 */
void ComputeORB(const cv::Mat &img, vector<cv::KeyPoint> & keypoints, vector<DescType> &descriptors); // ComputeORB takes an image and its FAST corners and outputs the descriptors

/**
 * brute-force match two sets of descriptors
 * @param desc1 the first descriptor
 * @param desc2 the second descriptor
 * @param matches matches of two images
 */
void BfMatch(const vector<DescType> &desc1, const vector<DescType> &desc2, vector<cv::DMatch> &matches);


int main(int argc, char **argv)
{
    //load image
    cv::Mat first_image = cv::imread(first_file, 0);
    cv::Mat second_image = cv::imread(second_file, 0);
    assert(first_image.data != nullptr && second_image.data != nullptr);

    //detect FAST keypoints1 using threshold=40
    chrono::steady_clock::time_point t1 = chrono::steady_clock::now();
    vector<cv::KeyPoint> keypoints1;
    cv::FAST(first_image, keypoints1, 40);
    vector<DescType> descriptor1;
    ComputeORB(first_image, keypoints1, descriptor1);

    // same for the second
    vector<cv::KeyPoint> keypoints2;
    cv::FAST(second_image, keypoints2, 40);
    vector<DescType> descriptor2;
    ComputeORB(second_image, keypoints2, descriptor2);
    chrono::steady_clock::time_point t2 = chrono::steady_clock::now();
    chrono::duration<double> time_used = chrono::duration_cast<chrono::duration<double>>(t2 - t1);
    cout << "extract ORB cost = " << time_used.count() << " seconds. " << endl;

    // find matches
    vector<cv::DMatch> matches;
    t1 = chrono::steady_clock::now();
    BfMatch(descriptor1, descriptor2, matches);
    t2 = chrono::steady_clock::now();
    time_used = chrono::duration_cast<chrono::duration<double>>(t2 - t1);
    cout << "match ORB cost = " << time_used.count() << " seconds. " << endl;
    cout << "matches: " << matches.size() << endl;

    //plot the matches
    cv::Mat image_show;
    cv::drawMatches(first_image, keypoints1, second_image, keypoints2, matches, image_show); // draw the matches between the two images
    cv::imshow("matches", image_show); // display the matching result
    cv::imwrite("matches.png", image_show); // write the matching result image into the build folder
    cv::waitKey(0);
    
    cout << "done." << endl;
    return 0;
}

// -------------------------------------------------------------------------------------------------- //
// ORB pattern
// The pixel-pair pattern that decides whether each bit of the rotated BRIEF descriptor is 0 or 1; pairs are sampled in a 32*32 image block
int ORB_pattern[256 * 4] = {
  8, -3, 9, 5/*mean (0), correlation (0)*/,
  4, 2, 7, -12/*mean (1.12461e-05), correlation (0.0437584)*/,
  -11, 9, -8, 2/*mean (3.37382e-05), correlation (0.0617409)*/,
  7, -12, 12, -13/*mean (5.62303e-05), correlation (0.0636977)*/,
  2, -13, 2, 12/*mean (0.000134953), correlation (0.085099)*/,
  1, -7, 1, 6/*mean (0.000528565), correlation (0.0857175)*/,
  -2, -10, -2, -4/*mean (0.0188821), correlation (0.0985774)*/,
  -13, -13, -11, -8/*mean (0.0363135), correlation (0.0899616)*/,
  -13, -3, -12, -9/*mean (0.121806), correlation (0.099849)*/,
  10, 4, 11, 9/*mean (0.122065), correlation (0.093285)*/,
  -13, -8, -8, -9/*mean (0.162787), correlation (0.0942748)*/,
  -11, 7, -9, 12/*mean (0.21561), correlation (0.0974438)*/,
  7, 7, 12, 6/*mean (0.160583), correlation (0.130064)*/,
  -4, -5, -3, 0/*mean (0.228171), correlation (0.132998)*/,
  -13, 2, -12, -3/*mean (0.00997526), correlation (0.145926)*/,
  -9, 0, -7, 5/*mean (0.198234), correlation (0.143636)*/,
  12, -6, 12, -1/*mean (0.0676226), correlation (0.16689)*/,
  -3, 6, -2, 12/*mean (0.166847), correlation (0.171682)*/,
  -6, -13, -4, -8/*mean (0.101215), correlation (0.179716)*/,
  11, -13, 12, -8/*mean (0.200641), correlation (0.192279)*/,
  4, 7, 5, 1/*mean (0.205106), correlation (0.186848)*/,
  5, -3, 10, -3/*mean (0.234908), correlation (0.192319)*/,
  3, -7, 6, 12/*mean (0.0709964), correlation (0.210872)*/,
  -8, -7, -6, -2/*mean (0.0939834), correlation (0.212589)*/,
  -2, 11, -1, -10/*mean (0.127778), correlation (0.20866)*/,
  -13, 12, -8, 10/*mean (0.14783), correlation (0.206356)*/,
  -7, 3, -5, -3/*mean (0.182141), correlation (0.198942)*/,
  -4, 2, -3, 7/*mean (0.188237), correlation (0.21384)*/,
  -10, -12, -6, 11/*mean (0.14865), correlation (0.23571)*/,
  5, -12, 6, -7/*mean (0.222312), correlation (0.23324)*/,
  5, -6, 7, -1/*mean (0.229082), correlation (0.23389)*/,
  1, 0, 4, -5/*mean (0.241577), correlation (0.215286)*/,
  9, 11, 11, -13/*mean (0.00338507), correlation (0.251373)*/,
  4, 7, 4, 12/*mean (0.131005), correlation (0.257622)*/,
  2, -1, 4, 4/*mean (0.152755), correlation (0.255205)*/,
  -4, -12, -2, 7/*mean (0.182771), correlation (0.244867)*/,
  -8, -5, -7, -10/*mean (0.186898), correlation (0.23901)*/,
  4, 11, 9, 12/*mean (0.226226), correlation (0.258255)*/,
  0, -8, 1, -13/*mean (0.0897886), correlation (0.274827)*/,
  -13, -2, -8, 2/*mean (0.148774), correlation (0.28065)*/,
  -3, -2, -2, 3/*mean (0.153048), correlation (0.283063)*/,
  -6, 9, -4, -9/*mean (0.169523), correlation (0.278248)*/,
  8, 12, 10, 7/*mean (0.225337), correlation (0.282851)*/,
  0, 9, 1, 3/*mean (0.226687), correlation (0.278734)*/,
  7, -5, 11, -10/*mean (0.00693882), correlation (0.305161)*/,
  -13, -6, -11, 0/*mean (0.0227283), correlation (0.300181)*/,
  10, 7, 12, 1/*mean (0.125517), correlation (0.31089)*/,
  -6, -3, -6, 12/*mean (0.131748), correlation (0.312779)*/,
  10, -9, 12, -4/*mean (0.144827), correlation (0.292797)*/,
  -13, 8, -8, -12/*mean (0.149202), correlation (0.308918)*/,
  -13, 0, -8, -4/*mean (0.160909), correlation (0.310013)*/,
  3, 3, 7, 8/*mean (0.177755), correlation (0.309394)*/,
  5, 7, 10, -7/*mean (0.212337), correlation (0.310315)*/,
  -1, 7, 1, -12/*mean (0.214429), correlation (0.311933)*/,
  3, -10, 5, 6/*mean (0.235807), correlation (0.313104)*/,
  2, -4, 3, -10/*mean (0.00494827), correlation (0.344948)*/,
  -13, 0, -13, 5/*mean (0.0549145), correlation (0.344675)*/,
  -13, -7, -12, 12/*mean (0.103385), correlation (0.342715)*/,
  -13, 3, -11, 8/*mean (0.134222), correlation (0.322922)*/,
  -7, 12, -4, 7/*mean (0.153284), correlation (0.337061)*/,
  6, -10, 12, 8/*mean (0.154881), correlation (0.329257)*/,
  -9, -1, -7, -6/*mean (0.200967), correlation (0.33312)*/,
  -2, -5, 0, 12/*mean (0.201518), correlation (0.340635)*/,
  -12, 5, -7, 5/*mean (0.207805), correlation (0.335631)*/,
  3, -10, 8, -13/*mean (0.224438), correlation (0.34504)*/,
  -7, -7, -4, 5/*mean (0.239361), correlation (0.338053)*/,
  -3, -2, -1, -7/*mean (0.240744), correlation (0.344322)*/,
  2, 9, 5, -11/*mean (0.242949), correlation (0.34145)*/,
  -11, -13, -5, -13/*mean (0.244028), correlation (0.336861)*/,
  -1, 6, 0, -1/*mean (0.247571), correlation (0.343684)*/,
  5, -3, 5, 2/*mean (0.000697256), correlation (0.357265)*/,
  -4, -13, -4, 12/*mean (0.00213675), correlation (0.373827)*/,
  -9, -6, -9, 6/*mean (0.0126856), correlation (0.373938)*/,
  -12, -10, -8, -4/*mean (0.0152497), correlation (0.364237)*/,
  10, 2, 12, -3/*mean (0.0299933), correlation (0.345292)*/,
  7, 12, 12, 12/*mean (0.0307242), correlation (0.366299)*/,
  -7, -13, -6, 5/*mean (0.0534975), correlation (0.368357)*/,
  -4, 9, -3, 4/*mean (0.099865), correlation (0.372276)*/,
  7, -1, 12, 2/*mean (0.117083), correlation (0.364529)*/,
  -7, 6, -5, 1/*mean (0.126125), correlation (0.369606)*/,
  -13, 11, -12, 5/*mean (0.130364), correlation (0.358502)*/,
  -3, 7, -2, -6/*mean (0.131691), correlation (0.375531)*/,
  7, -8, 12, -7/*mean (0.160166), correlation (0.379508)*/,
  -13, -7, -11, -12/*mean (0.167848), correlation (0.353343)*/,
  1, -3, 12, 12/*mean (0.183378), correlation (0.371916)*/,
  2, -6, 3, 0/*mean (0.228711), correlation (0.371761)*/,
  -4, 3, -2, -13/*mean (0.247211), correlation (0.364063)*/,
  -1, -13, 1, 9/*mean (0.249325), correlation (0.378139)*/,
  7, 1, 8, -6/*mean (0.000652272), correlation (0.411682)*/,
  1, -1, 3, 12/*mean (0.00248538), correlation (0.392988)*/,
  9, 1, 12, 6/*mean (0.0206815), correlation (0.386106)*/,
  -1, -9, -1, 3/*mean (0.0364485), correlation (0.410752)*/,
  -13, -13, -10, 5/*mean (0.0376068), correlation (0.398374)*/,
  7, 7, 10, 12/*mean (0.0424202), correlation (0.405663)*/,
  12, -5, 12, 9/*mean (0.0942645), correlation (0.410422)*/,
  6, 3, 7, 11/*mean (0.1074), correlation (0.413224)*/,
  5, -13, 6, 10/*mean (0.109256), correlation (0.408646)*/,
  2, -12, 2, 3/*mean (0.131691), correlation (0.416076)*/,
  3, 8, 4, -6/*mean (0.165081), correlation (0.417569)*/,
  2, 6, 12, -13/*mean (0.171874), correlation (0.408471)*/,
  9, -12, 10, 3/*mean (0.175146), correlation (0.41296)*/,
  -8, 4, -7, 9/*mean (0.183682), correlation (0.402956)*/,
  -11, 12, -4, -6/*mean (0.184672), correlation (0.416125)*/,
  1, 12, 2, -8/*mean (0.191487), correlation (0.386696)*/,
  6, -9, 7, -4/*mean (0.192668), correlation (0.394771)*/,
  2, 3, 3, -2/*mean (0.200157), correlation (0.408303)*/,
  6, 3, 11, 0/*mean (0.204588), correlation (0.411762)*/,
  3, -3, 8, -8/*mean (0.205904), correlation (0.416294)*/,
  7, 8, 9, 3/*mean (0.213237), correlation (0.409306)*/,
  -11, -5, -6, -4/*mean (0.243444), correlation (0.395069)*/,
  -10, 11, -5, 10/*mean (0.247672), correlation (0.413392)*/,
  -5, -8, -3, 12/*mean (0.24774), correlation (0.411416)*/,
  -10, 5, -9, 0/*mean (0.00213675), correlation (0.454003)*/,
  8, -1, 12, -6/*mean (0.0293635), correlation (0.455368)*/,
  4, -6, 6, -11/*mean (0.0404971), correlation (0.457393)*/,
  -10, 12, -8, 7/*mean (0.0481107), correlation (0.448364)*/,
  4, -2, 6, 7/*mean (0.050641), correlation (0.455019)*/,
  -2, 0, -2, 12/*mean (0.0525978), correlation (0.44338)*/,
  -5, -8, -5, 2/*mean (0.0629667), correlation (0.457096)*/,
  7, -6, 10, 12/*mean (0.0653846), correlation (0.445623)*/,
  -9, -13, -8, -8/*mean (0.0858749), correlation (0.449789)*/,
  -5, -13, -5, -2/*mean (0.122402), correlation (0.450201)*/,
  8, -8, 9, -13/*mean (0.125416), correlation (0.453224)*/,
  -9, -11, -9, 0/*mean (0.130128), correlation (0.458724)*/,
  1, -8, 1, -2/*mean (0.132467), correlation (0.440133)*/,
  7, -4, 9, 1/*mean (0.132692), correlation (0.454)*/,
  -2, 1, -1, -4/*mean (0.135695), correlation (0.455739)*/,
  11, -6, 12, -11/*mean (0.142904), correlation (0.446114)*/,
  -12, -9, -6, 4/*mean (0.146165), correlation (0.451473)*/,
  3, 7, 7, 12/*mean (0.147627), correlation (0.456643)*/,
  5, 5, 10, 8/*mean (0.152901), correlation (0.455036)*/,
  0, -4, 2, 8/*mean (0.167083), correlation (0.459315)*/,
  -9, 12, -5, -13/*mean (0.173234), correlation (0.454706)*/,
  0, 7, 2, 12/*mean (0.18312), correlation (0.433855)*/,
  -1, 2, 1, 7/*mean (0.185504), correlation (0.443838)*/,
  5, 11, 7, -9/*mean (0.185706), correlation (0.451123)*/,
  3, 5, 6, -8/*mean (0.188968), correlation (0.455808)*/,
  -13, -4, -8, 9/*mean (0.191667), correlation (0.459128)*/,
  -5, 9, -3, -3/*mean (0.193196), correlation (0.458364)*/,
  -4, -7, -3, -12/*mean (0.196536), correlation (0.455782)*/,
  6, 5, 8, 0/*mean (0.1972), correlation (0.450481)*/,
  -7, 6, -6, 12/*mean (0.199438), correlation (0.458156)*/,
  -13, 6, -5, -2/*mean (0.211224), correlation (0.449548)*/,
  1, -10, 3, 10/*mean (0.211718), correlation (0.440606)*/,
  4, 1, 8, -4/*mean (0.213034), correlation (0.443177)*/,
  -2, -2, 2, -13/*mean (0.234334), correlation (0.455304)*/,
  2, -12, 12, 12/*mean (0.235684), correlation (0.443436)*/,
  -2, -13, 0, -6/*mean (0.237674), correlation (0.452525)*/,
  4, 1, 9, 3/*mean (0.23962), correlation (0.444824)*/,
  -6, -10, -3, -5/*mean (0.248459), correlation (0.439621)*/,
  -3, -13, -1, 1/*mean (0.249505), correlation (0.456666)*/,
  7, 5, 12, -11/*mean (0.00119208), correlation (0.495466)*/,
  4, -2, 5, -7/*mean (0.00372245), correlation (0.484214)*/,
  -13, 9, -9, -5/*mean (0.00741116), correlation (0.499854)*/,
  7, 1, 8, 6/*mean (0.0208952), correlation (0.499773)*/,
  7, -8, 7, 6/*mean (0.0220085), correlation (0.501609)*/,
  -7, -4, -7, 1/*mean (0.0233806), correlation (0.496568)*/,
  -8, 11, -7, -8/*mean (0.0236505), correlation (0.489719)*/,
  -13, 6, -12, -8/*mean (0.0268781), correlation (0.503487)*/,
  2, 4, 3, 9/*mean (0.0323324), correlation (0.501938)*/,
  10, -5, 12, 3/*mean (0.0399235), correlation (0.494029)*/,
  -6, -5, -6, 7/*mean (0.0420153), correlation (0.486579)*/,
  8, -3, 9, -8/*mean (0.0548021), correlation (0.484237)*/,
  2, -12, 2, 8/*mean (0.0616622), correlation (0.496642)*/,
  -11, -2, -10, 3/*mean (0.0627755), correlation (0.498563)*/,
  -12, -13, -7, -9/*mean (0.0829622), correlation (0.495491)*/,
  -11, 0, -10, -5/*mean (0.0843342), correlation (0.487146)*/,
  5, -3, 11, 8/*mean (0.0929937), correlation (0.502315)*/,
  -2, -13, -1, 12/*mean (0.113327), correlation (0.48941)*/,
  -1, -8, 0, 9/*mean (0.132119), correlation (0.467268)*/,
  -13, -11, -12, -5/*mean (0.136269), correlation (0.498771)*/,
  -10, -2, -10, 11/*mean (0.142173), correlation (0.498714)*/,
  -3, 9, -2, -13/*mean (0.144141), correlation (0.491973)*/,
  2, -3, 3, 2/*mean (0.14892), correlation (0.500782)*/,
  -9, -13, -4, 0/*mean (0.150371), correlation (0.498211)*/,
  -4, 6, -3, -10/*mean (0.152159), correlation (0.495547)*/,
  -4, 12, -2, -7/*mean (0.156152), correlation (0.496925)*/,
  -6, -11, -4, 9/*mean (0.15749), correlation (0.499222)*/,
  6, -3, 6, 11/*mean (0.159211), correlation (0.503821)*/,
  -13, 11, -5, 5/*mean (0.162427), correlation (0.501907)*/,
  11, 11, 12, 6/*mean (0.16652), correlation (0.497632)*/,
  7, -5, 12, -2/*mean (0.169141), correlation (0.484474)*/,
  -1, 12, 0, 7/*mean (0.169456), correlation (0.495339)*/,
  -4, -8, -3, -2/*mean (0.171457), correlation (0.487251)*/,
  -7, 1, -6, 7/*mean (0.175), correlation (0.500024)*/,
  -13, -12, -8, -13/*mean (0.175866), correlation (0.497523)*/,
  -7, -2, -6, -8/*mean (0.178273), correlation (0.501854)*/,
  -8, 5, -6, -9/*mean (0.181107), correlation (0.494888)*/,
  -5, -1, -4, 5/*mean (0.190227), correlation (0.482557)*/,
  -13, 7, -8, 10/*mean (0.196739), correlation (0.496503)*/,
  1, 5, 5, -13/*mean (0.19973), correlation (0.499759)*/,
  1, 0, 10, -13/*mean (0.204465), correlation (0.49873)*/,
  9, 12, 10, -1/*mean (0.209334), correlation (0.49063)*/,
  5, -8, 10, -9/*mean (0.211134), correlation (0.503011)*/,
  -1, 11, 1, -13/*mean (0.212), correlation (0.499414)*/,
  -9, -3, -6, 2/*mean (0.212168), correlation (0.480739)*/,
  -1, -10, 1, 12/*mean (0.212731), correlation (0.502523)*/,
  -13, 1, -8, -10/*mean (0.21327), correlation (0.489786)*/,
  8, -11, 10, -6/*mean (0.214159), correlation (0.488246)*/,
  2, -13, 3, -6/*mean (0.216993), correlation (0.50287)*/,
  7, -13, 12, -9/*mean (0.223639), correlation (0.470502)*/,
  -10, -10, -5, -7/*mean (0.224089), correlation (0.500852)*/,
  -10, -8, -8, -13/*mean (0.228666), correlation (0.502629)*/,
  4, -6, 8, 5/*mean (0.22906), correlation (0.498305)*/,
  3, 12, 8, -13/*mean (0.233378), correlation (0.503825)*/,
  -4, 2, -3, -3/*mean (0.234323), correlation (0.476692)*/,
  5, -13, 10, -12/*mean (0.236392), correlation (0.475462)*/,
  4, -13, 5, -1/*mean (0.236842), correlation (0.504132)*/,
  -9, 9, -4, 3/*mean (0.236977), correlation (0.497739)*/,
  0, 3, 3, -9/*mean (0.24314), correlation (0.499398)*/,
  -12, 1, -6, 1/*mean (0.243297), correlation (0.489447)*/,
  3, 2, 4, -8/*mean (0.00155196), correlation (0.553496)*/,
  -10, -10, -10, 9/*mean (0.00239541), correlation (0.54297)*/,
  8, -13, 12, 12/*mean (0.0034413), correlation (0.544361)*/,
  -8, -12, -6, -5/*mean (0.003565), correlation (0.551225)*/,
  2, 2, 3, 7/*mean (0.00835583), correlation (0.55285)*/,
  10, 6, 11, -8/*mean (0.00885065), correlation (0.540913)*/,
  6, 8, 8, -12/*mean (0.0101552), correlation (0.551085)*/,
  -7, 10, -6, 5/*mean (0.0102227), correlation (0.533635)*/,
  -3, -9, -3, 9/*mean (0.0110211), correlation (0.543121)*/,
  -1, -13, -1, 5/*mean (0.0113473), correlation (0.550173)*/,
  -3, -7, -3, 4/*mean (0.0140913), correlation (0.554774)*/,
  -8, -2, -8, 3/*mean (0.017049), correlation (0.55461)*/,
  4, 2, 12, 12/*mean (0.01778), correlation (0.546921)*/,
  2, -5, 3, 11/*mean (0.0224022), correlation (0.549667)*/,
  6, -9, 11, -13/*mean (0.029161), correlation (0.546295)*/,
  3, -1, 7, 12/*mean (0.0303081), correlation (0.548599)*/,
  11, -1, 12, 4/*mean (0.0355151), correlation (0.523943)*/,
  -3, 0, -3, 6/*mean (0.0417904), correlation (0.543395)*/,
  4, -11, 4, 12/*mean (0.0487292), correlation (0.542818)*/,
  2, -4, 2, 1/*mean (0.0575124), correlation (0.554888)*/,
  -10, -6, -8, 1/*mean (0.0594242), correlation (0.544026)*/,
  -13, 7, -11, 1/*mean (0.0597391), correlation (0.550524)*/,
  -13, 12, -11, -13/*mean (0.0608974), correlation (0.55383)*/,
  6, 0, 11, -13/*mean (0.065126), correlation (0.552006)*/,
  0, -1, 1, 4/*mean (0.074224), correlation (0.546372)*/,
  -13, 3, -9, -2/*mean (0.0808592), correlation (0.554875)*/,
  -9, 8, -6, -3/*mean (0.0883378), correlation (0.551178)*/,
  -13, -6, -8, -2/*mean (0.0901035), correlation (0.548446)*/,
  5, -9, 8, 10/*mean (0.0949843), correlation (0.554694)*/,
  2, 7, 3, -9/*mean (0.0994152), correlation (0.550979)*/,
  -1, -6, -1, -1/*mean (0.10045), correlation (0.552714)*/,
  9, 5, 11, -2/*mean (0.100686), correlation (0.552594)*/,
  11, -3, 12, -8/*mean (0.101091), correlation (0.532394)*/,
  3, 0, 3, 5/*mean (0.101147), correlation (0.525576)*/,
  -1, 4, 0, 10/*mean (0.105263), correlation (0.531498)*/,
  3, -6, 4, 5/*mean (0.110785), correlation (0.540491)*/,
  -13, 0, -10, 5/*mean (0.112798), correlation (0.536582)*/,
  5, 8, 12, 11/*mean (0.114181), correlation (0.555793)*/,
  8, 9, 9, -6/*mean (0.117431), correlation (0.553763)*/,
  7, -4, 8, -12/*mean (0.118522), correlation (0.553452)*/,
  -10, 4, -10, 9/*mean (0.12094), correlation (0.554785)*/,
  7, 3, 12, 4/*mean (0.122582), correlation (0.555825)*/,
  9, -7, 10, -2/*mean (0.124978), correlation (0.549846)*/,
  7, 0, 12, -2/*mean (0.127002), correlation (0.537452)*/,
  -1, -6, 0, -11/*mean (0.127148), correlation (0.547401)*/
};



void ComputeORB(const cv::Mat &img, vector<cv::KeyPoint> & keypoints, vector<DescType> &descriptors)
{
    const int half_patch_size = 8; // half size of the 16*16 patch used to compute the key-point orientation
    const int half_boundary = 16; // descriptor points are sampled in a 32*32 patch, so keep a 16-pixel boundary
    int bad_points = 0;
    for(auto & kp : keypoints)
    {
        // key-points too close to the image border get an empty descriptor
        if(kp.pt.x < half_boundary || kp.pt.y < half_boundary || kp.pt.x >= img.cols - half_boundary || kp.pt.y >= img.rows - half_boundary)
        {
            //outside
            ++bad_points;
            descriptors.push_back({});
            continue;
        }

        // compute the intensity centroid of the 16*16 patch around the key-point, following the formulas given earlier
        float m01 = 0, m10 = 0;
        for(int dx = -half_patch_size; dx < half_patch_size; ++dx)
        {
            for(int dy = -half_patch_size; dy < half_patch_size; ++dy)
            {
                // see p.157 of the book: when computing m_pq, x and y are coordinates relative to the key-point within this patch
                uchar pixel = img.at<uchar>(kp.pt.y + dy, kp.pt.x + dx);
                m10 += dx * pixel;
                m01 += dy * pixel;
            }
        }
        //angle should be arc tan(m01 / m10);
        // reference: https://blog.csdn.net/yys2324826380/article/details/105181945/
        // obtain sin_theta and cos_theta of the key-point orientation directly from the moments
        float m_sqrt = sqrt(m01 * m01 + m10 * m10);
        float sin_theta = m01 / m_sqrt;
        float cos_theta = m10 / m_sqrt;

        // compute the descriptor of this point
        DescType desc(8, 0); // 8 words, initialized to 0
        for(int i=0; i<8; ++i)
        {
            uint32_t d =0;
            for(int k=0; k <32; ++k)
            {
                int idx_pq = i * 32 + k; // idx_pq is the bit index within the 256-bit descriptor
                cv::Point2f p (ORB_pattern[idx_pq * 4], ORB_pattern[idx_pq * 4 + 1]);
                cv::Point2f q (ORB_pattern[idx_pq * 4 + 2], ORB_pattern[idx_pq * 4 + 3]);

                //rotate with theta
                // rotate p and q around the origin by theta to get pp and qq, then translate them to the key-point
                cv::Point2f pp = cv::Point2f(cos_theta * p.x - sin_theta * p.y , sin_theta * p.x + cos_theta * p.y) + kp.pt;
                cv::Point2f qq = cv::Point2f(cos_theta * q.x - sin_theta * q.y , sin_theta * q.x + cos_theta * q.y) + kp.pt;
                if(img.at<uchar>(pp.y,pp.x) < img.at<uchar>(qq.y,qq.x))
                {
                    d |= 1 << k;
                }
            }
            desc[i] = d;
        }
        descriptors.push_back(desc);
    }
    cout << "bad / total:" << bad_points << "/" << keypoints.size() << endl;
}

//brute-force matching
void BfMatch(const vector<DescType> &desc1, const vector<DescType> & desc2, vector<cv::DMatch> & matches)
{
    const int d_max = 40; // only descriptor distances below this value are accepted as matches
    for(size_t i1 = 0; i1 < desc1.size(); ++i1)
    {
        if(desc1[i1].empty()) continue;
        cv::DMatch m{int(i1), 0, 256}; // candidate match m: queryIdx = i1, trainIdx = 0 for now, distance initialized to 256 (the maximum possible)
        for(size_t i2 = 0; i2 < desc2.size(); ++i2)
        {
            if(desc2[i2].empty()) continue;
            int distance = 0;
            for(int k=0; k<8; ++k)
            {
                // _mm_popcnt_u32 counts the 1-bits in an unsigned int, so XOR + popcount gives the Hamming distance
                distance += _mm_popcnt_u32(desc1[i1][k] ^ desc2[i2][k]);
            }
            if(distance < d_max && distance < m.distance)
            {
                m.distance = distance;
                m.trainIdx = i2;
            }
        }
        if(m.distance < d_max)
        {
            matches.push_back(m);
        }
    }    
}

Result:
(Screenshots: terminal output and the matches drawn by the handwritten ORB implementation.)
