Gao Bo SLAM Basics Course, Lecture 5 – Geometric Foundations

First, a follow-up thought on the nonlinear optimization from the last lecture:

   Why does the code accumulate the H and b computed inside each residual term, instead of first summing the squared f(x) terms into one total cost function and then optimizing that?

   Because nonlinear optimization methods such as Gauss-Newton derive Hδx = b precisely from the squared form ||f(x + δx)||²; the δx obtained is the step that minimizes this squared cost (to first order) in the current iteration.

   Since the total cost is a sum of such squared terms, the per-term H and b simply add, so directly summing H and b is the correct approach.

   So nonlinear optimization methods are closely tied to the form of linear least squares.
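Written out (standard Gauss-Newton over a sum of residuals, with J_i the Jacobian of the i-th residual f_i; nothing here is specific to any one implementation):

\[
\min_{\delta x} \sum_i \| f_i(x) + J_i\,\delta x \|^2
\;\Longrightarrow\;
\Big(\sum_i J_i^\top J_i\Big)\,\delta x = -\sum_i J_i^\top f_i(x),
\qquad H_i = J_i^\top J_i, \quad b_i = -J_i^\top f_i(x)
\]

Each term's H_i and b_i depend only on that term, which is exactly why they can be accumulated independently.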

Back to this lecture. It covers the extraction and matching principles of ORB feature points, stereo vision, and in particular the 2D-2D epipolar-geometry and 3D-2D PnP solution methods,

as well as pose estimation via nonlinear optimization, which reuses the Lie-algebra derivative formulas and update strategy from the third lecture.

As usual, the focus is on the exercises.

Let's go.


The code for this big question is all in one file, computeORB.cpp; the sub-problems below implement each function in turn.

  1. The code first uses the built-in cv::FAST(image, keypoints, threshold) to detect corner points in the image.

  2. It then calls computeAngle(image, keypoints), which computes each corner's orientation as the angle of the line from the corner to the intensity centroid of its patch.

  3. Next it calls computeORBDesc(image, keypoints, descriptors) to compute the descriptor of each valid feature point; points that fail the checks get an empty descriptor. (The cv::KeyPoint data type carries the angle itself, so the keypoint vector is the only extra input needed.)

  4. Finally it calls bfMatch(descriptors1, descriptors2, matches) to find correspondences between the two sets of feature points by brute-force matching, and draws the result for inspection.

Be aware of the representation and usage of several basic OpenCV data structures.

In addition, OpenCV has its own functions for ORB feature matching, so in general this can all be done by calling the library (a sketch of that route appears at the end of this post). But the author of ORB-SLAM2 hand-wrote his own feature-matching module, so understanding the principle behind this part is also helpful.
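A minimal sketch of how the four functions chain together, assuming DescType is the vector<bool> alias that DescType d(256, false) below suggests; the image file names and FAST threshold are placeholders:

#include <opencv2/opencv.hpp>
#include <vector>
using namespace std;

typedef vector<bool> DescType;  // assumed 256-bit descriptor alias

// implemented in computeORB.cpp (see below)
void computeAngle(const cv::Mat &image, vector<cv::KeyPoint> &keypoints);
void computeORBDesc(const cv::Mat &image, vector<cv::KeyPoint> &keypoints, vector<DescType> &desc);
void bfMatch(const vector<DescType> &desc1, const vector<DescType> &desc2, vector<cv::DMatch> &matches);

int main() {
    cv::Mat img1 = cv::imread("1.png", 0);  // placeholder paths, grayscale
    cv::Mat img2 = cv::imread("2.png", 0);

    vector<cv::KeyPoint> kp1, kp2;
    cv::FAST(img1, kp1, 40);                // 1. FAST corners (threshold assumed)
    cv::FAST(img2, kp2, 40);

    computeAngle(img1, kp1);                // 2. orientation from intensity centroid
    computeAngle(img2, kp2);

    vector<DescType> desc1, desc2;
    computeORBDesc(img1, kp1, desc1);       // 3. rotated BRIEF descriptors
    computeORBDesc(img2, kp2, desc2);

    vector<cv::DMatch> matches;
    bfMatch(desc1, desc2, matches);         // 4. brute-force matching

    cv::Mat out;
    cv::drawMatches(img1, kp1, img2, kp2, matches, out);
    cv::imshow("matches", out);
    cv::waitKey(0);
    return 0;
}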


void computeAngle(const cv::Mat &image, vector<cv::KeyPoint> &keypoints) {
    int half_patch_size = 8;
    for (auto &kp : keypoints) {
        // START YOUR CODE HERE (~7 lines)
        kp.angle = 0;  // compute kp.angle
        cv::Point2f p = kp.pt;
        // skip points whose patch would go out of bounds
        if (p.x < half_patch_size || p.x > image.cols - half_patch_size ||
            p.y < half_patch_size || p.y > image.rows - half_patch_size) {
            continue;
        }
        double m10 = 0, m01 = 0, m00 = 0;
        for (int i = -half_patch_size; i < half_patch_size; i++) {
            const uchar *col = image.ptr<uchar>((int)p.y + i);  // pointer to row p.y + i
            for (int j = -half_patch_size; j < half_patch_size; j++) {
                m00 += col[(int)p.x + j];      // zeroth moment: total intensity
                m10 += j * col[(int)p.x + j];  // first moment in x (relative coordinates)
                m01 += i * col[(int)p.x + j];  // first moment in y
            }
        }
        double cx = m10 / m00;  // centroid offset in x
        double cy = m01 / m00;  // centroid offset in y
        kp.angle = atan2(cy, cx) * 180 / pi;  // KeyPoint stores degrees
        // END YOUR CODE HERE
    }
    return;
}

Note:

   1. cv::KeyPoint contains a cv::Point2f member storing the x, y coordinates, accessible via .pt, and it carries an angle member that can be read and written directly.

      It also has other fields such as size, response, octave, and class_id, some of which are used in classification tasks.

  2. In the math.h library:

    atan takes a slope and returns the angle, valid only over (-90°, 90°).
    atan2 takes two arguments, the first being y and the second x (watch the order!), and its range covers the whole circle, (-180°, 180°].

    Also mind the degree/radian conversion: kp.angle stores degrees, while atan and atan2 work in radians.

  3. How to read pixels when the image is stored in a cv::Mat:

     Each row of a Mat is stored contiguously, so you can grab a pointer to the start of each row to access the pixels. For example, to extract the R channel of an image and zero out all G and B channel pixels, get the pointer at the start of each row and walk it across all the pixels in that row (a sketch follows this list). If the whole image is stored contiguously in memory, you can also traverse all pixels in one pass.

     So the line const uchar* col = image.ptr<uchar>(p.y + i); fetches the pointer to row p.y + i, which is then indexed by the x coordinate.

     One more reminder: a point's x is the horizontal coordinate and indexes the column (col), while y is the vertical coordinate and indexes the row.

  4. The moments m10 and m01:

     The coordinate multiplied into the accumulation can be the absolute coordinate p.x + j or p.y + i, or the relative coordinates j and i.

     Using relative coordinates has an advantage: with absolute coordinates, each m00 step would also have to be multiplied by p.x and p.y, and the absolute centroid would then need one extra subtraction of the keypoint position before computing the angle; relative coordinates avoid those extra multiplications and the subtraction (the formulas follow the sketch below).
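A minimal sketch of the row-pointer access described in note 3, extracting the R channel of a BGR image by zeroing the B and G channels (the file names are placeholders):

#include <opencv2/opencv.hpp>

int main() {
    cv::Mat img = cv::imread("test.png");   // placeholder path; 8-bit BGR
    for (int r = 0; r < img.rows; r++) {
        uchar *row = img.ptr<uchar>(r);     // pointer to the start of row r
        for (int c = 0; c < img.cols; c++) {
            row[3 * c + 0] = 0;             // B
            row[3 * c + 1] = 0;             // G
                                            // row[3 * c + 2] is R, left as-is
        }
    }
    cv::imwrite("red_only.png", img);
    return 0;
}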
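And the formulas behind note 4, written with the relative patch coordinates j, i used in the code (standard intensity-centroid orientation):

\[
m_{pq} = \sum_{i,j=-8}^{7} j^{\,p}\, i^{\,q}\, I(x+j,\; y+i), \qquad
(c_x, c_y) = \left(\frac{m_{10}}{m_{00}},\ \frac{m_{01}}{m_{00}}\right), \qquad
\theta = \operatorname{atan2}(c_y, c_x) = \operatorname{atan2}(m_{01}, m_{10})
\]

Since m00 > 0, dividing by it does not change the angle, so atan2 on the raw moments gives the same result.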

Drawing the orientation of each feature point gives the result below:

[Figure: FAST corners drawn with their computed orientation angles]

void computeORBDesc(const cv::Mat &image, vector<cv::KeyPoint> &keypoints, vector<DescType> &desc) {
    for (auto &kp : keypoints) {
        DescType d(256, false);
        for (int i = 0; i < 256; i++) {
            // START YOUR CODE HERE (~7 lines)
            float cos_ = cos(kp.angle * pi / 180);
            float sin_ = sin(kp.angle * pi / 180);
            // rotate the two pattern points by the keypoint's orientation
            cv::Point2f up_t(cos_ * ORB_pattern[4 * i] - sin_ * ORB_pattern[4 * i + 1],
                             sin_ * ORB_pattern[4 * i] + cos_ * ORB_pattern[4 * i + 1]);
            cv::Point2f uq_t(cos_ * ORB_pattern[4 * i + 2] - sin_ * ORB_pattern[4 * i + 3],
                             sin_ * ORB_pattern[4 * i + 2] + cos_ * ORB_pattern[4 * i + 3]);
            // translate to image coordinates to get the two points to compare
            cv::Point2f up = up_t + kp.pt;
            cv::Point2f uq = uq_t + kp.pt;
            // out-of-bounds points get their descriptor cleared and are not counted
            if (up.x < 0 || up.y < 0 || up.x >= image.cols || up.y >= image.rows ||
                uq.x < 0 || uq.y < 0 || uq.x >= image.cols || uq.y >= image.rows) {
                d.clear();
                break;  // if the pattern goes outside the image, clear d
            }
            d[i] = image.at<uchar>(up) > image.at<uchar>(uq) ? 0 : 1;  // at<uchar>(point) is one way to read a pixel
            // END YOUR CODE HERE
        }
        desc.push_back(d);
    }

    int bad = 0;
    for (auto &d : desc) {
        if (d.empty()) bad++;
    }
    cout << "bad/total: " << bad << "/" << desc.size() << endl;
    return;
}

Note:

   1. Formula ① in the problem rotates a point, yielding its u, v coordinates after rotation; be careful with the signs in the formula (written out after this list).

  2. Following the logic of the code, the four values in each row of ORB_pattern are, in order, p1u, p1v, p2u, p2v.

   3. d is initialized as a 256-dimensional vector; clearing it for out-of-bounds points makes it possible to use empty() for the validity check later.

  4. The principle of the ORB descriptor is to compare the gray values of pixel pairs at specific orientations to build a binary feature vector, here 256 pairs.
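For reference, the 2D rotation applied to each pattern point, with θ = kp.angle converted to radians:

\[
\begin{pmatrix} u' \\ v' \end{pmatrix}
= \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} u \\ v \end{pmatrix}
\]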


void bfMatch(const vector<DescType> &desc1, const vector<DescType> &desc2, vector<cv::DMatch> &matches) {
    int d_max = 50;  // maximum Hamming distance for a valid match
    // START YOUR CODE HERE (~12 lines)
    // find matches between desc1 and desc2.
    for (size_t i = 0; i < desc1.size(); ++i) {
        if (desc1[i].empty()) continue;
        int d_min = 256;  // smallest distance seen so far for desc1[i]
        int index2 = -1;  // index of the best candidate in desc2
        for (size_t j = 0; j < desc2.size(); ++j) {
            if (desc2[j].empty()) continue;
            int dist = 0;
            for (size_t k = 0; k < 256; k++) {
                dist += desc1[i][k] ^ desc2[j][k];  // Hamming distance, bit by bit
                if (dist > d_max) break;            // early exit once over the threshold
            }
            if (dist < d_min) {
                d_min = dist;
                index2 = j;
            }
        }
        if (d_min < d_max) {
            matches.push_back(cv::DMatch(i, index2, d_min));
        }
    }
    // END YOUR CODE HERE
}

Note:

  1. The cv::DMatch data structure contains just 4 values: int _queryIdx, int _trainIdx, int _imgIdx, float _distance (a small sketch of how they map onto bfMatch's output follows).

   Points 2-4: stuck here, and too lazy to write them up...
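A small sketch of the field mapping, where keypoints1 and keypoints2 are hypothetical names for the two keypoint vectors:

// queryIdx indexes desc1/keypoints1, trainIdx indexes desc2/keypoints2,
// and distance holds the Hamming distance d_min found by bfMatch.
for (const cv::DMatch &m : matches) {
    const cv::KeyPoint &k1 = keypoints1[m.queryIdx];  // hypothetical name
    const cv::KeyPoint &k2 = keypoints2[m.trainIdx];  // hypothetical name
    std::cout << k1.pt << " <-> " << k2.pt << ", hamming = " << m.distance << std::endl;
}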

The final result is as follows

[Figure: brute-force ORB matching result between the two test images]
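For comparison, the pure-library route mentioned at the top, using OpenCV's built-in ORB detector and brute-force Hamming matcher (a sketch, not the course solution; file names are placeholders):

#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::Mat img1 = cv::imread("1.png", 0);  // placeholder paths, grayscale
    cv::Mat img2 = cv::imread("2.png", 0);

    cv::Ptr<cv::ORB> orb = cv::ORB::create();
    std::vector<cv::KeyPoint> kp1, kp2;
    cv::Mat d1, d2;
    orb->detectAndCompute(img1, cv::noArray(), kp1, d1);  // detect + describe in one call
    orb->detectAndCompute(img2, cv::noArray(), kp2, d2);

    cv::BFMatcher matcher(cv::NORM_HAMMING);  // Hamming norm for binary descriptors
    std::vector<cv::DMatch> matches;
    matcher.match(d1, d2, matches);

    cv::Mat out;
    cv::drawMatches(img1, kp1, img2, kp2, matches, out);
    cv::imshow("opencv orb matches", out);
    cv::waitKey(0);
    return 0;
}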

