For this we need a function to transform the (alpha, r) pair from robot coordinates to world coordinates.
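The geometry is easy: for a robot at world pose (x, y, theta), the line's normal direction simply rotates with the heading, alphaW = alpha + theta, while the distance changes by the projection of the robot's position onto that normal, rW = r + x*cos(alphaW) + y*sin(alphaW). In MATLAB: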
function result = robot2wrldLSQ(lsq, scannerPose)
    % lsq = [alpha r] of the line in robot coordinates
    % scannerPose = [x y theta] of the robot in world coordinates
    alphaW = lsq(1) + scannerPose(3);  % rotate the line normal by the heading
    rW = lsq(2) + scannerPose(1)*cos(alphaW) + scannerPose(2)*sin(alphaW);  % shift r by the projected robot position
    result = [alphaW rW];
end
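As a quick sanity check with made-up numbers: a wall 2 m straight in front of a robot standing at (3, 0) with heading 0 rad should lie 5 m from the world origin (this is in fact the estimated4 line reported below):

scannerPose = [3 0 0];   % [x y theta] of the robot in world coordinates
lsqRobot = [0 2];        % wall straight ahead, 2 m away, in robot coordinates
lsqWorld = robot2wrldLSQ(lsqRobot, scannerPose)
% lsqWorld = [0 5], i.e. alphaW = 0 rad, rW = 5 m, as expected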
The difference is visible at scanPose2, where the two pairs (robot vs. world coordinates) actually differ.
This small change has been made to the code:
estimated11 = lsqline(line1)                          % LSQ line in robot coordinates
estimated1 = robot2wrldLSQ(estimated11, scannerPose)  % the same line in world coordinates
line1 = lsr2wrld(line1, scannerPose);  % the scanned points in world coordinates (for nicer plotting)
plot(...);
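lsr2wrld itself is not shown here; below is a minimal sketch of what such a helper could look like, assuming each scan point is stored as an [angle range] row in the scanner frame and Cartesian world points are wanted for plotting (the actual course code may differ):

function ptsW = lsr2wrld(scan, scannerPose)
    % polar scan points -> Cartesian points in the robot frame
    xR = scan(:,2) .* cos(scan(:,1));
    yR = scan(:,2) .* sin(scan(:,1));
    % rotate by the heading and translate by the robot's world position
    th = scannerPose(3);
    xW = scannerPose(1) + xR*cos(th) - yR*sin(th);
    yW = scannerPose(2) + xR*sin(th) + yR*cos(th);
    ptsW = [xW yW];
end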
The result is:
estimatedN =   alpha    r
estimated1 =   1.5708   1.0000
estimated2 =  -1.5708   1.0000
estimated4 =   0.0000   5.0000
It can be seen that the robot only scanned 3 different (colored) lines from the point (3, 0), facing angle 0 rad, while the LSQ lines are drawn from the world origin.
Also, the algorithm displays estimated LSQ lines only for lines 1, 2 and 4.
Compare the estimates to those from problem 6.
After running the algorithm for scannerPose1, we got the two sets of estimates:
estimatedNN =  alpha    r      % estimate in robot coordinates
estimatedN =   alpha    r      % estimate in world coordinates

estimated11 =  1.5708   1.0000
estimated1 =   1.5708   1.0000
estimated22 = -1.5708   1.0000
estimated2 =  -1.5708   1.0000
We can conclude that the algorithm works as intended.