Geometric projections of 3D points into 2D camera pixels #65
-
I want to see the code of the function. Sorry for the noob questions!
Replies: 20 comments
-
Here it is:
-
@pattacini I am not able to reproduce the output of get2DPixel. To debug this I created a sphere centred at world coordinates [5,2,10]. I then project this point to pixel coordinates and check whether the pixel coordinates are right with respect to the image that I get from the robot camera. When all the joints of the iCub are in their initial position (0 degrees) both codes give me similar results and they are both almost right (their projection is not exactly on the right spot). I guess that what I am doing wrong is the computation of the matrix that transforms points from the root coordinate frame to the camera frame. Here's the code I'm using to compute that matrix (which I called M):
//Torso
std::string robotName="icubSim";
std::string remotePorts="/";
remotePorts+=robotName;
remotePorts+="/torso";
std::string localPorts="/torso/client";
Property options;
options.put("device", "remote_controlboard");
options.put("local", localPorts.c_str()); //local port names
options.put("remote", remotePorts.c_str()); //where we connect to
// create a device
PolyDriver torsoDevice(options);
if (!torsoDevice.isValid()) {
printf("Device not available. Here are the known devices:\n");
printf("%s", Drivers::factory().toString().c_str());
return 0;
}
IEncoders *torso_encs;
bool ok;
ok = torsoDevice.view(torso_encs);
//Head
remotePorts="/";
remotePorts+=robotName;
remotePorts+="/head";
localPorts="/head/client";
options.put("device", "remote_controlboard");
options.put("local", localPorts.c_str()); //local port names
options.put("remote", remotePorts.c_str()); //where we connect to
// create a device
PolyDriver headDevice(options);
if (!headDevice.isValid()) {
printf("Device not available. Here are the known devices:\n");
printf("%s", Drivers::factory().toString().c_str());
return 0;
}
IEncoders *head_encs;
ok = headDevice.view(head_encs) && ok;
if (!ok) {
printf("Problems acquiring interfaces\n");
return 0;
}
int ntj=0;
torso_encs->getAxes(&ntj);
Vector t_encoders;
t_encoders.resize(ntj);
int nhj=0;
head_encs->getAxes(&nhj);
Vector h_encoders;
h_encoders.resize(nhj);
// read the current joint values (degrees); without these calls the vectors stay uninitialized
torso_encs->getEncoders(t_encoders.data());
head_encs->getEncoders(h_encoders.data());
Vector q(8);
iCubEye *eye = new iCubEye("right_v2");
// the torso control board reports (yaw, roll, pitch) while the iKin eye chain
// expects (pitch, roll, yaw), hence the reversed indices; iKin also works in
// radians, whereas the encoders report degrees
q[0]=CTRL_DEG2RAD*t_encoders[2];
q[1]=CTRL_DEG2RAD*t_encoders[1];
q[2]=CTRL_DEG2RAD*t_encoders[0];
q[3]=CTRL_DEG2RAD*h_encoders[0];
q[4]=CTRL_DEG2RAD*h_encoders[1];
q[5]=CTRL_DEG2RAD*h_encoders[2];
q[6]=CTRL_DEG2RAD*h_encoders[3];
q[7]=CTRL_DEG2RAD*(h_encoders[4]+h_encoders[5]/(-2.0));//head[4]+head[5]/(isLeft?2.0:-2.0)
Matrix M = SE3inv(eye->getH(q));
In regard to the get2DPixel function, I have no idea why it is not working. Thank you very much and sorry for the long post!
-
I think that you have to either change the issue's name, or close it and open a new one.
-
@DanielePucci's suggestion endorsed ➡️ title changed.
Hi @jpiabrantes
Apparently, you're not taking into account the transformation between the world frame and the robot root frame. Find out more on this in the wiki; this might explain your wrong results. Secondly, you're basically replicating the functionality of the Gaze Control Interface by opening all the required motor interfaces and then manipulating the eye kinematics. Thus, I'd warmly recommend using the appropriate tool instead.
-
You might find useful this old piece of code I wrote to investigate geometric triangulation within the simulated world.
-
@pattacini I am taking into account the transformation between the world frame and the robot frame. I just posted the code that computes the matrix M because, if the code works when the camera is at its origin and stops working when it changes position, I assume that the problem is in the transformation to the camera frame. The complete set of transformations that I do is:
I am going to look into your old piece of code this weekend. Thank you very much for the quick replies.
-
I am not using the Gaze Control Interface because I need to project thousands of points onto the image at each iteration. I don't want to call get2DPixel a thousand times because it won't be fast enough. If there was a function in the Gaze Control Interface that returned the correct M matrix, I think my problems would be solved.
-
@jpiabrantes I had this problem in the past for the exact same reason (i.e. thousands of points to project on the image plane). As such, you might find it helpful to dig into this method: https://github.com/robotology/peripersonal-space/blob/master/modules/visuoTactileRF/vtRFThread.cpp#L1090
-
Btw, your method and the
-
The services to return the M matrix directly are not available. The correct way is then to resort to the kinematics of the eye, as you were already doing and as @alecive did for his task.
-
Thank you very much for the help. I have been trying to debug this all day but I still couldn't do it. The code does the following:
code, main.cpp: // -*- mode:C++; tab-width:4; c-basic-offset:4; indent-tabs-mode:nil -*-
#include <stdio.h>
#include <yarp/os/all.h>
#include <yarp/os/Network.h>
#include <yarp/os/Time.h>
#include <yarp/dev/ControlBoardInterfaces.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/sig/Vector.h>
#include <yarp/sig/Matrix.h>
#include <yarp/dev/GazeControl.h>
#include <yarp/math/Math.h>
using namespace yarp::dev;
using namespace yarp::sig;
using namespace yarp::os;
using namespace yarp::math;
int main(int argc, char *argv[]){
Network yarp;
//Create Sphere
Vector X(4);//World Homogeneous Coordinates
X[0]=5;X[1]=3;X[2]=10;X[3]=1;
RpcClient create_port;
create_port.open("/create/sphere/client");
create_port.addOutput("/icubSim/world");
Bottle cmd;
cmd.addString("world");
cmd.addString("mk");
cmd.addString("ssph");
cmd.addDouble(0.1);
for (int i=0;i<3;i++)
cmd.addDouble(X[i]);
cmd.addDouble(255.);
cmd.addDouble(255.);
cmd.addDouble(255.);
create_port.write(cmd);
//move head
Property optionsHead;
optionsHead.put("device", "remote_controlboard");
optionsHead.put("local", "/head/client");
optionsHead.put("remote", "/icubSim/head");
PolyDriver headDevice(optionsHead);
if (!headDevice.isValid()) {
printf("Device not available. Here are the known devices:\n");
printf("%s", Drivers::factory().toString().c_str());
return 0;
}
IPositionControl *pos;
headDevice.view(pos);
Vector pos_cmd(6);
pos_cmd[0] = 14;
pos_cmd[1] = 40;
pos_cmd[2] = 54;
pos_cmd[3] = 3;
pos_cmd[4] = 51;
pos_cmd[5] = 69;
pos->positionMove(pos_cmd.data());
bool done = false;
while (!done){
pos->checkMotionDone(&done);
}
Time::delay(5.0); // just to make sure the robot's head is stable
//Change World Coordinates to Root Coordinates
Matrix W2R(4,4); //World 2 Root Matrix
W2R[0][0]=0; W2R[0][1]=0; W2R[0][2]=-1; W2R[0][3]=-0.0260;
W2R[1][0]=-1; W2R[1][1]=0; W2R[1][2]=0; W2R[1][3]=0;
W2R[2][0]=0; W2R[2][1]=1; W2R[2][2]=0; W2R[2][3]=-0.5976;
W2R[3][0]=0; W2R[3][1]=0; W2R[3][2]=0; W2R[3][3]=1;
X = W2R*X; //sphere in root coordinates
//Where is the sphere in pixel coordinates? Using get2DPixel
Vector u(2);
Property option;
option.put("device","gazecontrollerclient");
option.put("remote","/iKinGazeCtrl");
option.put("local","/client/gaze");
PolyDriver clientGazeCtrl(option);
IGazeControl *igaze=NULL;
if (clientGazeCtrl.isValid()) {
clientGazeCtrl.view(igaze);
}
igaze->get2DPixel(1,X,u);
printf("result of get2DPixel is x:%f and y:%f\n",u[0],u[1]);
//Where is the sphere in pixel coordinates? Using the getRightEyePose()
//get M matrix
Vector x(3);
Vector od(4);
igaze->getRightEyePose(x,od);
Matrix M=axis2dcm(od); // 4x4 homogeneous matrix from axis-angle
for (int i=0; i<3; i++)
M[i][3]=x[i]; // translation goes in the last column, not the last row
M=SE3inv(M);
//Prj matrix
Matrix Prj(3,4);
Prj[0][0]=257.34; Prj[0][1]=0.00; Prj[0][2]=160.00; Prj[0][3]=0.00;
Prj[1][0]=0.00; Prj[1][1]=257.34; Prj[1][2]=120.00; Prj[1][3]=0.00;
Prj[2][0]=0.00; Prj[2][1]=0.00; Prj[2][2]=1.00; Prj[2][3]=0.00;
Vector V = Prj*M*X;
V=V/V[2];
printf("result of getRightEyePose is x:%f and y:%f\n",V[0],V[1]);
return 0;
}
To compile the code I'm using the following CMakeLists.txt:
And then I run the following commands in the terminal:
It will then print the pixel coordinates achieved by the two methods. Thank you very much for the help.
-
I have also tested my original code and I noticed that if I change the value of the last three head joints the M matrix doesn't change, which obviously shouldn't happen.
int main(int argc, char *argv[]){
Network yarp;
Vector q(8);
q.zero();
iCubEye *eye = new iCubEye("right"); //also tried with right_v2 and obtained similar results
Matrix M1 = eye->getH(q);
for (int i=0; i<8; i++){
printf("%d\n",i); //which joint I am changing
q[i] = 15*M_PI/180; //change a joint
Matrix M2 = eye->getH(q);
for (int r=0; r<4; r++){
for (int c=0;c<4;c++){//the difference between M1 and M2 is all zeros when I change the joints 5,6 and 7.
printf("%f ",M1[r][c]-M2[r][c]);
}
printf("\n");
}
q[i]=0; //make the joint angle 0 again
}
return 0;
}
-
@jpiabrantes I'm a bit busy these days, sorry.
-
Hi @jpiabrantes
To start with, I wanted to test the iKinGazeCtrl functionalities in a setting very similar to your case.

Launching the modules
First you launch the simulator along with iKinGazeCtrl:
>> iCub_SIM
>> iKinGazeCtrl --from configSim.ini
The configSim.ini should contain the attached options, while the icubSimEyes.ini should look like the attached file (camera resolution of 320x240 pixels). Try to launch:
>> yarp resource --context iKinGazeCtrl --find configSim.ini
>> yarp resource --context cameraCalibration --find icubSimEyes.ini
to double check whether those files will actually be loaded by iKinGazeCtrl at start-up.

Creating the sphere
Then, you create a static sphere at position Pw = (0.0, 1.0, 1.0) in world coordinates:
>> echo "world mk ssphr 0.02 0.0 1.0 1.0 1 0 0" | yarp rpc /icubSim/world
Now, you could open up a yarpview and click on the sphere to read back its pixel coordinates:
>> yarpview --name /test --out /test/out &
>> yarp connect /icubSim/cam/right /test
>> yarp read ... /test/out
Whenever you click on the red sphere you should obtain a pixel whose coordinates you can compare with the queries below.

Querying iKinGazeCtrl
Now we reach the crucial part. The sphere expressed in the root frame is Pr = (-1.026, 0.0, 0.4024). Therefore:
>> echo "get 2D (right -1.026 0.0 0.4024)" | yarp rpc /iKinGazeCtrl/rpc
returns 🎉:
Response: [ack] (152.039528 103.523093)
which is our pixel! Likewise, we can query the right-eye pose:
>> echo "get pose right" | yarp rpc /iKinGazeCtrl/rpc
The response is:
Response: [ack] (-0.062836 0.034001 0.340795 -0.575685 -0.578202 0.57816 2.096962) (...)
whose first three numbers account for the position, whereas the last four numbers account for the orientation in axis-angle notation. We have eventually all the ingredients for the check: our pixel is recovered as p = Prj*SE3inv(H)*Pr, where H is the homogeneous matrix built from the pose above and Prj is the camera projection matrix.
Test passed! 🍰
-
@jpiabrantes checking others' code tends to be much more time consuming. Next time, consider also posting your snippets in
-
@pattacini Thank you very much for the help. For example:

Creating the sphere
I create a static sphere with Pw = (5.0, 3.0, 10.0):
>> echo "world mk ssphr 0.1 5.0 3.0 10.0 1 0 0" | yarp rpc /icubSim/world
I use the yarpmotorgui to change the head joints away from their origin:
>> yarpmotorgui
In the head tab I change Joint 1, Joint 2 and Joint 3 to 60, 55 and -20 degrees respectively. Note that Joint 1 is not the first joint, because counting starts at 0.

Using get2DPixel()
I have to find the root coordinates first: Pr = SE3inv(T)*Pw = (-10.026, -5.0, 2.4024). Therefore:
>> echo "get 2D (right -10.026 -5.0 2.4024)" | yarp rpc /iKinGazeCtrl/rpc
returns:
Response: [ack] (311.080534 155.434653)

Using getRightEyePose()
>> echo "get pose right" | yarp rpc /iKinGazeCtrl/rpc
returns:
Response: [ack] (-0.067772 0.059091 0.320938 -0.295569 -0.342242 0.891913 3.049125) (18789 1445898216.848155)
Doing the math p = Prj*SE3inv(M)*SE3inv(T)*Pw, I get:
p[0]/p[2] = 311.07936419;
p[1]/p[2] = 155.43556015;
So the two methods agree with each other, but against the pixel where the sphere actually appears in the image there is an error of 14.6 px, and I have seen worse:
sqrt((311.07936419-301)^2+(155.43556015-166)^2) = 14.6
Do you also get these errors in this situation?
-
I've been playing around a bit with the code. Interestingly, I've been observing this weird behavior only when moving the last head joints. Good catch @jpiabrantes, thanks for spotting this out. I believe it is worth opening a dedicated issue. Stay tuned!
-
@jpiabrantes
-
@pattacini yeap! All good 👍