Kinect: Depth RAW
Posted: 27 November 2010 11:48 AM
New Member | Total Posts: 6 | Joined 2010-11-27

Has anyone been able to map depth using Depth RAW? I know the RAW value is an 11-bit integer (0-2047) and that the usual approximation is depth (cm) = 100 / (-0.00307 * disparity + 3.33), but I keep getting mixed results.

C# code would be awesome, but OpenCV would work.
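
For reference, this is roughly how I am applying that formula at the moment (plain C for brevity; the frame layout and the cut-off values are just my assumptions):

============================================
#include <stdio.h>

// Convert one 11-bit RAW value (0..2047) to centimetres with the formula above.
static double raw_to_cm(unsigned short raw)
{
    return 100.0 / (-0.00307 * (double)raw + 3.33);
}

int main(void)
{
    // Placeholder 640x480 frame of RAW values; fill this from whatever driver you use.
    static unsigned short depth_raw[480][640];

    for (int y = 0; y < 480; y++)
        for (int x = 0; x < 640; x++)
        {
            unsigned short raw = depth_raw[y][x];
            if (raw == 0 || raw >= 2047)      // 0 and 2047 seem to mean "no reading"
                continue;
            double cm = raw_to_cm(raw);
            if (y == 240 && x == 320)
                printf("centre pixel: RAW %u -> %.1f cm\n", raw, cm);
        }
    return 0;
}
============================================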


Thanks!

Posted: 27 November 2010 01:50 PM   [ # 1 ]
New Member | Total Posts: 6 | Joined 2010-11-27

Hello,
I am new to the forum, so please bear with my probably bad English. I am from Aldebaran ;.-)

First of all: AlexP's CL NUI Driver is a handsome piece of work!

I am also trying to reconstruct spatial coordinates from the depth/RAW output.
The value returned by GetNUICameraDepthFrameRAW (let's call it the Z-value) appears to lie in a range of about 400 to 1048.

It is not related to the real z distance in a linear way.

I measured some raw-value/distance pairs:

RAW     cm
—————————
 417     46
 540     61.5
 608     71
 682     84
 774    109
 794    117
 881    165
 918    198
 920    204
 946    238
 978    310
 977    318
1022    487
1041    650

I am trying to find a regression for that.
Polynomial regression does not seem to fit.
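
For comparison, a reciprocal-linear fit over the table above (the same shape as the formula in the first post) could look like this rough sketch; I have not verified it against more data:

============================================
#include <stdio.h>

// Fit 1/distance = p*RAW + q by linear least squares over the measured pairs above.
// This is the same model as 100 / (-0.00307*RAW + 3.33) from the first post
// (there the distance is in cm, so p and q come out scaled by 1/100).
int main(void)
{
    const double raw[] = { 417, 540, 608, 682, 774, 794, 881, 918, 920, 946, 978, 977, 1022, 1041 };
    const double cm[]  = {  46, 61.5, 71,  84, 109, 117, 165, 198, 204, 238, 310, 318,  487,  650 };
    const int n = (int)(sizeof(raw) / sizeof(raw[0]));

    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int k = 0; k < n; k++)
    {
        double y = 1.0 / cm[k];          // work with the reciprocal of the distance
        sx += raw[k]; sy += y; sxx += raw[k]*raw[k]; sxy += raw[k]*y;
    }
    double p = (n*sxy - sx*sy) / (n*sxx - sx*sx);
    double q = (sy - p*sx) / n;

    printf("distance_cm(RAW) = 1 / (%g * RAW + %g)\n", p, q);
    return 0;
}
============================================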

Also, the raw-data distance does not appear to be the polar (radial) distance; it seems to be preprocessed by the sensor and is already projected onto the z-axis.
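
To make the geometry concrete, here is a minimal back-projection sketch assuming a simple pinhole model (the ~600 px focal length and the image centre are assumptions, not calibrated values):

============================================
#include <stdio.h>

// Back-project a pixel (u,v) with depth z_cm (already measured along the z-axis)
// into 3D, assuming a pinhole model with guessed intrinsics.
static void pixel_to_xyz(double u, double v, double z_cm, double out[3])
{
    const double f  = 600.0;   // assumed focal length in pixels
    const double cx = 320.0;   // assumed image centre
    const double cy = 240.0;

    out[0] = (u - cx) * z_cm / f;
    out[1] = (v - cy) * z_cm / f;
    out[2] = z_cm;             // the RAW depth is the z component, not the radial distance
}

int main(void)
{
    double p[3];
    pixel_to_xyz(400, 300, 200, p);   // pixel (400,300) at 200 cm
    printf("x=%.1f  y=%.1f  z=%.1f cm\n", p[0], p[1], p[2]);
    return 0;
}
============================================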

Oliver Kreylos seems to have solved this problem, but his code is based on a quite complex calibration from his VR libraries, and I could not get much out of it.

As soon as I find a solution I will post it.
Meanwhile, any hint is appreciated. grin

Thrilling stuff!

Posted: 27 November 2010 05:59 PM   [ # 2 ]
Jr. Member | Total Posts: 47 | Joined 2010-02-15
edosw - 27 November 2010 01:50 PM:
[quoted post # 1 in full; see above]

I am not sure the Kinect can operate at the temperatures on Aldebaran. Maybe that explains your issue. smile

Posted: 27 November 2010 07:58 PM   [ # 3 ]
New Member | Total Posts: 6 | Joined 2010-11-27

How are you guys getting the raw values?

Posted: 28 November 2010 09:15 AM   [ # 4 ]
New Member | Total Posts: 6 | Joined 2010-11-27

Hello Boris,

I have found an approximate solution of the form y = a * tan((x - b) * c) + d.
Using CLNUIDevice.dll is fairly easy: just include CLNUIDevice.h and link against the lib and dll.

Here is some code to read out the depth map and map it into real spatial coordinates
(sorry for posting that much...):
============================================


#include <math.h>        // sqrt, tan
#include <stdlib.h>      // malloc, free
#include "CLNUIDevice.h" // CL NUI SDK (CreateNUICamera, GetNUICameraDepthFrameRAW, ...)

typedef double point3[3];

//————————————————————————————————————
  int vecNorm( point3 p, point3 res )  // normalize p into res; returns 0 for a zero vector
//————————————————————————————————————
{
  int i;
  double s;
  for (s=0.0, i=0; i<3; i++) s += p[i]*p[i];
  if (s == 0.0) return 0;
  s = sqrt(s);
  for (i=0; i<3; i++) res[i] = p[i]/s;
  return 1;
}
//————————————————————————————————————
  void vecScale( point3 p, double s )  // scale p in place by s
//————————————————————————————————————
{ p[0]*=s; p[1]*=s; p[2]*=s; }

//————————————————————————————————————
  double vecMult( point3 a, point3 b ) // dot product
//————————————————————————————————————
{
  return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}


//———————————————————————————————————————————
double *map2space( double *pp )
//———————————————————————————————————————————
// just an approximate solution from empirical data
{
static double p[3];

// pp[0]  0..480    (image row, i)
// pp[1]  0..640    (image column, j)
// pp[2]  300..1090 (RAW depth value)

// empirical fit: cm = a * tan((RAW - b) * c) + d
double a,b,c,d;
a=50;
b=0;
c=0.0014410975475182537791113043042567;
d=12;

double s,sc;
s=pp[2];

s=a*tan((s-b)*c)+d; // map Kinect RAW -> cm

// viewing ray through the pixel, relative to the image centre
p[0]=(pp[0]-240);
p[1]=-(pp[1]-320);
p[2]=620;  // approximate focal length in pixels

double n[3]={0,0,1};

vecNorm(p,p);
sc=1/vecMult(p,n);  // 1/cos(phi): the RAW depth is already projected onto the z-axis,
                    // so stretch along the viewing ray to recover the full 3D point
vecScale(p,s*sc);
return p;
}

//———————————————————————————————————————————
void DoCam()
//———————————————————————————————————————————
{
CLNUICamera cam;
PUSHORT pData;

cam=CreateNUICamera();
StartNUICamera(cam);

// 640x480 buffer of 16-bit RAW depth values
pData=(PUSHORT)malloc(640*480*sizeof(USHORT));

bool ret;
ret=GetNUICameraDepthFrameRAW(cam, pData, 500); // 500 ms timeout

PUSHORT lpU;
unsigned short u;

int i,j;
double p[3]={0,0,0},*pS;

int m=480; // Meshcount in X
int n=640; // Meshcount in Y

p[2]=0;

// Read out the depth buffer, convert each pixel into a point in 3D space
if (ret)
  for (lpU=pData,i=0;i<m;i++)
    for (j=0;j<n;j++)
    {
      u=*lpU;
      lpU++;
      p[0]=(double)i;
      p[1]=(double)j;
      if (u > 0 && u < 2044)
        p[2]=(double)u;

      pS=map2space(p);

      NOW_HAVE_SOME_FUN_WITH_THAT_POINT_IN_3D( pS ); //:-))
    }

free(pData);
StopNUICamera(cam);
DestroyNUICamera(cam);
}
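
A quick way to sanity-check the tan() fit against the measured pairs from post # 1 (just a standalone sketch; the constants are copied from map2space above):

============================================
#include <math.h>
#include <stdio.h>

// Print the fitted distance next to the measured one for each RAW/cm pair.
int main(void)
{
    const double a = 50, b = 0, c = 0.0014410975475182537791113043042567, d = 12;
    const double raw[] = { 417, 540, 608, 682, 774, 794, 881, 918, 920, 946, 978, 977, 1022, 1041 };
    const double cm[]  = {  46, 61.5, 71,  84, 109, 117, 165, 198, 204, 238, 310, 318,  487,  650 };

    for (int k = 0; k < (int)(sizeof(raw) / sizeof(raw[0])); k++)
    {
        double fit = a * tan((raw[k] - b) * c) + d;
        printf("RAW %4.0f   measured %6.1f cm   fitted %6.1f cm\n", raw[k], cm[k], fit);
    }
    return 0;
}
============================================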

Image Attachments
cat3d.jpg
Posted: 28 November 2010 08:15 PM   [ # 5 ]
New Member | Total Posts: 6 | Joined 2010-11-27

My library has been released. It’s called NKinect and it’s available for preview download over at http://www.nkinect.com

Posted: 03 December 2010 06:14 AM   [ # 6 ]
New Member | Total Posts: 6 | Joined 2010-11-11

Hey Boris, I found your library very useful. I will get in touch on CodePlex for further testing. Compliments! smile
