My application is more sensitive to the minimum camera frame rate than to the average frame rate.
Using the SDK, I have a loop with only the following:
cvGetImageRawData(pCapImage, &pCapBuffer);
CLEyeCameraGetFrame(_cam, pCapBuffer);
tickFPS = ::GetTickCount();
timeFPS = tickFPS - oldTickFPS;
oldTickFPS = tickFPS;
fps = 1000.0f / (float)timeFPS;
if (fps < tMin)
    tMin = fps;
if (fps > tMax)
    tMax = fps;
tAvg = ((tAvg * cnt) + fps) / (cnt + 1);
cnt += 1;
printf("fps = %g\n", fps);
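For context, here is roughly how the whole loop sits in my code. The camera setup and the _running flag come from the SDK's CLEyeCameraCapture sample class, and the declarations and initial values shown here are illustrative rather than the exact code:

// Sketch of the measurement loop in context (CL-Eye SDK + OpenCV C API).
// Needs windows.h, cv.h, and CLEyeMulticam.h; camera/image creation omitted.
DWORD tickFPS, timeFPS;
DWORD oldTickFPS = ::GetTickCount();
float fps, tMin = 1e6f, tMax = 0.0f, tAvg = 0.0f;
int   cnt = 0;

while (_running)                                  // loop flag from the SDK sample class
{
    cvGetImageRawData(pCapImage, &pCapBuffer);    // get pointer to the IplImage pixel buffer
    CLEyeCameraGetFrame(_cam, pCapBuffer);        // blocks until the next frame arrives

    tickFPS    = ::GetTickCount();                // milliseconds since boot, system-timer resolution
    timeFPS    = tickFPS - oldTickFPS;            // whole-millisecond delta between consecutive grabs
    oldTickFPS = tickFPS;

    fps = 1000.0f / (float)timeFPS;               // instantaneous frame rate
    if (fps < tMin) tMin = fps;
    if (fps > tMax) tMax = fps;
    tAvg = ((tAvg * cnt) + fps) / (cnt + 1);      // running average over cnt frames
    cnt += 1;

    printf("fps = %g\n", fps);
}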
With cam = new CLEyeCameraCapture(windowName, guid, CLEYE_COLOR_PROCESSED, CLEYE_VGA, 60), I see this (every 15th frame reads 31.25 fps):
fps = 66.6667
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 31.25
fps = 66.6667
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 31.25
fps = 66.6667
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 66.6667
fps = 62.5
fps = 62.5
fps = 66.6667
fps = 31.25
....
Even if I use 30 fps, I don't see a consistent frame rate:
With cam = new CLEyeCameraCapture(windowName, guid, CLEYE_COLOR_PROCESSED, CLEYE_VGA, 30), I see this (occasionally the frame rate is only 21 fps):
fps = 32.2581
fps = 32.2581
fps = 31.25
fps = 32.2581
fps = 32.2581
fps = 32.2581
fps = 21.2766
fps = 32.2581
fps = 31.25
fps = 32.2581
fps = 32.2581
fps = 32.2581
fps = 31.25
fps = 32.2581
fps = 21.2766
fps = 32.2581
fps = 32.2581
fps = 31.25
fps = 32.2581
fps = 32.2581
fps = 32.2581
fps = 21.2766
fps = 32.2581
fps = 31.25
fps = 32.2581
fps = 32.2581
fps = 32.2581
fps = 31.25
fps = 32.2581
fps = 21.2766
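Incidentally, every value printed above is 1000 divided by a whole number of milliseconds, which is exactly what the code computes from the GetTickCount deltas:

1000 / 15 ms = 66.6667 fps
1000 / 16 ms = 62.5 fps
1000 / 31 ms = 32.2581 fps
1000 / 32 ms = 31.25 fps
1000 / 47 ms = 21.2766 fps

Since GetTickCount typically has a resolution of only 10-16 ms, I realize some of this jitter could be the timer rather than the camera, but I don't know how much.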
Does this make sense?
Will the delay between frames vary this much?
Even though the delay between frames varies, can I assume the camera itself is actually capturing frames at a constant rate?
Is there a better way to measure actual camera frame rate?
Thanks,
Dave Thomas