Q: Buffers and timestamps?
Posted: 09 April 2010 06:26 PM by DavidJDBell (New Member)

Hello, just a couple of related questions.

In my application latency is not important, but it is important that no frames are dropped, or at least that I know when they are.

Currently I have one thread per camera collecting frames, buffering and labeling them for later processing in a separate thread. But at 30 fps, what happens if Windows puts the collector thread to sleep for more than 33 ms?

Does CLEyeMulticam buffer images from the cameras in any way? Also, is there any way to get a timestamp of, say, when the data arrived at the USB bus, to double-check for dropped frames?

Thanks in advance

DB

Posted: 10 April 2010 12:42 AM [ # 1 ] by pavele (New Member)

Hi, I am interested in this information as well. Ideally each frame should have a time stamp and a frame number starting from 0/1 at the moment capture is triggered. This would let applications register frame drops and delays and react accordingly.
How possible is that with the SDK?

Posted: 11 April 2010 08:18 AM [ # 2 ] by AlexP (Administrator)

DavidJDBell - 09 April 2010 06:26 PM

Does CLEyeMulticam buffer images from the cameras in any way? Also, is there any way to get a timestamp ... to double-check for dropped frames?

The frames are internally queued by the driver, so you won’t lose any frames, assuming they were received by the driver. Unless, of course, you block for too long. In other words, as long as your capture thread can keep up with the real-time stream, you will be fine.
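
For illustration, a minimal capture-thread sketch along those lines (assuming the CLEyeCameraGetFrame(cam, pData, waitTimeout) call from CLEyeMulticam.h; FRAME_SIZE and PushToProcessingQueue are hypothetical):

#include <windows.h>
#include <stdlib.h>
#include "CLEyeMulticam.h"                       // CLEyeCameraInstance, CLEyeCameraGetFrame

#define FRAME_SIZE (640 * 480 * 4)               // assumes 640x480 RGBA capture

volatile bool g_running = true;

DWORD WINAPI CaptureThread(LPVOID param)
{
    CLEyeCameraInstance cam = (CLEyeCameraInstance)param;
    PBYTE buf = (PBYTE)malloc(FRAME_SIZE);
    while (g_running)
    {
        // Blocks until the driver hands over the next queued frame (2 s timeout).
        if (CLEyeCameraGetFrame(cam, buf, 2000))
        {
            PushToProcessingQueue(buf);          // hypothetical hand-off to a worker thread
            buf = (PBYTE)malloc(FRAME_SIZE);     // fresh buffer; the worker frees the old one
        }
    }
    free(buf);
    return 0;
}

The point is that the capture thread does nothing but dequeue and hand off, so it can always keep up with the real-time stream.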

Posted: 11 April 2010 08:22 AM [ # 3 ] by AlexP (Administrator)

pavele - 10 April 2010 12:42 AM

Ideally each frame should have a time stamp and a frame number ... How possible is that with the SDK?

A frame number has very little meaning, since your code will get every frame that was received by the driver. On the other hand, if a frame was dropped due to USB bus contention, the driver would not know about it (the camera does not send any frame sequence number) and would simply generate the next frame number in sequence.

Posted: 11 April 2010 10:00 AM [ # 4 ] by DavidJDBell (New Member)

Thanks Alex

Would it be useful, at some point in the future, to add a SYSTEMTIME& argument to the GetFrame function that reports the time the driver first saw the frame?

DB

Posted: 11 April 2010 11:42 AM [ # 5 ] by pavele (New Member)

Thanks Alex. You are right, indeed. But there is still a chance that the capture thread gets delayed, even just once and only briefly. From that point onwards there will be a data calculation error in the client app, because the synchronization algorithm is not aware of the frame drop. I agree this may be a rare occasion, but when data accuracy is a must, frame numbering and timing are “must-have” features smile
Maybe add a version of CLEyeCameraGetFrame with two more parameters?
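
Purely as a hypothetical sketch (this signature does not exist in the current SDK), something like:

// Hypothetical variant: the driver would fill in a running frame number and
// the time it first saw the frame. Neither out-parameter exists today.
bool CLEyeCameraGetFrameEx(CLEyeCameraInstance cam, PBYTE pData, int waitTimeout,
                           UINT32* pFrameNumber,     // hypothetical
                           SYSTEMTIME* pTimestamp);  // hypothetical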

Regards, Pavel.

Posted: 11 April 2010 06:34 PM [ # 6 ] by AlexP (Administrator)

pavele - 11 April 2010 11:42 AM

... when data accuracy is a must, frame numbering and timing are “must-have” features ... Maybe add a version of CLEyeCameraGetFrame with two more parameters?

Of course it is possible that your thread gets delayed, but since it is processing data faster than real time, it will quickly catch up with the video stream, and you will not have the problem you are describing. In fact, the video frames from the camera come at perfectly even intervals (since the camera hardware runs and captures on a perfectly fixed clock), even if your application does not receive them at those intervals. Therefore any kind of software time-stamping is meaningless, because it only accounts for the timing of the USB-driver-Multicam stack, which you and your algorithm really shouldn’t care about. The only sure way to know whether a frame was dropped is to have the capture hardware time-stamp every frame at capture. Unfortunately, the PS3Eye hardware currently does not do that.
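
That said, if one trusts the camera’s fixed clock, a rough software-side heuristic is possible (a sketch only; it relies on exactly the jittery receipt times discussed above, so it can merely flag suspected drops):

// Sketch: snap each frame's receipt time onto the nominal frame grid and
// flag gaps. T is the frame period in seconds (1/30.0 at 30 fps);
// GetSeconds() is a hypothetical monotonic-clock helper.
static double t0; static long expected = 0; static bool first = true;
double t = GetSeconds();
if (first) { t0 = t; first = false; }
long idx = (long)((t - t0) / T + 0.5);    // nearest nominal frame index
if (idx > expected)
    printf("suspected drop of %ld frame(s)\n", idx - expected);
expected = idx + 1;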

Posted: 11 April 2010 09:39 PM [ # 7 ] by pavele (New Member)

Alex, as I see it, you suggest counting the images and calculating their time stamps relative to the first received image, relying on the fact that the hardware generates images on a clock, which is a fact.
This can have two main drawbacks:
- The whole timeline will be based on the time of the first image, which in turn arrives some (unreliable) amount of time after it was actually received by the driver.
- Even if the capture thread is optimized for best performance, it co-exists with other threads (and applications), very likely including some heavy processing that can stall overall performance for a while. For a complicated system this is practically out of control. Even if that happens only for a second or two, it should not compromise the timeline (and therefore the data stream) in any way.

I would guess the buffers at the USB level are cyclic and get overwritten even if not read out by the user. So if there is even a small practical chance that one or more frames have been lost without anybody (apart from the USB driver) knowing it, that can compromise the accuracy of the whole system.
This is especially important if we have another data stream, for example an XYZ position that comes from another hardware device. At some point we need to couple the XYZ data and the image as accurately as possible. That can be done only if we have time stamping and/or an image count. Preferably AND. smile
From what you described, I would guess it is possible to have an image count and a time stamp generated by a USB event (e.g. image receive started). As the hardware will always stream images at even intervals down to the USB bus, these image counts and timestamps would be reliable. Whatever happens to the image afterwards is not important, as an image drop can be registered and accounted for by the user of the SDK.
I hope you see my point. Is adding an image count and time stamp foreseeable on your side?

Best regards,
Pavel

Posted: 12 April 2010 11:45 AM [ # 8 ] by AlexP (Administrator)

Pavel, I completely understand what you are saying. It all depends on what time error you can live with. As I hinted in the previous post, synchronizing two hardware devices is only possible if they both: a) run off of the same sync clock, and b) generate timestamps in hardware. Trying to achieve this in software is meaningless, since the OS is not a real-time OS and therefore code execution timing is not guaranteed. The only thing that is known is that camera frames come at discrete time intervals. We also don’t know the exact capture start time, but we can guarantee that the error will be less than T/2 (where T is the frame period), assuming no frames are dropped. Time-stamping in software will only tell you at what time the frame data was first seen by the computer; furthermore, this is the time at which the whole frame was fully received, which adds an additional delay. On top of that, once you add the acquisition delay of your other device, your error goes up even more.
One way to decrease this error is to increase your acquisition rate (i.e. frame rate). Another way, as I said, is to sync your devices in hardware (something we at CL are currently developing).
So having timestamps come strictly from software is a really bad solution. Even more so since, for someone with a slow machine, these timestamps will be way off (and they will complain that there is something wrong with the SDK). Instead of an accurate measurement, you are bringing too many unknowns into play that will totally skew your measurements; they become meaningless and useless for practical purposes.
Even at the driver level, you only get signaled once the full frame has been received, which already adds some time offset to the real frame capture time. Adding software time-stamping to the SDK is possible, but we are currently working on a much better solution. Will keep you posted.
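
To put rough numbers on the T/2 bound:

// Worst-case capture-start alignment error is T/2, assuming no dropped frames:
// at 30 fps, T = 33.3 ms, so the error is below 16.7 ms;
// at 60 fps, T = 16.7 ms, so it drops below 8.3 ms.
printf("max error: %.1f ms at 30 fps, %.1f ms at 60 fps\n",
       1000.0 / 30.0 / 2.0, 1000.0 / 60.0 / 2.0);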

Posted: 12 April 2010 07:38 PM [ # 9 ] by DavidJDBell (New Member)

AlexP - 12 April 2010 11:45 AM

Another way, as I said, is to sync your devices in hardware (something we at CL are currently developing). ... Adding software time-stamping to the SDK is possible, but we are currently working on a much better solution. Will keep you posted.

Hello Alex,

My project needs to link frames with events that occur several seconds later. To do this I am just in the process of setting up an ATMEL (or equivalent PIC) microcontroller to both control the machine and send signals to FSIN, as described in http://codelaboratories.com/forums/viewthread/84/. The micro is a real-time device, so everything should sync perfectly. What I am a little uncomfortable with is that the micro won’t know if a frame is dropped in the PC, hence my original question.

What I am thinking at the moment is that the micro will send a message via RS-232 every time it fires a frame, which should tell me if I have lost a frame, but not when. I am also hoping that holding FSIN LOW will stop frames even when the camera has been started by CL, so something like this should be pretty safe:

Set_FSIN_LOW            // command to micro
Wait_for_FSIN_Confirm   // confirmation message from micro

CLEyeCreateCamera
CLEyeCameraStart        // nothing happens while FSIN is LOW

StartFSINPulses         // command to micro
while (1)
{
    ReadFSINPulseConfirm   // message from micro; note RS-232 has its own
                           // subsystem & buffer, so it always has the true count
    CLEyeGetFrame
    CheckFrameCount
}

Should I be waiting for your solution?

David

PS: I also wondered about using the microphones somehow, cutting them off and pulsing them with the micro, but I’m not sure they would be perfectly synched with the video in Windows anyway.

Posted: 12 April 2010 08:30 PM [ # 10 ] by AlexP (Administrator)

David,

Unfortunately you can’t just send FSIN signals to the camera at arbitrary times; they have to be very precisely timed. Furthermore, holding FSIN low will not prevent the sensor from running. The FSIN input signal is for synchronization only. In your case I would not use the FSIN input; I would use the sync output from the camera to sync your sensor event capture/timestamps.
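
For example, on an ATmega-class micro (a sketch only; the pin choice and the UART helper are assumptions) the camera’s sync output could drive an external interrupt that counts and reports frames:

#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint32_t frame_count = 0;        // one increment per camera sync pulse

ISR(INT0_vect)                            // camera sync output wired to INT0
{
    frame_count++;                        // latch/timestamp sensor data here too
}

int main(void)
{
    EICRA = (1 << ISC01) | (1 << ISC00);  // INT0 fires on rising edge
    EIMSK = (1 << INT0);                  // enable INT0
    sei();                                // global interrupt enable
    for (;;)
    {
        // uart_send_count(frame_count); // hypothetical: report count over RS-232
    }
}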

Posted: 12 April 2010 09:06 PM [ # 11 ] by DavidJDBell (New Member)

OK, thanks Alex. Hopefully I’ll have it set up this weekend to mess around a bit. I’ll start with reading VSYNC with the micro and propagating it to the other three cameras’ FSIN. I did a run in software alone, and four cameras can get out of sync with each other in less than 20 seconds, so I need to do that as a minimum. I’ll let you know how it goes. DB

(Edit) The fallback, as you say, is for the micro to monitor all four VSYNCs and send adjustment messages back to the PC. Syncing four cams on FSIN is much better, though.

Posted: 15 April 2010 04:38 PM [ # 12 ] by AlexP (Administrator)

Let me know how that works for you. In the meantime I am also looking into extending the API to add frame time stamp information that you’ll be able to retrieve along with the image data in your app.

Posted: 23 April 2010 09:56 PM [ # 13 ] by DavidJDBell (New Member)

Hi Alex

It will probably be next weekend by the time I’m done, but I had a bit of a look with a CRO, and it is interesting to note that VSYNC fires all the time, whether you are capturing or not. And as you said, you cannot stop capture by pulling FSIN LOW, because it is in that state anyway.

I thought about it a bit more, and my idea now is that I will still use an MCU to control FSIN on all four cameras and to send a message back to the PC at every frame with the true frame count. When the main thread receives an MCU message it records the system time, then waits for the latest frame from the four collector threads, which also timestamp frames with the system time when CLEyeGetFrame returns.

If all four frame timestamps are within 1/framerate of the MCU message, it is guaranteed they are in sync and that they are a known frame (the last one signalled); the CL queue is also empty at that point. If any of the main, collector, or CL threads gets held up, the timing would be suspect, even if they appear to be the last set of four frames. I don’t need every frame, so I would just discard anything suspect, though I think it would be possible to recover most of them later (e.g. one suspect frame set between two acceptable frame sets is also OK in hindsight).
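
A sketch of that acceptance test (names hypothetical; T is the frame period in seconds):

// Accept a set of four frames only if every receipt timestamp lies within
// one frame period of the MCU message time; otherwise mark the set suspect.
bool accepted = true;
for (int i = 0; i < 4; i++)
    if (fabs(frameTime[i] - mcuMsgTime) >= T)   // needs <math.h>
        accepted = false;
if (!accepted)
    DiscardFrameSet();    // hypothetical: drop now (or rescue in hindsight later)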

The other thing that might be handy in CLEyeGetFrame is an indicator that the driver queue is empty (i.e. this is the last frame available). I guess you could get this by calling CLEyeGetFrame(mCam, 1)?
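
That is, something like this (a guess at the intended use of the timeout parameter, assuming the three-argument CLEyeCameraGetFrame(cam, pData, waitTimeout) form; not confirmed SDK behaviour):

// Drain the driver queue: with a 1 ms timeout, a failed call suggests
// no further frames are waiting.
while (CLEyeCameraGetFrame(cam, buf, 1))
    HandleFrame(buf);    // hypothetical handler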

DB

Posted: 07 December 2012 12:21 PM [ # 14 ] (New Member)

AlexP - 15 April 2010 04:38 PM

... I am also looking into extending the API to add frame time stamp information that you’ll be able to retrieve along with the image data ...

I know this topic is old, but is this still in the works? I am looking for a way to get timestamps with millisecond accuracy or better in my application.
