<?xml version="1.0" encoding="utf-8" ?>
<rss version="2.0"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
    xmlns:admin="http://webns.net/mvcb/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:content="http://purl.org/rss/1.0/modules/content/">
    
    <channel>
    
    <title>CL</title>
    <link>https://codelaboratories.com/forums/</link>
    <description>CL</description>
    <dc:language>en</dc:language>
    <dc:rights>Copyright 2019</dc:rights>
    <dc:date>2019-06-09T10:25:37-08:00</dc:date>
    <admin:generatorAgent rdf:resource="" />
    

    <item>
      <title>ps3eye PCB and FSIN related question</title>
      <link>https://codelaboratories.com/forums/viewthread/509/</link>
      <guid>https://codelaboratories.com/forums/viewthread/509/#When:13:58:31Z</guid>
      <description>&lt;p&gt;This is my first post. Glad to be a member of this forum.&lt;/p&gt;

&lt;p&gt;I saw postings on how to sync multiple ps3eyes by hacking the ps3eye&#8217;s PCB (Printed Circuit Board).&lt;/p&gt;

&lt;p&gt;My understanding after reading these postings: FSIN is where the sync input from the other camera is connected, and VSYNC is where the output wire goes to the next ps3eye. My question is: is the FSIN location the same on all ps3eye PCBs? I&#8217;m asking because there are a couple of different versions of the ps3eye, and I want to know whether the FSIN locations are the same.
&lt;/p&gt;</description>
      <dc:date>2011-01-29T13:58:31-08:00</dc:date>
    </item>

    <item>
      <title>PS3 EYE &#45; Sync shutter to pulsed LED</title>
      <link>https://codelaboratories.com/forums/viewthread/135/</link>
      <guid>https://codelaboratories.com/forums/viewthread/135/#When:15:34:57Z</guid>
      <description>&lt;p&gt;Just popped over form NUI Group&#8230;&lt;/p&gt;

&lt;p&gt;I&#8217;ve read little bits about the theory of syncing the camera to a pulsed LED, but nothing definitive.&lt;/p&gt;

&lt;p&gt;So &#45; can it even be done, or should I give up now? :o)&lt;/p&gt;

&lt;p&gt;Liking the new forum by the way!
&lt;/p&gt;</description>
      <dc:date>2010-02-04T15:34:57-08:00</dc:date>
    </item>

    <item>
      <title>Fps &amp;gt; 75fps with cropped VGA &#63;</title>
      <link>https://codelaboratories.com/forums/viewthread/1051/</link>
      <guid>https://codelaboratories.com/forums/viewthread/1051/#When:09:18:26Z</guid>
      <description>&lt;p&gt;Hello; &lt;/p&gt;

&lt;p&gt;I&#8217;m an engineer/researcher in the field of music.&lt;/p&gt;

&lt;p&gt;I have been working for several months on a project with the PS3 Eyes, and I need to shoot an object at more than 75 fps (Bayer mode).&lt;br /&gt;
QVGA resolution is unfortunately too coarse, and the binning artifacts are too annoying for my application.&lt;/p&gt;

&lt;p&gt;Could the image be cropped (to 480 x 320, for example, by cutting the sides of the VGA frame) to reduce the amount of data transferred via USB, and thus push the VGA frame rate above 75 fps?&lt;br /&gt;
I read in the OmniVision datasheet that this is possible at the CMOS sensor level.&lt;/p&gt;
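The back-of-the-envelope arithmetic behind my question (a sketch only: it assumes USB bandwidth is the sole factor capping VGA at 75 fps, and derives the byte budget from that limit rather than from any measurement):

```python
# Rough estimate: if USB bandwidth is the limiting factor, the maximum fps
# scales inversely with the number of pixels transferred per frame.
# Assumed baseline: VGA (640 x 480) Bayer at 8 bits/pixel tops out at 75 fps.

def max_fps(width, height, budget_bytes_per_sec, bytes_per_pixel=1):
    """Frame-rate ceiling for a given crop, assuming a fixed byte budget."""
    return budget_bytes_per_sec / (width * height * bytes_per_pixel)

# Derive the budget from the known VGA limit (640 x 480 at 75 fps, 8-bit Bayer).
budget = 640 * 480 * 1 * 75          # roughly 23 MB/s

print(max_fps(640, 480, budget))     # 75.0 (sanity check)
print(max_fps(480, 320, budget))     # 150.0 for the cropped window
```

So if the sensor-level windowing works as the datasheet suggests, a 480 x 320 crop could in principle double the ceiling.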

&lt;p&gt;It is really very important to me, and I would be willing to pay for the development.&lt;br /&gt;
I have no need of the B&amp;amp;W or COLOR modes; I only use BAYER, which might simplify the development.&lt;br /&gt;
If this is possible, I would be interested in hundreds of licenses.&lt;/p&gt;

&lt;p&gt;Regards&lt;/p&gt;

&lt;p&gt;Baptiste&lt;/p&gt;

&lt;p&gt;I noticed some errors in the OmniVision datasheet and corrected them below:&lt;br /&gt;
(I just find the default HSTART values strange.)
&lt;/p&gt;</description>
      <dc:date>2013-02-10T09:18:26-08:00</dc:date>
    </item>

    <item>
      <title>Multicam sync</title>
      <link>https://codelaboratories.com/forums/viewthread/84/</link>
      <guid>https://codelaboratories.com/forums/viewthread/84/#When:14:42:09Z</guid>
      <description>&lt;p&gt;Hi,&lt;/p&gt;

&lt;p&gt;I&#8217;m using three Eye cams on two PCs (since the multicam driver detects only two cams on one PC). The PCs are connected with a 1 Gbps Ethernet cable. Is there any simple way to wire the cams together so that frame capture happens at &lt;b&gt;exactly&lt;/b&gt; the same time? I know there is a vsync &#8220;thing&#8221; on the camera&#8217;s PCB, but I have only basic electronics knowledge (I&#8217;m a programmer). So I&#8217;ll repeat my question: is there any &lt;b&gt;simple&lt;/b&gt; way to do this? &lt;img src=&quot;//codelaboratories.com/ee/images/smileys/smile.gif&quot; width=&quot;19&quot; height=&quot;19&quot; alt=&quot;smile&quot; style=&quot;border:0;&quot; /&gt;&lt;/p&gt;

&lt;p&gt;Thanks.&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
EDIT: In the OV7720 datasheet there is information that the sensor has an external frame-synchronization capability via pin A4, called FSIN (Frame Synchronization INput?).
&lt;/p&gt;</description>
      <dc:date>2010-01-08T14:42:09-08:00</dc:date>
    </item>

    <item>
      <title>Disparity between frame rates and oscilloscope reading</title>
      <link>https://codelaboratories.com/forums/viewthread/582/</link>
      <guid>https://codelaboratories.com/forums/viewthread/582/#When:22:39:54Z</guid>
      <description>&lt;p&gt;Hi,&lt;/p&gt;

&lt;p&gt;I have used AlexP&#8217;s code to increase the number of frame-rate selections for my PS3 EYE camera, and it appears to have succeeded, as I can now select multiple frame rates for the camera on my computer. However, when I choose different frame rates and probe the camera&#8217;s internals, the reading on the oscilloscope connected to the probe does not change at all. Why is this, and how can I observe the relevant readings? Help would be much appreciated.
&lt;/p&gt;</description>
      <dc:date>2011-04-18T22:39:54-08:00</dc:date>
    </item>

    <item>
      <title>Don&#8217;t need all those frames&#8230;</title>
      <link>https://codelaboratories.com/forums/viewthread/529/</link>
      <guid>https://codelaboratories.com/forums/viewthread/529/#When:15:24:55Z</guid>
      <description>&lt;p&gt;But I do need a fast exposure to minimize motion blur in frame captures.&lt;/p&gt;

&lt;p&gt;I don&#8217;t think the exposure parameter will help with motion blur, since the rolling shutter means that if exposure is set below maximum, not all lines will be exposed simultaneously.&lt;/p&gt;

&lt;p&gt;So, I think reducing exposure time at a fixed frame rate won&#8217;t help motion blur at all&#8212;please comment if I have this wrong.&lt;/p&gt;

&lt;p&gt;So, that&#8217;s why I want high fps.&amp;nbsp; I think I need all the lines exposed simultaneously, but for a brief period of time.&amp;nbsp; So, I can only get what I need by using higher fps.&lt;/p&gt;

&lt;p&gt;But, I really don&#8217;t need all the frames!&amp;nbsp; USB bandwidth and processor requirements go up with fps.&amp;nbsp; All I need for my application is a fast exposure time, with all lines exposed simultaneously.&lt;/p&gt;

&lt;p&gt;Has anyone thought about a hardware hack that would let the camera &#8220;think&#8221; it was running at higher frame rates, but only send every nth frame?&lt;/p&gt;

&lt;p&gt;Alternately, I&#8217;m thinking about using a frame-synchronized infrared strobe with a variable duty cycle.&amp;nbsp; If the ambient 850 nm infrared light is low enough, this might also effectively eliminate motion blur at lower frame rates.&lt;/p&gt;
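Rough numbers for the strobe idea (all values below are illustrative assumptions, and this only works if ambient IR really is negligible):

```python
# Effective exposure with a frame-synchronized strobe: if ambient light is
# negligible, motion is only "seen" while the strobe is on, so the strobe
# on-time replaces the sensor integration time as the effective exposure.
# The fps and duty-cycle numbers below are illustrative assumptions.

def strobe_exposure_us(fps, duty_cycle):
    """On-time (in microseconds) of a strobe pulsed once per frame."""
    frame_period_us = 1_000_000 / fps
    return frame_period_us * duty_cycle

# At only 30 fps, a 5 percent duty cycle already gives the blur-freezing
# effect of a very short exposure, without raising the frame rate at all.
print(strobe_exposure_us(30, 0.05))   # about 1666.7 us per frame
```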

&lt;p&gt;Thanks,&lt;/p&gt;

&lt;p&gt;Dave Thomas
&lt;/p&gt;</description>
      <dc:date>2011-02-09T15:24:55-08:00</dc:date>
    </item>

    <item>
      <title>sensor color quality</title>
      <link>https://codelaboratories.com/forums/viewthread/149/</link>
      <guid>https://codelaboratories.com/forums/viewthread/149/#When:09:25:15Z</guid>
      <description>&lt;p&gt;Hi,&lt;/p&gt;

&lt;p&gt;I tried to do color tracking with the ps3eye, but I realized that the color in the image is way too weak. I use it at 30 fps.&lt;br /&gt;
I also have a Logitech webcam whose colors are much more brilliant, and color tracking works fine with it.&lt;br /&gt;
I also tried converting the RGB image into an HSV image, but I still do not get sufficient color separation.&lt;br /&gt;
Is this due to the camera sensor, or is it a driver issue? When I get the raw Bayer image from the next SDK version, will I be able to get better color results?&lt;br /&gt;
Or is there another way to enhance the color quality? I also tried adjusting the gain/exposure/white-balance parameters, but without success.&lt;/p&gt;
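To illustrate what I mean by weak separation, here is a small sketch (standard-library colorsys, not the SDK, and the color values are made up): as saturation drops, two different hues converge to nearly the same RGB values, so any threshold has less margin.

```python
import colorsys

# Why weak (desaturated) colors are hard to track: at low saturation,
# different hues map to nearly identical RGB triples, so the gap a
# threshold can exploit shrinks. Illustrative values only.

def rgb_at(hue, saturation, value=1.0):
    """RGB triple for a given HSV color (all components in 0..1)."""
    return colorsys.hsv_to_rgb(hue, saturation, value)

red_vivid,  green_vivid  = rgb_at(0.0, 0.9), rgb_at(0.33, 0.9)
red_washed, green_washed = rgb_at(0.0, 0.1), rgb_at(0.33, 0.1)

def distance(a, b):
    """Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

print(distance(red_vivid, green_vivid))    # large gap: easy to separate
print(distance(red_washed, green_washed))  # small gap: thresholds overlap
```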

&lt;p&gt;Thanks,&lt;/p&gt;

&lt;p&gt;Timo
&lt;/p&gt;</description>
      <dc:date>2010-02-14T09:25:15-08:00</dc:date>
    </item>

    <item>
      <title>Push up fps by dropping colors</title>
      <link>https://codelaboratories.com/forums/viewthread/73/</link>
      <guid>https://codelaboratories.com/forums/viewthread/73/#When:07:53:34Z</guid>
      <description>&lt;p&gt;Who knows something about firmware of this camera ?&lt;br /&gt;
I think there is possible to pushing more fps from PS3Eye when we lose information about color.&lt;br /&gt;
(If cmos sensor is a bayer paterrn may be is possibility to boost resolution too)&lt;br /&gt;
My PS3Eye will arrive to me about end of the week and now I&#8217;m very interested, where may I find information about it.&lt;br /&gt;
Does somebody try something like this?&lt;/p&gt;
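The back-of-the-envelope arithmetic behind my idea (a sketch only: it assumes USB throughput is the bottleneck, and the byte budget below is an illustrative number, not a measurement):

```python
# Raw 8-bit Bayer carries half the bytes of YUYV (4:2:2, 16 bits/pixel),
# so if USB throughput is the bottleneck, shipping Bayer instead of the
# processed color stream could roughly double the frame-rate budget.
# The 20 MB/s budget is an illustrative assumption.

BUDGET = 20_000_000  # bytes/s, assumed usable throughput

def fps_ceiling(width, height, bits_per_pixel, budget=BUDGET):
    """Frame-rate ceiling for a given format under a fixed byte budget."""
    return budget / (width * height * bits_per_pixel / 8)

yuyv  = fps_ceiling(640, 480, 16)   # color as normally delivered
bayer = fps_ceiling(640, 480, 8)    # raw sensor data, color dropped

print(bayer / yuyv)   # 2.0, i.e. exactly double the ceiling
```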

&lt;p&gt;&lt;br /&gt;
&lt;span style=&quot;font&#45;size:9px;&quot;&gt;I&#8217;m sorry if I wrote this in the wrong place.&lt;/span&gt;
&lt;/p&gt;</description>
      <dc:date>2010-01-05T07:53:34-08:00</dc:date>
    </item>

    
    </channel>
</rss>
