My snickerdoodle idea

Hello,

My name is Duane Kaufman, and I am a complete newbie when it comes to FPGAs and what one can do with them. I saw the snickerdoodle as a way to educate myself on the topic.
What I would like to build a) may have already been done, and/or b) may be too much for me to bite off at this stage of knowledge, but here goes:

I would like to put together a 'My Magic Window', using a snickerdoodle for video processing, an LCD monitor, a USB camera, and a wide-angle video camera, with the following geometry:

The USB camera would be used to track the position of the subject's eyes, and the snickerdoodle would take the image stream from the wide-angle camera and process it to present on the LCD monitor the image the subject would see if they could look through the LCD monitor and the wall. As the subject moved, the image would change, just as if it were a real window. Of course, the video stream would be live, showing movement outside the wall too.


OK, so I have a snickerdoodle black and a breakybreaky board. I have a piSmasher on order. Is there a way I could get started on this, or are there things I could play with to become familiar with how things work? Could I purchase a camera to hook up to the breakybreaky, or should I just try to create a test video stream?


I would appreciate feedback from the community, even if it boils down to 'what a dumb idea' :)


Sincerely,

Duane



So is there a pan and tilt servo on the camera?



On Wednesday, July 27, 2016 at 1:06:53 PM UTC-7, Duane Kaufman wrote:



Hi,

> So is there a pan and tilt servo on the camera?

At least in its first incarnation, the wide-angle video camera on the 'outside' would be stationary. At most subject-to-monitor viewing distances, only a subset of the full frame taken by the wide-angle video camera would be displayed (it would be dynamically cropped, yet fill the monitor). As the subject moves closer to the monitor, more and more of the wide-angle frame is displayed.

So there are actually two things going on dynamically:
1) The scale of the displayed image changes based on the subject-to-USB-camera distance, and the image is cropped accordingly
2) The displayed image is a dynamically cropped version of the wide-angle camera frame, depending on the left-right and up-down position of the subject

Obviously, the paradigm falls apart if the subject gets so close to the monitor that the wide-angle FOV is no longer large enough.
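To make the two adjustments concrete, here is a rough pure-Python sketch of the crop computation. It assumes the outside scene can be treated as a flat plane a fixed distance behind the window, and that the wide-angle frame maps linearly to that plane (a real fisheye lens would need dewarping first); every parameter name and number is just illustrative:

```python
def crop_rect(viewer_x, viewer_y, viewer_d, win_w, win_h,
              scene_d, px_per_m, frame_w, frame_h):
    """Return (x0, y0, w, h): the pixel crop of the wide-angle frame.

    Model (a simplification): the viewer's eye sits (viewer_x, viewer_y)
    metres from the window centre and viewer_d metres in front of it;
    the scene is a plane scene_d metres behind the window; px_per_m
    maps metres on that plane to pixels in the wide-angle frame, with
    the frame centre aligned to the scene centre.
    """
    # Similar triangles: the region of the scene plane visible through
    # a win_w x win_h window grows as the viewer approaches it.
    scale = (viewer_d + scene_d) / viewer_d
    w_m, h_m = win_w * scale, win_h * scale
    # Parallax: moving right shifts the visible region left, and vice versa.
    cx = -viewer_x * scene_d / viewer_d
    cy = -viewer_y * scene_d / viewer_d
    # Convert scene-plane metres to frame pixels.
    x0 = frame_w / 2 + (cx - w_m / 2) * px_per_m
    y0 = frame_h / 2 + (cy - h_m / 2) * px_per_m
    return x0, y0, w_m * px_per_m, h_m * px_per_m

# A 0.5 m x 0.3 m "window", scene plane 2 m behind it, viewer 1 m away:
x0, y0, w, h = crop_rect(0.0, 0.0, 1.0, 0.5, 0.3, 2.0, 100.0, 1920, 1080)
```

Shrinking viewer_d makes the crop larger, so more of the wide-angle frame fills the monitor (point 1 above); moving the viewer sideways shifts the crop the opposite way (point 2).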

Does that make sense?

Duane

On Wednesday, July 27, 2016 at 3:55:55 PM UTC-5, weath...@krtkl.com wrote:
So is there a pan and tilt servo on the camera?






Hi All,

So, given the hardware I have (a snickerdoodle black and a breaky-breaky), what can I do at this point to get started?
If I wanted to connect a camera through the breaky-breaky, how would I go about it? Is there a tutorial somewhere?

Thanks
Duane

On Wednesday, July 27, 2016 at 3:06:53 PM UTC-5, Duane Kaufman wrote:



Hi Duane, I think the question in my mind is whether a USB camera is the best fit for what you are trying to do.
On the Zynq, the USB interface is just about the only interface that cannot be connected through the FPGA.
That doesn't mean that the FPGA fabric couldn't be used for hardware acceleration, but you'd probably have better luck on this platform with a camera that is intended for connection to an FPGA.



On Friday, July 29, 2016 at 8:51:35 AM UTC-7, Duane Kaufman wrote:



Hi Duane,

Let me see if I can get you some help on this, as we know someone who's working on getting a low-cost camera running with snickerdoodle as a starting point.

Hang tight...

-Ryan

On Friday, July 29, 2016 at 9:36:24 AM UTC-7, weatherbee@krtkl.com wrote:



Hello,

As far as the subject-positioning USB camera is concerned, I was hoping to use the Linux side of the device, with something like OpenCV, to recognize faces and eye position. The subject's eye-position information would then be placed where the FPGA could access it, so the FPGA could tailor the video from the wide-angle camera mounted outside that gets displayed on the monitor.

So, the flow of data would look like:

Viewer -> USB camera -> Linux (OpenCV or ...) -> viewer position data
                                                        |
                                                        v
            Wide-angle camera (complete frame) -----> FPGA -> cropped & panned wide-angle image -> monitor
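To make "placed where the FPGA could access it" a bit more concrete: on the Zynq, the Linux side can mmap a PL register block (for example an AXI GPIO or BRAM controller, via /dev/mem) and write a small packed record each frame. The 6-byte register layout below is purely my own guess, and a temporary file stands in for /dev/mem so the sketch runs anywhere:

```python
import mmap
import struct
import tempfile

# Hypothetical 6-byte register layout the FPGA side would decode:
# int16 x offset (px), int16 y offset (px), uint16 viewer distance (mm).
REG_FMT = "<hhH"

def write_position(mem, x_px, y_px, dist_mm):
    """Pack the eye tracker's latest output into the shared register block."""
    mem.seek(0)
    mem.write(struct.pack(REG_FMT, x_px, y_px, dist_mm))

# Demo against an ordinary file; on real hardware you would mmap
# /dev/mem at the AXI peripheral's physical base address instead.
with tempfile.TemporaryFile() as f:
    f.truncate(mmap.PAGESIZE)
    mem = mmap.mmap(f.fileno(), mmap.PAGESIZE)
    write_position(mem, -40, 12, 850)
    x, y, dist = struct.unpack(REG_FMT, mem[:struct.calcsize(REG_FMT)])
    # x, y, dist now read back as (-40, 12, 850)
```

The fixed little-endian layout means the PL side can decode the three fields with plain bit slicing, with no handshaking beyond whatever "latest value wins" tolerance the cropping logic has.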

Duane

On Friday, July 29, 2016 at 11:36:24 AM UTC-5, weath...@krtkl.com wrote:



Hello Ryan,

Perfect! Let me know.

Duane

On Friday, July 29, 2016 at 1:08:42 PM UTC-5, Cousins wrote: