Depth from Stereo using multiple Pi Zeros?

The Raspberry Pi is a series of credit card-sized single-board computers developed in the United Kingdom by the Raspberry Pi Foundation to promote the teaching of basic computer science in schools and developing countries.


Post by /RaspberryPi »


I'm new to using Raspberry Pis and am working on a project that uses two OV5647 cameras to perform depth from stereo (DfS). For this project, we want to stream synchronized video frames from the cameras to an external Linux computer for processing.

We initially purchased an Arducam DoublePlexer and followed its setup directions (essentially just plugging the flex cable into the camera connector and running the software listed in the instructions); however, the unit broke multiple Raspberry Pi boards. We are looking either for a way to use the DoublePlexer successfully, or for alternative approaches using Pi 3B+s or Pi Zeros.

We have multiple copies of each of the following components: Pi 3B+ boards, Pi Zero v1.3 boards, OV5647 cameras, and the extenders/adapters we use to connect the Pi Zeros to the cameras. Would we be able to connect each camera to a Pi Zero or Pi 3B+, synchronize them somehow, and send the resulting stereo video either directly to a Linux computer, or through another Pi to the Linux computer?
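In case it helps frame the question: one simple way to get frames from each Pi to the Linux box is a plain TCP stream per camera. Below is a minimal receiver-side sketch for the Linux machine. The length-prefixed framing (4-byte big-endian size, then the frame bytes) is an assumption I'm making about the sender, not something the stock camera tools do on their own:

```python
import socket
import struct
import time

def recv_exact(sock, n):
    """Read exactly n bytes from a socket, or raise if the stream closes."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf

def recv_frame(sock):
    """Receive one length-prefixed frame and stamp its arrival time.

    Assumes the sender writes a 4-byte big-endian length followed
    by the frame payload (a hypothetical framing scheme).
    Returns (arrival_timestamp, frame_bytes).
    """
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    data = recv_exact(sock, length)
    return time.monotonic(), data
```

You'd run one such receiver per camera stream and keep the arrival timestamps around for pairing the two streams later.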

A lot of the solutions we see online involve Arducam multiplexers like the one we tried, so we were wondering whether this approach is feasible with the equipment listed above (rather than having to get something like a StereoPi plus a Compute Module), or whether anyone has run into similar issues with the DoublePlexer and knows how to resolve them.

Thanks



EDIT:

Sorry folks, I should have specified: we want very small, easily positioned cameras, which is why we opted for the Raspberry Pi cameras. We're building a prototype wearable with egocentric camera recording, and we have tiny cameras that we want to mount in glasses frames. They can be physically connected by wiring, since they will be in close proximity, or through other boards; the cameras just need to be synchronized so we can perform DfS on the egocentric video captured from our prototype.

EDIT 2:

Firm/soft real-time is what we're shooting for, likely video in the range of 24-30 fps. We don't have an exact latency number, but as low as possible.
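For soft real-time at those frame rates, one approach (absent hardware frame sync) is to timestamp frames on each stream and pair the two streams by nearest timestamp, dropping frames that have no close-enough partner. A rough sketch of that pairing logic; the half-frame-period tolerance is my own assumption, not a requirement:

```python
# Pairing window: at 30 fps the frame period is ~33 ms, so half a
# period (~16.7 ms) is a plausible (assumed) tolerance for calling
# two frames "simultaneous".
TOLERANCE_S = 1.0 / 30 / 2

def pair_frames(left, right, tol=TOLERANCE_S):
    """Greedily pair (timestamp, frame) tuples from two streams.

    `left` and `right` are lists sorted by timestamp. Returns a list
    of (left_frame, right_frame) pairs whose timestamps differ by at
    most `tol`; unmatched frames are dropped.
    """
    pairs = []
    li = ri = 0
    while li < len(left) and ri < len(right):
        lt, lf = left[li]
        rt, rf = right[ri]
        if abs(lt - rt) <= tol:
            pairs.append((lf, rf))
            li += 1
            ri += 1
        elif lt < rt:
            li += 1  # left frame too old to match anything later
        else:
            ri += 1  # right frame too old
    return pairs
```

Note this only bounds the pairing error to the tolerance; for tighter sync you'd still want a hardware trigger or timestamping closer to the sensor.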
submitted by /u/ShortCircuity

Source: https://www.reddit.com/r/raspberry_pi/c ... _pi_zeros/