Live streaming the borehole camera to LAN

Hi,

It would be great to be able to access a stream from the camera on the local area network. Having it go to the cloud and back would be cool too, but that's a nice-to-have rather than essential.

Here’s the use case: I’d like to use an external Python app with deep learning to identify and track the plants as the gantry moves. This would run on a PC on the local WiFi/Ethernet network and also access the REST API and message broker.
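For concreteness, this is the kind of thing the PC-side app would do via the REST API. It's only a rough sketch: the endpoints and field names are my reading of the web app docs, and the credentials are placeholders.

```python
import requests

# Placeholders: substitute real account credentials.
EMAIL = "me@example.com"
PASSWORD = "secret"

# Request an API token (endpoint as I understand it from the web app docs).
resp = requests.post(
    "https://my.farm.bot/api/tokens",
    json={"user": {"email": EMAIL, "password": PASSWORD}},
)
resp.raise_for_status()
token = resp.json()["token"]["encoded"]

# Fetch all points and keep only the plants, with their X/Y positions.
points = requests.get(
    "https://my.farm.bot/api/points",
    headers={"Authorization": f"Bearer {token}"},
).json()
plants = [p for p in points if p.get("pointer_type") == "Plant"]
for plant in plants:
    print(plant["name"], plant["x"], plant["y"])
```

The detection/tracking side would then match these known plant positions against what the camera stream shows as the gantry moves.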

I assume getting the camera to stream locally should be relatively easy, provided FarmBot OS is running on top of a Linux distro of some sort.

I’ve worked on a similar agritech project for my PhD: a lettuce harvesting robot. I’d like to apply some of this tech to my new Genesis XL!

Here’s the paper, for anyone interested:
https://onlinelibrary.wiley.com/doi/full/10.1002/rob.21888

Simon Birrell

FarmBot OS is the C and Elixir/Erlang-based code plus the OTP Erlang runtime.
This runs as a handful of Linux processes on a Buildroot-based Linux OS, which is not really a “distro” as such but a hand-crafted Linux build tailored to and cross-compiled for the target platform: the Raspberry Pi.

[edit 0]
@SimonB Did you search this forum? I found this post, which still seems relevant :slight_smile:

[edit 1]
@SimonB at least your post is in the most promising category :slight_smile:
This feature should be readily accomplished with a tweaked version of the farmbot_system_rpi3 Elixir/Nerves package.

Thanks for the clarification - I’d looked through the FarmBot OS docs but couldn’t work out the underlying infrastructure.

Buildroot seems to support GStreamer, so integration would presumably not be too hard:

https://gstreamer.freedesktop.org/documentation/tutorials/basic/streaming.html?gi-language=c
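As a minimal sketch of the sender side, something like this could run on the Pi using the GStreamer Python bindings. To be clear, this is my assumption of how it could look, not anything FarmBot OS ships today: it presumes GStreamer and the x264 plugin are built into the image, that the camera shows up as /dev/video0, and that 192.168.1.100 is the receiving PC.

```python
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Capture from the camera, H.264-encode with low latency, and send RTP over UDP to the PC.
pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 ! videoconvert "
    "! x264enc tune=zerolatency bitrate=1000 speed-preset=ultrafast "
    "! rtph264pay config-interval=1 pt=96 "
    "! udpsink host=192.168.1.100 port=5000"
)
pipeline.set_state(Gst.State.PLAYING)

# Block until an error occurs or the stream ends.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)
```

On the PC side, an equivalent receive pipeline (udpsrc ! rtph264depay ! avdec_h264 ! autovideosink) or an OpenCV capture would then display or process the stream.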

Simon

One more comment: even if Pi-to-cloud streaming is implemented, it would still be important to also have the option of local area streaming, to keep latency low. Visual servoing of the UTM down to the plant, for example, would only be practical with this feature.

2 Likes

I solved the problem by purchasing a secondary USB camera from the FarmBot shop and attaching it to a secondary Raspberry Pi on the opposite side of the gantry, powered from the 24V peripheral supply through a motorcycle 24V-to-5V USB cigarette-lighter adapter. On the secondary Raspberry Pi I installed a program called Motion and live stream that to the web app. The second USB camera is currently just taped adjacent to the primary camera mount; a 3D-printed part to put both cameras side by side will be completed soon. I set it up this way so that I can keep the FarmBot farmware camera for its primary duty and use the secondary one for navigation, live streaming and OpenCV. The secondary Raspberry Pi can also inject commands into the MQTT message service to manipulate the FarmBot. I’m testing Python and Node.js now and may end up with both…
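Roughly, the command-injection side of the Python test looks like this. It is only a sketch: the broker host, device ID, topic name and CeleryScript payload reflect my reading of the FarmBot developer docs, so treat them as assumptions rather than working values.

```python
import json
import paho.mqtt.client as mqtt

# Placeholders: the device ID, broker host and token come from your account's API token.
DEVICE_ID = "device_123"
BROKER_HOST = "broker.example.com"
TOKEN = "<encoded JWT from the /api/tokens endpoint>"

client = mqtt.Client()
client.username_pw_set(DEVICE_ID, TOKEN)
client.connect(BROKER_HOST, 1883)

# A CeleryScript RPC asking the bot to move to an absolute position.
command = {
    "kind": "rpc_request",
    "args": {"label": "move-from-secondary-pi"},
    "body": [{
        "kind": "move_absolute",
        "args": {
            "location": {"kind": "coordinate", "args": {"x": 100, "y": 200, "z": 0}},
            "offset": {"kind": "coordinate", "args": {"x": 0, "y": 0, "z": 0}},
            "speed": 100,
        },
    }],
}
client.publish(f"bot/{DEVICE_ID}/from_clients", json.dumps(command))
client.disconnect()
```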

3 Likes

This is an excellent idea, and far superior to trying to hack FarmBot OS! Thanks for sharing that!

1 Like

I also used a second Pi and the standard Pi camera. I then 3D printed a security-camera-style housing and mounted it above the FB. There is a freely downloadable app for the Pi called RPi Cam Control. You can find it easily with Google. I shared the video stream over the web and made sure that a friend in another city could see it. RPi Cam Control supports motion detection, using the Motion app for Linux, as well as interval-based pictures.
Here is a video that it took.


The only part that I haven’t been able to get working is embedding the camera stream in the FB webpage. I tried with the internet IP address of the camera, i.e. http://xxx.xxx.xxx.xxx/html, but no luck. I can see the camera using this address from any browser anywhere else.
1 Like

Nice setup @dmbgo!

I imagine this is caused by the lack of HTTPS. What do you see in your JavaScript console (Chrome: Ctrl + Shift + J, Firefox: Ctrl + Shift + K)?

Hmm… for a local IP it works without HTTPS for me. The JavaScript console shows the mixed-content warning “this content should be served over HTTPS”, but it is permitted. I checked these browsers: Chrome, Safari, Firefox, Opera.

It would still be very helpful to have the base FarmBot setup stream the built-in camera, though!

1 Like

It’s resource intensive, sometimes 64% of my Raspberry Pi 4…

1 Like

@SimonB That is something we would like to implement long term.

It seems like the options are:

  • Direct / peer-to-peer streaming: Won’t work at many schools (which are a large portion of our user base) due to blocked ports. This might also be a performance issue for Express users who are on an RPi0.
  • Make the web app a media server: Will work at most enterprises but will require some significant changes to how our server is architected.

If anyone has ideas that:

  • Will work well in firewalled environments like schools
  • Will scale to a large device count on one server (most FarmBot users are not self-hosting)
  • Will support slower Express devices

I would be open to suggestions. This feature is still in the “idea” phase.

The main free-software webcam management apps of the last decade-plus have been ZoneMinder plus Motion. They were a bit tedious to use, but could be made to work.

As part of setting up my FarmBot (which has XYZ motion as of last night!), I started exploring what is available for free-software webcam monitoring nowadays. I highly suggest checking out Shinobi. It is basically the next generation of free software in this class. ZoneMinder is from another era and you can feel it. The Shinobi development branch ties into TensorFlow (!!!), so you have all the artificial intelligence tools right at your fingertips: object recognition and classification, etc. So I was brainstorming how I could tie them all together. Even without AI, Shinobi could be useful just for webcam management.

Happy hacking,

-Jeff

https://shinobi.video/


http://www.zoneminder.com/
https://motion-project.github.io/
3 Likes

Thanks for letting us know about this @jebba.

I was shopping around not too long ago and, as you mentioned, some of the options are showing their age in 2020. This one looks promising.

1 Like

In my other projects, I use the WebRTC streaming built right into GStreamer. GStreamer can be tied directly to OpenCV. More significantly, I can create a pipeline that sends the encoded stream straight through but also tees off into the image-processing pipeline. This keeps the data as a single copy in GPU/VPU memory as much as possible.
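As a rough illustration of that tee arrangement (a sketch only, not the exact pipeline from those projects: the element names, caps and addresses are assumptions, and the WebRTC branch is replaced by a plain UDP sink for brevity):

```python
import gi
import numpy as np

gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# One capture, two branches: an encoded stream out over UDP, and raw BGR frames
# into an appsink for OpenCV-style processing.
pipeline = Gst.parse_launch(
    "v4l2src device=/dev/video0 ! videoconvert ! tee name=t "
    "t. ! queue ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=192.168.1.100 port=5000 "
    "t. ! queue ! videoconvert ! video/x-raw,format=BGR "
    "! appsink name=vision max-buffers=1 drop=true"
)
appsink = pipeline.get_by_name("vision")
pipeline.set_state(Gst.State.PLAYING)

while True:
    sample = appsink.emit("pull-sample")  # blocks until a frame is available
    if sample is None:
        break
    buf = sample.get_buffer()
    caps = sample.get_caps().get_structure(0)
    h, w = caps.get_value("height"), caps.get_value("width")
    ok, mapinfo = buf.map(Gst.MapFlags.READ)
    if ok:
        frame = np.frombuffer(mapinfo.data, dtype=np.uint8).reshape(h, w, 3)
        # ... run detection/tracking on `frame` here ...
        buf.unmap(mapinfo)

pipeline.set_state(Gst.State.NULL)
```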

2 Likes

Just added a new toy to the secondary Pi; it might help with your deep learning requirement.

2 Likes

Holy smokes. TPU on the edge…

3 Likes

Hi Rick,

I think schools and universities (I have experience of the latter) probably don’t need remote access to the streaming video as they’ll have campus-wide WiFi. If they do need it, then port forwarding is the only realistic option. So, the borehole camera would just become another IP camera with a URL.

For single-frame remote image analysis you already have the software infrastructure to upload photos. I can’t see you doing real-time analysis on remote video because of (1) latency and (2) server cost. I’m well aware that you’re paying the Heroku cost without charging us a subscription. I think image analysis needs to be performed at the edge, perhaps with a second, optional Pi.

As for the RPi0, that is a problem. Perhaps this is a high-end feature, or one that will require optional hardware.

A final option would be to stream to the server via WebSocket, and charge a subscription for this feature.

Simon

3 Likes

I would like to recommend the ESP32-CAM for live streaming if you have enough WiFi capacity. You can usually find them for $8-$9 US.
https://amazon.com/dp/B07S5PVZKV
Don’t forget to grab a cheap USB-to-UART adapter capable of 3.3V logic if you don’t already have one.
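Once it’s flashed with the usual camera web server example, the stream can be pulled straight into Python/OpenCV on the PC. A minimal sketch (the IP address and the :81/stream path are assumptions based on the common example sketch):

```python
import cv2

# Assumed URL of the ESP32-CAM's MJPEG stream (the stock example serves it on port 81).
STREAM_URL = "http://192.168.1.60:81/stream"

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # ... feed `frame` to a detector/tracker here ...
    cv2.imshow("esp32-cam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```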

3 Likes

I was looking into what OpenPNP recommends. PNP = pick and place: a bot that puts chips on a circuit board. They are doing something similar to FarmBot seeding. Each chip/component (like a seed) gets picked up from a specific location using a vacuum nozzle. Instead of just “planting” it directly, the OpenPNP bot carries the part over a “bottom vision” camera (a camera looking up) and uses OpenCV to correctly rotate the part, then places it. FarmBot could do something similar with a seed: pick it up, then have a camera check whether the seed was actually picked up (there’s a rough sketch of that check after the links below). Unlike OpenPNP, rotation doesn’t matter, so it is less complex. Long story short, what OpenPNP recommends for cameras is ELP:

http://www.webcamerausb.com/

http://openpnp.org/
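For the seed check itself, something as simple as comparing a crop of the nozzle tip before and after the pickup attempt might already be enough. A rough OpenCV sketch, where the ROI, thresholds and camera index are made-up values:

```python
import cv2

def seed_picked_up(before, after, roi=(200, 200, 80, 80),
                   diff_threshold=25, min_changed_pixels=150):
    """Compare two frames of the nozzle tip; return True if enough pixels changed."""
    x, y, w, h = roi
    a = cv2.cvtColor(before[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    b = cv2.cvtColor(after[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(a, b)
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(mask) >= min_changed_pixels

# Usage: grab a frame before and after the pickup attempt from the bottom camera.
cam = cv2.VideoCapture(0)
_, before = cam.read()
# ... command the vacuum pickup here ...
_, after = cam.read()
print("Seed detected" if seed_picked_up(before, after) else "Pickup failed")
cam.release()
```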

Edit: these look good too:

https://www.robotshop.com/en/cameras-vision-sensors.html

2 Likes