Camera image over MQTT



  • Hi all, I discovered that it's possible to send images over MQTT and have done this with a Raspberry Pi. Is there a way to connect a camera to a Pycom board, or via a Pi? I want to send images long range via LoRa and have the camera sleeping most of the time. It's for a bird monitoring project. Thanks in advance!
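For reference, the "images over MQTT" part of the original post can be sketched like this. The broker address and topic below are hypothetical placeholders, and the payload helpers are just one way to do it (base64 keeps the bytes safe for text-only MQTT tooling; raw binary payloads also work):

```python
# Minimal sketch of publishing a camera image as an MQTT payload.
# Broker host and topic are placeholder assumptions, not from the post.
import base64

def image_to_payload(image_bytes: bytes) -> bytes:
    """Base64-encode the image so it survives text-only MQTT tooling."""
    return base64.b64encode(image_bytes)

def payload_to_image(payload: bytes) -> bytes:
    """Reverse of image_to_payload."""
    return base64.b64decode(payload)

# Stand-in for a real JPEG read from the camera:
fake_jpeg = b"\xff\xd8\xff\xe0" + b"\x00" * 64 + b"\xff\xd9"
payload = image_to_payload(fake_jpeg)
assert payload_to_image(payload) == fake_jpeg

# With the paho-mqtt library installed, publishing would look like:
# import paho.mqtt.client as mqtt
# client = mqtt.Client()
# client.connect("broker.example.com")          # hypothetical broker
# client.publish("birds/nest1/image", payload)  # hypothetical topic
```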


  •

    @bucknall said in Camera image over MQTT:

    Hi @robmarkcole

    This is definitely possible over WiFi as there are working examples written in C for the ESP32, however I'm yet to see this ported to MicroPython so it might be a while before you're able to do it specifically in MicroPython.

    Here's a link to the example I'm referencing, written in C:

    https://github.com/igrr/esp32-cam-demo

    I see that project converts color pictures into ASCII art. Can anyone enumerate some other options for capturing color pictures and sending them over a network, LoRa or otherwise? I know of the ArduCAM; what else?

    Is there anything native to the ESP32 that can just connect to one of the OmniVision sensors (OV7670, OV2640) and send?



  • @ahaw021 Hi Andrei, I haven't made a start yet, but it's moving up my to-do list.
    Definitely, confirming the image recognition algorithm is the first step, and the more data the better. Perhaps you could post the image data on GitHub or Kaggle?
    Cheers



  • @robmarkcole

    Did you get anywhere with this?

    I have a JeVois camera and can provide you with reference code for a laptop (not an embedded board, as I am still waiting on my Pycom boards), and I actually have some eggs that I can send sample code for.

    I believe simple blob detection should be good enough for what you need, as the eggs should be fairly easy to make out.

    Let me know

    Andrei
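To illustrate the "simple blob detection" suggestion above: counting eggs reduces to counting connected bright regions in a thresholded frame. This is a toy pure-Python stand-in (a real setup would more likely use OpenCV's `SimpleBlobDetector` or the JeVois's built-in modules); the nest array and threshold are made-up example values:

```python
# Toy blob counter: count 4-connected regions of pixels brighter than
# a threshold in a small grayscale frame (list of rows of int 0-255).

def count_blobs(frame, threshold=128):
    """Count 4-connected bright regions via iterative flood fill."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold and not seen[r][c]:
                blobs += 1
                stack = [(r, c)]          # flood-fill this blob
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and frame[y][x] > threshold and not seen[y][x]):
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return blobs

# Two bright "eggs" on a dark background:
nest = [
    [0,   0,   0,   0,   0],
    [0, 200, 200,   0,   0],
    [0, 200,   0,   0, 220],
    [0,   0,   0, 220, 220],
]
assert count_blobs(nest) == 2
```

The payoff for LoRa is that only the count (one byte) needs to leave the device, not the frame.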



  • @robmarkcole

    This is a fairly simple computer vision problem.

    The work we have done is proprietary to the customer, so we aren't able to share it.

    What I would suggest you do:

    A) Use the storage available locally to save an image
    B) Send the counts
    C) Run for two weeks and see how accurate it is

    Sample of a related problem

    https://www.youtube.com/watch?v=PHwCK9ItDbc

    Andrei :D



  • @ahaw021 Hi Andrei, that is very interesting; I actually have one of those JeVois cameras to try out. My project is monitoring endangered birds, in particular capturing a daily image of a remote nest to check the number/status of eggs. The question is whether this information could be analysed on the camera, or whether a person would have to review the images.
    Is your work written up?
    Thanks



  • @robmarkcole

    I understand that it might be commercially sensitive, etc.

    My question comes from a problem-solving approach.

    So the question I specifically want to understand is: why do you want to send the image? What is the purpose of having the image sent, rather than jumping straight to the "solution" of "I need to send the image"?

    For example:

    Project 1: Farmer wanted to analyse the health of his sheep (color of their wool)

    Approach 1: Send the Image over NB-IOT to Azure Image Processing
    Approach 2: What we ended up doing - local image processing and sending the results over LoRaWAN. We used a camera from JeVois to do all the processing locally and just send a small message with LoRaWAN - http://jevois.org/

    Project 2: Security at Remote Substations

    Approach 1: Try Stream over long range protocols such as LoRaWAN or Satellite links
    Approach 2: Use the local IP cameras with computer vision to identify when someone is in the substation - send an alert via LoRaWAN to security office and they could decide what to do (most of the time by the time the responders got on site the people entering the station left)

    You can use deltas to minimise the size of the data you are sending. In computer vision there is a concept of background subtraction, which is one approach to minimising the amount of data.

    Yet another option is to use the JeVois to store the actual images (you can have up to 8 GB of memory, I believe) and only send notifications when the storage is full.

    Andrei
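The delta/background-subtraction idea above can be sketched in a few lines. This is a pure-Python stand-in (real pipelines would use something like OpenCV's `createBackgroundSubtractorMOG2`); the frames and threshold are illustrative values, and the point is that only a tiny summary number crosses the LoRaWAN link:

```python
# Frame-delta sketch: compare today's frame against a stored background
# and report only the fraction of pixels that changed, not the pixels.

def changed_fraction(background, frame, threshold=30):
    """Fraction of pixels whose brightness moved more than `threshold`."""
    total = changed = 0
    for bg_row, fr_row in zip(background, frame):
        for bg, fr in zip(bg_row, fr_row):
            total += 1
            if abs(fr - bg) > threshold:
                changed += 1
    return changed / total

background = [[10, 10, 10], [10, 10, 10]]
frame      = [[10, 10, 10], [10, 200, 10]]   # one pixel out of six changed
assert abs(changed_fraction(background, frame) - 1 / 6) < 1e-9
```

A single float (or even a one-bit "changed / unchanged" flag) fits comfortably in a LoRaWAN payload, which is the whole argument of the post above.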



  • @ahaw021 Hi Andrei, the use case is to send a single image once every 24 hours over a long distance.



  • @robmarkcole - why do you want to send images over LoRa, and specifically why have you chosen MQTT as the transport?

    The reason why I ask is that I have done a few projects where customers have asked for this but when we broke up the problem they were actually looking for something else

    Andrei



  • @robmarkcole
    The other problem is you can't work with the image in RAM on the LoPy; there's just not enough of it. You'd need to dump it out to a file in blocks first, which makes compression difficult.

    However the FiPy does have a lot more RAM (4 Megabytes) :-) (...or the OEM modules)



  • @jmarcelino If using a Pi camera, images are ~2 MB: https://www.raspberrypi.org/documentation/usage/camera/raspicam/raspistill.md
    I could probably get away with grayscale and lower resolution too.



  • @robmarkcole
    How big are the images? By a quick calculation it would take about two hours to send 32 KB over LoRa (SF7, 4/5 coding rate, on a 1% duty cycle), and that's before any re-transmissions (you'd have to implement a protocol for this over raw LoRa point-to-point).

    A large enough solar panel could power it, but then you're adding even more complexity. It is an interesting mental exercise exploring all those options, but at the end of the day you'd be trying to shoehorn functionality into a technology that is the wrong fit.
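The "about two hours for 32 KB" figure above can be checked with the LoRa time-on-air formula from Semtech's SX1272/76 datasheets. The parameters below (SF7, 125 kHz bandwidth, 4/5 coding rate, 51-byte payloads, 8-symbol preamble, CRC on, 1% duty cycle) are assumptions matching the post, not measured values:

```python
# Back-of-envelope LoRa transfer-time estimate using the standard
# Semtech time-on-air formula. All radio parameters are assumptions.
import math

def lora_time_on_air(payload_len, sf=7, bw=125_000, cr=1,
                     preamble=8, explicit_header=True, ldro=False):
    """Time on air (seconds) for one LoRa packet, CRC enabled."""
    t_sym = (2 ** sf) / bw                      # symbol duration
    h = 0 if explicit_header else 1
    de = 1 if ldro else 0
    payload_symbols = 8 + max(
        math.ceil((8 * payload_len - 4 * sf + 28 + 16 - 20 * h)
                  / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble + 4.25 + payload_symbols) * t_sym

def transfer_time_hours(total_bytes, chunk=51, duty_cycle=0.01):
    """Wall-clock hours to push total_bytes under a duty-cycle cap."""
    packets = math.ceil(total_bytes / chunk)
    return packets * lora_time_on_air(chunk) / duty_cycle / 3600

print(round(transfer_time_hours(32 * 1024), 1))  # ≈ 1.8 hours
```

That lands in the same ballpark as the post's estimate, before any re-transmissions or protocol overhead.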



  • @jmarcelino 500 m would certainly bring some applications within range :-)



  • Very interesting. With my current technique the image is converted to a bytearray for transmission, so presumably it could be transmitted in 50-byte chunks, given sufficient time? I suppose power could be supplied by a solar panel?

    If none of these are feasible, then just transmitting 'significant events' as detected by the camera/Pi would be good enough, e.g. bird present / bird absent at midday.
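The 50-byte chunking idea above needs sequence numbers so the receiver can reassemble (and detect missing) fragments. A minimal sketch, assuming a 3-byte header (2-byte sequence number plus a 1-byte last-fragment flag) that is purely illustrative, not any standard:

```python
# Split an image buffer into LoRa-sized fragments and reassemble them.
# Header layout (>HB = big-endian uint16 seq + uint8 last-flag) is a
# made-up convention for illustration.
import struct

CHUNK = 47  # 50-byte frame minus the 3-byte header

def fragment(data: bytes):
    """Yield 50-byte frames: 3-byte header + up to 47 payload bytes."""
    pieces = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    for seq, piece in enumerate(pieces):
        last = 1 if seq == len(pieces) - 1 else 0
        yield struct.pack(">HB", seq, last) + piece

def reassemble(frames):
    """Rebuild the original buffer from frames in any order."""
    ordered = {}
    for frame in frames:
        seq, _last = struct.unpack(">HB", frame[:3])
        ordered[seq] = frame[3:]
    return b"".join(ordered[s] for s in sorted(ordered))

image = bytes(range(256)) * 4          # stand-in for a real image buffer
frames = list(fragment(image))
assert all(len(f) <= 50 for f in frames)
assert reassemble(frames) == image
```

A real link would also need acknowledgements and re-transmission of lost fragments, which is the "protocol over raw LoRa" the earlier reply warns about.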



  • If you can work with a 500 metre range (maybe 1 km), WiFi Long Range mode - already in ESP-IDF 2, so it should be coming to Python soon - could maybe do it.



  • @robmarkcole
    The other problem with taking days to send an image is that you have to keep the LoPy running - at full speed - the whole time, so you can rule out battery power.



  • @robmarkcole The payload of LoRa is only around 50 bytes, so using LoRa to transfer your images is out of the question; it is simply not designed for this purpose. You could, however, store them locally on an SD card if cellular/Wi-Fi is not an option. You can then send a notification when it's time to collect your data.



  • @RobTuDelft Interesting, thanks! Would I have to transfer the whole image to the LoPy, or could I just use the LoPy as a LoRa router and transfer the image piecemeal? For my project I do not require fast transfer of the image - if it took a couple of days to send I would be happy.

    4G is another option, but I am thinking about places where there is weak to non-existent mobile reception.



  • If the Pi is close, use Wi-Fi to connect to the Pi and send your pictures over WLAN. Instead of the FiPy you can also hook up a separate cellular module to a WiPy/LoPy.

    Another option is something like this https://www.generationrobots.com/media/3G-GPRS-GPS-Arduino-Shield-With-Audio-Video-Kit.pdf. You can connect the arduino via UART to the wipy/lopy.




  • @robmarkcole
    Regardless of camera support, sending images over LoRa is unrealistic: it's a slow protocol, and the duty cycle restrictions will make any transfer of more than a few bytes take forever.

    The FiPy - provided you have LTE-M network coverage - should be better suited to this.

