Imp, Camera, Action!

Possible?

I have an impExplorer and also an imp005 board. Do either or both of these have the computing horsepower to read a still image from a camera and upload it to the cloud every 10 minutes or so?

I found this from ages past:

https://github.com/electricimp/examples/tree/master/vc0706

…as well as very old forum posts, but nothing in the last three years. But it seems like it should be possible?

The imp005 has a USB port. I have a USB webcam. Could I plug it in? Well, I could plug it in, definitely, but would I be able to get it to work?

USB webcams are possible, but they generally supply data as an isochronous ("isoc") stream, which is not currently supported (we have played around with internal builds and UVC devices, though).

The simplest solution - and this works well - is to use an Arducam mini (SPI interface) and an imp. I can post this code if it’s helpful, though it’s not been polished as yet. Not tried it on an 005 but it should work.

Hi, thanks, Hugo!

Yes, code would help a great deal!

Cost is the big factor. I need some solution that can be as cheap as possible in quantity – we are planning to mass-manufacture. USB cameras can be a dollar or two for cheap, low-resolution (which is all that is needed) models. I don’t find any SPI cameras that are less than ten times that. So if there’s a way to do it, or if anyone has any other ideas, I’m all ears!

Ah, we should definitely talk then. There are some minimal-spec UVC chips we’ve been experimenting with (single-die image sensor, JPEG and USB).

You can probably guess my email :slight_smile:

Very interested in this too.

Came across this from Cypress Semiconductor: http://www.cypress.com/documentation/application-notes/an75779-how-implement-image-sensor-interface-ez-usb-fx3-usb-video

Not directly related to Imp but may help understand image sensor interface + UVC.

At circa $10 (sale price), this option may well be worth looking at - it has a USB port for data transfer. https://www.adafruit.com/product/3202

I know this is an old thread but a copy of this code would be useful to me. There is an empty placeholder for something similar on github… code would be awesome!
https://github.com/electricimp/ArduCam-mini-2MP-0V2640

If you switch the branch to “develop” you’ll see the code. It’s not yet ready for release, which is why it’s not on the master branch yet.

Thanks Hugo - missed the obvious. Code will go a long way towards a quick PoC.
Cheers

I just saw this on Kickstarter and it reminded me of this forum post, as I’m sure this is something an Imp could handle with relative ease (i.e. encryption of local image data and transferring the data to the cloud, securely).

One thought I had is that the imp team should maybe consider something like the STM32F417xx chip rather than the STM32F415xx if making a new imp flavour, for example, as I believe it provides a parallel camera interface (DCMI), making a camera option even more viable/probable.

The imp003 has the F405 in it, but parallel camera interfaces take a lot of pins. Strangely, the STM32 series isn’t yet available with a MIPI CSI interface, which is what modern cameras use; that limits camera choice rather a lot.

All I’ve gathered so far is that the STM32…05/15 options are the more commonly used, and that the STM32…07/17 varieties offer some sort of camera interface, as per the application note. It’s all rather confusing as to how effective these are and how many pins they require.

I’ve had a go at using the ArduCam example class from here:

I’m stuck on trying to view the image once it has been sent in base64 encoding.

My device code (omitting camera class):

spi <- hardware.spi257;
// SCK max is 10 MHz for the camera; configure() takes kHz, so 7500 = 7.5 MHz
spi.configure(CLOCK_IDLE_LOW | MSB_FIRST, 7500);

cs_l <- hardware.pinD;

i2c <- hardware.i2c89;
i2c.configure(CLOCK_SPEED_400_KHZ);

// Set up camera
myCamera <- Camera(spi, cs_l, i2c);

myCamera.reset();

myCamera.setRGB();
myCamera.set_jpeg_size(160);

function captureSend() {
    myCamera.capture();
    local img = myCamera.saveLocal(); //img is a blob
    server.log("device got image, size:"+img.len());
    server.log("memory free=" + imp.getmemoryfree());
    agent.send("img", img);
    imp.wakeup(60,captureSend);
}

captureSend();

And agent:

device.on("img", function(img) {
    sendImg(img);
});

function sendImg(img) {
    local headers = {
        "Content-Type" : "text/plain"
    };
    // url and httpPostWrapper() are defined elsewhere in my agent code
    local response = httpPostWrapper(url, headers, http.base64encode(img));
}

I’m sending the base64 encoded data to a file. An example is attached below.
When I try to decode the text using an online tool (OpinionatedGeek’s “Base 64 Decoder (100% client-side)”) and save the result as a .jpg, the file can’t be opened by a picture viewer. Any ideas on what I’m doing wrong? Something to do with encoding/decoding maybe, or is something else needed to convert it to a JPEG?

img_base64.zip (28.3 KB)
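As a side note, a quick way to check the decoded bytes on the receiving side is to look for the JPEG start/end markers (FF D8 … FF D9); if they’re missing, the camera wasn’t producing JPEG data in the first place, rather than the base64 step being at fault. A minimal Python sketch (the helper name is just illustrative):

```python
import base64

def decode_and_check(b64_text):
    """Decode base64 text and report whether it looks like a JPEG.

    A valid JPEG starts with the SOI marker FF D8 and ends with the
    EOI marker FF D9; anything else suggests the camera wasn't in
    JPEG mode or the transfer was corrupted.
    """
    data = base64.b64decode(b64_text)
    is_jpeg = data[:2] == b"\xff\xd8" and data[-2:] == b"\xff\xd9"
    return data, is_jpeg

# Example: round-trip a fake "JPEG" payload through base64
raw = b"\xff\xd8" + b"\x00" * 16 + b"\xff\xd9"
data, ok = decode_and_check(base64.b64encode(raw))
print(ok)   # -> True
```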

I guess I needed to use myCamera.setYUV422(); instead of myCamera.setRGB(); to get a valid jpeg. Getting 1/4 of an image now, uncorrupted about half the time. So I’ve got something to work with. Thanks for the arducam library! :slight_smile:

Is it working now? I have working ArduCam code (thanks to Hugo and the Electric Imp team) if you need it or are interested.

Try decreasing imp-to-camera cable length to help with corruption and partial images. Also you could fiddle with the SCK rate.

Yep, working now, thanks @_john. Still working on some code on the receiving server end, but the data coming from the camera and imp is all good.

Another question. I’m using an imp004m on the Murata breakout board, which I gather has 1MB flash. When I take a photo larger than about 60kB, the imp goes into a repeated restart loop with the “imp restarted, reason: out of memory” error. But imp.getmemoryfree() for images slightly smaller than this still reports more than 100kB available. Is this due to fragmentation? Is there a way to defragment? Or would the best way to store a large variable be to split it in half?

You should have almost 200kB free (with essentially blank Squirrel) - what are you doing with the images and how are you storing them? Growing a blob is more likely to cause fragmentation than allocating the correct size at the start (you can read the image size from the camera) and then filling it. Can you post code?

I’ve done much bigger images on the camera with an 001, but I’ve streamed them up to the agent.

There’s currently no defrag, but this is possible as VM objects can be moved - it’s just not been needed so far.
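In case it helps anyone following along, the preallocation advice might look roughly like this on the device. This is only a sketch: getCaptureLength() and readIntoBlob() are hypothetical method names (check the ArduCam class for the real ones).

```squirrel
// Sketch only: read the captured image size from the camera first,
// then allocate the blob once at full size instead of growing it.
// Method names below are placeholders, not the actual class API.
local imgLen = myCamera.getCaptureLength();   // hypothetical
local img = blob(imgLen);                     // one allocation up front

// Fill the preallocated blob from the camera; appending to a
// growing blob is what fragments the heap.
myCamera.readIntoBlob(img);                   // hypothetical
agent.send("img", img);
```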

Hi Hugo,

I did a test with only this code on the device:

server.log("p1:"+imp.getmemoryfree())
img <- blob(65528);
server.log("p2:"+imp.getmemoryfree())

The log output is:

2017-10-23 07:23:49 +11:00	[Device]	p1:194288
2017-10-23 07:23:49 +11:00	[Device]	p2:128092

All good. If I increase the blob size by 1 byte I get:

2017-10-23 07:25:36 +11:00	[Exit Code]

This is on the Murata imp004m board. Any ideas?

Thanks

Interestingly, the imp002 seems to have exactly the same blob size limit (65528 bytes) before out-of-memory failure, even though the free memory is much smaller (83752 bytes).
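A small arithmetic check on the numbers in this thread (plain Python, nothing imp-specific): 65528 is exactly 8 bytes under a 64KB boundary, which suggests a 64KB cap on a single allocation minus a small header, and the free-memory drop shows a few hundred bytes of overhead beyond the blob’s payload. This is an inference from the logs above, not a documented limit.

```python
# Sanity-check the figures reported in the thread (inference only,
# not a documented Electric Imp limit).

BLOB_LIMIT = 65528      # largest blob() size that succeeded
free_before = 194288    # imp.getmemoryfree() before allocation
free_after = 128092     # imp.getmemoryfree() after allocation

# The limit sits 8 bytes under a 64KB boundary, hinting at a 64KB
# single-allocation cap minus a small allocator header.
print(65536 - BLOB_LIMIT)                # -> 8

# The allocation consumed slightly more than the blob's payload.
consumed = free_before - free_after
print(consumed, consumed - BLOB_LIMIT)   # -> 66196 668
```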