I have an impExplorer and also an imp005 board. Do either or both of these have the computing horsepower to read a still image from a camera and upload it to the cloud every 10 minutes or so?
USB webcams are possible, but they generally supply data as an isochronous (isoc) stream, which is not currently supported (we have played around with internal builds and UVC devices, though).
The simplest solution - and this works well - is to use an Arducam mini (SPI interface) and an imp. I can post this code if it’s helpful, though it’s not been polished as yet. Not tried it on an 005 but it should work.
Cost is the big factor. I need some solution that can be as cheap as possible in quantity – we are planning to mass-manufacture. USB cameras can be a dollar or two for cheap, low-resolution (which is all that is needed) models. I don’t find any SPI cameras that are less than ten times that. So if there’s a way to do it, or if anyone has any other ideas, I’m all ears!
I know this is an old thread but a copy of this code would be useful to me. There is an empty placeholder for something similar on github… code would be awesome! https://github.com/electricimp/ArduCam-mini-2MP-0V2640
I just saw this on Kickstarter and it reminded me of this forum post, as I’m sure this is something an Imp could handle with relative ease (i.e. encryption of local image data and transferring the data to the cloud, securely).
One thought I had is that the Imp team should maybe consider something like the STM32F417xx chip rather than STM32F415xx, if wanting to make a new IMP flavour for example, as I believe this provides a parallel camera interface / DCMI, making a camera option even more viable/probable.
The imp003 has the F405 in it, but parallel camera interfaces take a lot of pins. Strangely, the STM series isn’t yet available with a MIPI-CSI interface which is what modern cameras use, which limits camera choice rather a lot.
All I’ve gathered so far is that the STM32…05/…15 options are the more commonly used, and that the …07/…17 varieties are the ones that offer a camera interface (DCMI), per ST’s application notes. It’s all rather confusing as to how effective these are and how many pins they require.
I’ve had a go at using the ArduCam example class from here:
I’m stuck on trying to view the image once it has been sent in base64 encoding.
My device code (omitting camera class):
spi <- hardware.spi257;
// SCK max is 10 MHz for the device
spi.configure(CLOCK_IDLE_LOW | MSB_FIRST, 7500);
cs_l <- hardware.pinD;
i2c <- hardware.i2c89;
i2c.configure(CLOCK_SPEED_400_KHZ);
// Set up camera
myCamera <- Camera(spi, cs_l, i2c);
myCamera.reset();
myCamera.setRGB();
myCamera.set_jpeg_size(160);
function captureSend() {
    myCamera.capture();
    local img = myCamera.saveLocal(); // img is a blob
    server.log("device got image, size: " + img.len());
    server.log("memory free = " + imp.getmemoryfree());
    agent.send("img", img);
    imp.wakeup(60, captureSend);
}
captureSend();
And agent:
device.on("img", function(img) {
    sendImg(img);
});

function sendImg(img) {
    local headers = {
        "Content-Type": "text/plain"
    };
    local response = httpPostWrapper(url, headers, http.base64encode(img));
}
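For comparison, here’s a minimal agent-side sketch that uses the built-in http.post() directly rather than a wrapper (the endpoint URL is a placeholder, not from this thread):

```squirrel
// Hypothetical endpoint - replace with your own collector URL
const IMG_URL = "https://example.com/upload";

device.on("img", function(img) {
    local headers = { "Content-Type": "text/plain" };
    // Base64-encode the raw JPEG blob so it travels as plain text
    local body = http.base64encode(img);
    http.post(IMG_URL, headers, body).sendasync(function(resp) {
        server.log("Upload status: " + resp.statuscode);
    });
});
```

sendasync() keeps the agent responsive; the callback just logs the HTTP status so you can see whether the upload succeeded.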
I’m sending the base64 encoded data to a file. An example is attached below.
When I try to decode the text using an online Base64 decoder (OpinionatedGeek’s client-side tool) and save the result as a .jpg, the file can’t be opened by a picture viewer. Any ideas on what I’m doing wrong? Something to do with encoding/decoding, maybe, or is something else needed to convert the data to a JPEG?
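One quick sanity check (a sketch, not from the original posts): verify the JPEG start/end markers on the device before uploading. If the blob doesn’t begin with 0xFF 0xD8 and end with 0xFF 0xD9, the capture itself is truncated or corrupt, and the base64 step isn’t the problem:

```squirrel
// Returns true if the blob looks like a complete JPEG:
// SOI marker 0xFF 0xD8 at the start, EOI marker 0xFF 0xD9 at the end
function looksLikeJpeg(img) {
    if (img.len() < 4) return false;
    img.seek(0, 'b');                     // seek from beginning
    local soi = (img.readn('b') == 0xFF) && (img.readn('b') == 0xD8);
    img.seek(-2, 'e');                    // seek from end
    local eoi = (img.readn('b') == 0xFF) && (img.readn('b') == 0xD9);
    img.seek(0, 'b');                     // rewind for the caller
    return soi && eoi;
}
```

Calling looksLikeJpeg(img) before agent.send() quickly tells you whether the corruption happens on the camera/SPI side or later in the pipeline.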
I guess I needed to use myCamera.setYUV422(); instead of myCamera.setRGB(); to get a valid jpeg. Getting 1/4 of an image now, uncorrupted about half the time. So I’ve got something to work with. Thanks for the arducam library!
Another question. I’m using the imp004m on the Murata breakout board, which I gather has 1MB flash. When I take a photo larger than about 60kB, the imp goes into a repeated restart loop with the “imp restarted, reason: out of memory” error. But imp.getmemoryfree() for images slightly below that size still reports more than 100kB available. Is this due to fragmentation? Is there a way to defragment? Or would the best way to store a large variable be to split it in half?
You should have almost 200kB free (with essentially blank squirrel) - what are you doing with the images and how are you storing them? Growing a blob is more likely to cause fragmentation than allocating the correct size at the start - you can read this from the camera - then filling it. Can you post code?
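A sketch of that pre-allocation idea (the length accessor and per-byte read are hypothetical; the real ArduCam class reads the FIFO fill level over SPI):

```squirrel
// Ask the camera how many bytes are in its FIFO (hypothetical accessor),
// allocate the blob once at full size, then fill it in one pass -
// no incremental growth, so far less heap fragmentation.
local len = myCamera.getFIFOLength();
local img = blob(len);          // single allocation up front
while (!img.eos()) {
    img.writen(readFIFOByte(), 'b'); // hypothetical per-byte FIFO read
}
```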
I’ve done much bigger images on the camera with an 001, but I’ve streamed them up to the agent.
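As a rough sketch of that streaming approach (message names and chunk size are illustrative, not the actual code used): send the image up in fixed-size pieces so the device never needs one huge contiguous allocation:

```squirrel
// ---- Device side: send an image blob to the agent in 8kB chunks ----
const CHUNK_SIZE = 8192;

function sendInChunks(img) {
    img.seek(0, 'b');
    while (!img.eos()) {
        local n = img.len() - img.tell();
        if (n > CHUNK_SIZE) n = CHUNK_SIZE;
        agent.send("img.chunk", img.readblob(n));
    }
    agent.send("img.done", img.len());
}

// ---- Agent side: reassemble the chunks ----
imgBuffer <- blob();

device.on("img.chunk", function(chunk) {
    imgBuffer.writeblob(chunk);
});

device.on("img.done", function(totalLen) {
    server.log("received " + imgBuffer.len() + " of " + totalLen + " bytes");
    // ... upload imgBuffer here, then reset it for the next image
    imgBuffer = blob();
});
```

The agent has far more memory than the device, so it’s the natural place to hold and upload the full image.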
There’s currently no defrag, but this is possible as VM objects can be moved - it’s just not been needed so far.
Interestingly, the imp002 seems to have exactly the same blob size limit (65528 bytes) before out-of-memory failure, even though the free memory is much smaller (83752 bytes).