imp SPI question with FLIR Lepton camera

Hi all:
I have a FLIR Lepton camera kit (https://www.tindie.com/products/GroupGets/lepton-breakout-board-with-flir-lepton-3/). It uses SPI for the video data and I2C for control.
I'm connecting it to an imp April board (i2c89 and spi257).

I referenced source code for other platforms (STM32, Arduino) to read the camera's video data.
Reference source code: https://github.com/groupgets/LeptonModule/tree/master/software

The problem is that the April board can't read a complete video frame from the camera.
Camera spec: https://cdn.sparkfun.com/datasheets/Sensors/Infrared/FLIR_Lepton_Data_Brief.pdf
Sections "9.2.2.3.1 Establishing/Re-Establishing Sync" and "9.2.2.3.2 Maintaining Sync" of the spec describe how sync is established and why it can be lost.

My failure case is this: the April board reads only a few packets (1~5), then the camera re-synchronizes again.
So I can never receive a full frame of video data.
According to the spec it looks like a timeout issue (9.2.2.3.2: intra-packet timeout).
It seems the camera can't stay in sync with the imp SPI interface.
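(For reference, re-establishing sync per 9.2.2.3.1 just means idling the interface long enough before reading packets again. A rough sketch of what I understand that to look like on the imp side, using the same CS pin as my code below:)

    // Sketch only: de-assert /CS and leave the clock idle for more than 185ms,
    // then re-assert /CS and resume reading 164-byte packets.
    function reestablishSync(cs_l) {
        cs_l.write(1);      // /CS high, no clocking
        imp.sleep(0.2);     // > 185ms per the datasheet
        cs_l.write(0);      // /CS low again; the next readblob() resumes the stream
    }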

Does anyone have experience with this camera?
Has anyone used a Lepton camera with an imp device and successfully received video data?
Thanks.

I’ve not tried one myself (always been tempted to buy one) but pretty sure some customers have.

Can you post your code? Would like to see what clock rate you’re using, etc. Are you needing to stream data, or just collect individual frames?

Hi hugo:
This device uses SPI to transmit the video stream.
SPI mode 3, CS active low; supported clock 2MHz ~ 20MHz.
Below is my code and the log messages.
Each frame should be 60 packets, but the camera keeps re-syncing, so in this case I only read packets 0~4.
Thanks~

class FLIRCam {
    lepton_frame_packet = null;     // one 164-byte VoSPI packet
    lepton_image = null;            // assembled image, 80 16-bit pixels per packet
    _last_crc = 0;                  // CRC of packet 0 of the last frame seen
    _lost_frame_counter = 0;        // count of packets with no usable video data

    _spi = null;
    _cs_l = null;

    constructor(spi, cs_l) {
        _spi  = spi;
        _cs_l = cs_l;
        lepton_frame_packet = blob(164);
        lepton_image = array(6400, 0xFF);
        _cs_l.write(1);
        // VoSPI is SPI mode 3: clock idles high, data sampled on the second edge.
        // The imp clock rate is in kHz, so 15000 = 15MHz (camera supports 2~20MHz).
        _spi.configure(CLOCK_IDLE_HIGH | CLOCK_2ND_EDGE, 15000);
        imp.sleep(0.2);
    }


    function reset() {
        // Full reset of the VoSPI stream: de-assert /CS and idle the clock
        _cs_l.write(1);
        server.log("reset");
        imp.sleep(1.1);     // wait > 1s
        _cs_l.write(0);
        imp.sleep(0.1);
    }

    function createsync() {
        // Establish/re-establish synchronization (datasheet 9.2.2.3.1):
        // keep /CS de-asserted and the clock idle for at least 185ms
        _cs_l.write(1);
        server.log("sync");
        imp.sleep(0.2);     // > 185ms
    }

    function transfer() {
        // Read one 164-byte VoSPI packet with /CS asserted (active low)
        _cs_l.write(0);
        lepton_frame_packet = _spi.readblob(164);
        _cs_l.write(1);

        // Packets whose ID low nibble is 0xF are discard packets and carry no video
        if ((lepton_frame_packet[0] & 0x0f) != 0x0f) {
            local frame_number = lepton_frame_packet[1];

            if (frame_number == 0) {
                // Packet 0: compare the header CRC (bytes 2-3) with the previous
                // frame's CRC to detect the start of a new frame
                local crc = (lepton_frame_packet[2] << 8) | lepton_frame_packet[3];
                if (_last_crc != crc) server.log("New Frame");
                _last_crc = crc;
            }

            if (frame_number < 60) {
                _lost_frame_counter = 0;
                // Unpack the 80 16-bit pixels of this packet into the image buffer
                for (local i = 0; i < 80; i++) {
                    lepton_image[frame_number * 80 + i] =
                        (lepton_frame_packet[2 * i + 4] << 8) | lepton_frame_packet[2 * i + 5];
                }
                server.log(" Frame number:" + frame_number);
            } else {
                _lost_frame_counter++;
            }

            if (frame_number == 59) {
                server.log("frame complete");
            }
        } else {
            // Discard packet: count it; a long run of them means sync has been lost
            _lost_frame_counter++;
        }

        if (_lost_frame_counter > 100) {
            server.log("need resync");
            _lost_frame_counter = 0;
            // Re-establish sync by idling /CS and the clock for > 185ms
            imp.sleep(0.2);
        }
    }

}

class Application {

    static READING_INTERVAL = 0.01;
    flircam = null;

    constructor() {
        local spi  = hardware.spi257;
        local cs_l = hardware.pin1;     // chip select, active low
        cs_l.configure(DIGITAL_OUT, 1);

        flircam = FLIRCam(spi, cs_l);
        imp.wakeup(0.1, loop.bindenv(this));
    }

    function loop() {
        flircam.reset();
        flircam.createsync();
        // NB: for this test loop() never returns; it just keeps pulling packets
        while (1) {
            flircam.transfer();
            imp.sleep(0.001);
            //server.log("transfer done");
        }
    }

}

Application();

Log message:

2018-09-06 14:32:50 +08:00 [Device] reset
2018-09-06 14:32:51 +08:00 [Device] sync
2018-09-06 14:32:51 +08:00 [Device] GPIO3 on
2018-09-06 14:32:51 +08:00 [Device] New Frame
2018-09-06 14:32:51 +08:00 [Device] Frame number:0
2018-09-06 14:32:52 +08:00 [Device] Frame number:1
2018-09-06 14:32:52 +08:00 [Device] Frame number:2
2018-09-06 14:32:52 +08:00 [Device] Frame number:3
2018-09-06 14:32:52 +08:00 [Device] Frame number:4
2018-09-06 14:32:52 +08:00 [Device] Frame number:4
2018-09-06 14:32:52 +08:00 [Device] New Frame
2018-09-06 14:32:52 +08:00 [Device] Frame number:0
2018-09-06 14:32:52 +08:00 [Device] Frame number:1
2018-09-06 14:32:52 +08:00 [Device] Frame number:2
2018-09-06 14:32:52 +08:00 [Device] Frame number:3
2018-09-06 14:32:52 +08:00 [Device] Frame number:4
2018-09-06 14:32:52 +08:00 [Device] Frame number:4
2018-09-06 14:32:52 +08:00 [Device] New Frame
2018-09-06 14:32:52 +08:00 [Device] Frame number:0
2018-09-06 14:32:52 +08:00 [Device] Frame number:1
2018-09-06 14:32:52 +08:00 [Device] Frame number:2

A couple of things

  • Don’t call server.log in code that needs to run quickly on the device. Sending data to the server involves a lot of work, e.g. marshalling the data, encrypting it, pushing it into the TCP stack, etc.

  • Don’t try to process the data into lepton_image during the frame grab. Just keep an array of blobs and process them later, e.g.:

    local lepton_frames = array(60);
    if (frame_number < 60) lepton_frames[frame_number] = lepton_frame_packet;

…then, when you have all the frames, process them into the image array in one go (rough sketch below).
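Something like this, anyway; this is only a sketch, assuming the same spi/cs_l setup and the 60-packets-per-frame layout as your code above, and skipping the reset/re-sync handling:

    // Phase 1: grab raw packets as fast as possible - no logging, no unpacking
    local lepton_frames = array(60, null);
    local received = 0;
    while (received < 60) {
        cs_l.write(0);
        local pkt = spi.readblob(164);              // one 164-byte VoSPI packet
        cs_l.write(1);
        if ((pkt[0] & 0x0f) == 0x0f) continue;      // discard packet, ignore it
        local n = pkt[1];                           // packet number within the frame
        if (n < 60 && lepton_frames[n] == null) {
            lepton_frames[n] = pkt;
            received++;
        }
    }

    // Phase 2: with the whole frame buffered, unpack into 16-bit pixels and log once
    local lepton_image = array(60 * 80, 0);
    foreach (n, pkt in lepton_frames) {
        for (local i = 0; i < 80; i++) {
            lepton_image[n * 80 + i] = (pkt[2 * i + 4] << 8) | pkt[2 * i + 5];
        }
    }
    server.log("frame grabbed and unpacked");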

It’s a bit hard for me to try this without an actual camera, though!

Hi hugo:

You are right: the server.log calls were stopping the camera from keeping sync.
After removing the server.log calls I can receive all 60 packets of a frame.
Thanks~