Hello,
I am working on a project that uses the imp001 as an interface between a web app and custom hardware. I am currently in a study phase, running tests to measure speed and memory.
My first test uses the imp.getmemoryfree() function, and I get two different values depending on whether I run the code from the agent or from the device. This is the program I used:
/* AGENT SIDE */
/*
Imp Agent memory test 001.
This test uses the function imp.getmemoryfree(), which returns the amount of free memory in bytes.
[Agent] Agent memory at start: 1,046,213
*/
server.log("Agent memory at start: " + imp.getmemoryfree());
/* DEVICE SIDE */
/*
Imp Device memory test 001.
This test uses the function imp.getmemoryfree(), which returns the amount of free memory in bytes.
[Device] Device memory at start: 81,092
*/
server.log("Device memory at start: " + imp.getmemoryfree());
As shown in the listings above, these are the results:
Agent memory: 1,046,213 bytes
Device memory: 81,092 bytes
From what I understand, the device memory is the imp's own memory, so the program I write on the device side is stored in the imp's non-volatile memory. The agent memory is the memory of my virtual imp server running in the cloud. So when/if I turn off my imp, the device side of the program isn't lost; when I turn the device on again, it just connects to the agent, which lives permanently in the cloud. Is this correct?
Regarding speed, I know I can measure elapsed time on the device side. I know I can do something similar on the agent side, but the same functions aren't available there: only time() and clock().
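For the device side, a minimal sketch of what I mean by interval measurement (assuming hardware.millis(), the imp device method that returns milliseconds since boot; the "work" in the middle is a placeholder):

```squirrel
/*
Device-side interval measurement sketch.
hardware.millis() returns an integer count of milliseconds since boot;
the counter eventually wraps, so very long intervals need extra care.
*/
local start = hardware.millis();

// ... work to be timed goes here ...

local elapsedMs = hardware.millis() - start;
server.log(format("Elapsed: %d ms", elapsedMs));
```

hardware.micros() could be used the same way when microsecond resolution is needed.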
I have been looking at the clock() function, which is supposed to return the number of seconds elapsed since the start of the process as a floating-point value, so milliseconds and microseconds can easily be derived. My idea is to perform several math calculations and measure how long the tasks take to complete, then compare the results with those returned by other devices running the same tasks.
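A minimal sketch of that benchmark idea (the workload and iteration count are arbitrary placeholders, chosen only to give clock() something measurable):

```squirrel
/*
Benchmark sketch: time a fixed amount of floating-point work with clock().
The math workload below is a hypothetical example, not a standard benchmark.
*/
function benchmark(iterations) {
    local start = clock();
    local acc = 0.0;
    for (local i = 0; i < iterations; i++) {
        acc += math.sqrt(i.tofloat()) * math.sin(i.tofloat());
    }
    local elapsed = clock() - start;
    server.log(format("%d iterations in %.3f s", iterations, elapsed));
    return elapsed;
}

benchmark(100000);
```

Running the same function on the agent, the device, and other platforms would then give comparable timings, assuming clock() behaves consistently everywhere.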
I am concerned about the accuracy of this method, so I ran a first test to check whether the clock() function is accurate:
/*
Imp Agent time test 001.
*/
local StartTime = 0;
local EndTime = 0;
local ElapsedTime = 0;
local Iteration = 0;

function poll() {
    EndTime = clock();
    ElapsedTime = EndTime - StartTime;
    server.log(format("Wakeup elapsed time (%d): %.2fs", Iteration, ElapsedTime));
    StartTime = clock();
    Iteration++;
    imp.wakeup(3, poll);
}

StartTime = clock();
imp.wakeup(3, poll);
This test shows the time elapsed on every "wakeup", but the time shown does not seem to be correct; this is the dump from the log window:
2014-04-23 11:42:45 UTC-5: [Agent] Wakeup elapsed time (0): 2.19s
2014-04-23 11:42:48 UTC-5: [Agent] Wakeup elapsed time (1): 2.06s
2014-04-23 11:42:51 UTC-5: [Agent] Wakeup elapsed time (2): 1.88s
2014-04-23 11:42:54 UTC-5: [Agent] Wakeup elapsed time (3): 2.25s
2014-04-23 11:42:57 UTC-5: [Agent] Wakeup elapsed time (4): 2.00s
2014-04-23 11:43:00 UTC-5: [Agent] Wakeup elapsed time (5): 2.06s
2014-04-23 11:43:03 UTC-5: [Agent] Wakeup elapsed time (6): 1.94s
2014-04-23 11:43:06 UTC-5: [Agent] Wakeup elapsed time (7): 2.31s
2014-04-23 11:43:09 UTC-5: [Agent] Wakeup elapsed time (8): 1.81s
2014-04-23 11:43:12 UTC-5: [Agent] Wakeup elapsed time (9): 2.06s
2014-04-23 11:43:15 UTC-5: [Agent] Wakeup elapsed time (10): 2.00s
As you can see, the timestamps on the left are correct, but the measured interval is wrong. I am surprised, because I expected to read values slightly greater than 3 seconds. Is something wrong with the way I measure the time interval?
Now comes the interesting part: if I run the same code on the device side, I get correct results:
2014-04-23 11:46:52 UTC-5: [Device] Wakeup elapsed time (0): 3.00s
2014-04-23 11:46:55 UTC-5: [Device] Wakeup elapsed time (1): 3.00s
2014-04-23 11:46:58 UTC-5: [Device] Wakeup elapsed time (2): 3.00s
2014-04-23 11:47:01 UTC-5: [Device] Wakeup elapsed time (3): 3.00s
2014-04-23 11:47:04 UTC-5: [Device] Wakeup elapsed time (4): 3.00s
2014-04-23 11:47:07 UTC-5: [Device] Wakeup elapsed time (5): 3.00s
2014-04-23 11:47:10 UTC-5: [Device] Wakeup elapsed time (6): 3.00s
2014-04-23 11:47:13 UTC-5: [Device] Wakeup elapsed time (7): 3.00s
2014-04-23 11:47:16 UTC-5: [Device] Wakeup elapsed time (8): 3.00s
2014-04-23 11:47:19 UTC-5: [Device] Wakeup elapsed time (9): 3.00s
2014-04-23 11:47:22 UTC-5: [Device] Wakeup elapsed time (10): 3.00s
So what exactly is happening? Does the clock() function return the time of the machine the code is actually running on, or something else? I am a bit confused.
Apart from this method, is there another way to find out the speed of my imp virtual server?
Thanks.