Twitter V 1.2.0 Streaming timeouts

I’m using V1.2.0 to search for keywords and flash lights in response.

Works great, but I get timeouts with error code 23 every so often, so everything goes quiet for 60 seconds, then it recovers and carries on. It’s not a deal-breaker, but it would be nice to improve the response times. Can I safely reduce the timeout, or do you think I’ll get rate limited?
Here’s a typical response (truncated):
2015-11-03 22:26:39 UTC+0 [Agent] Found: STRICTLY
2015-11-03 22:26:40 UTC+0 [Agent] Found: CAMERON
2015-11-03 22:26:41 UTC+0 [Agent] Found: SWIFT
2015-11-03 22:27:29 UTC+0 [Agent] Stream Closed (23: ers\/2542537356\/1439735203","default_profile":true,"default_profile_image":false,"following":null,"follow_request_sent":null,"notifications":null},"geo":null,"coordinates":null,"place":null,"contributors":null,"retweeted_status":{"created_at":"Mon Nov 02 04:26:28 +0000 2015","id":661036629583261696,"id_str":"661036629583261696","text":"Sonia Ben Ammar \u0e19\u0e32\u0e07\u0e41\u[…truncated…]
2015-11-03 22:27:29 UTC+0 [Agent] ERROR: 23: ers\/2542537356\/1439735203","default_profile":true,"default_profile_image":false,"following":null,"fo

Here’s my agent code

`twitter.stream(search_string, onTweet,onError);

function onError(errors) 
    {
    // Log all the error messages
    foreach (err in errors) 
        {
        server.error(err.code + ": " + err.message);
        }

    // Close the stream then re-open it
    twitter.closeStream();
    device.send("reset_count",1);
    twitter.stream(search_string, onTweet,onError);
    }`
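
Would it be safe to reconnect after a short delay rather than straight away? This is roughly what I had in mind - an untested sketch, and the 5-second delay is just a guess on my part, not a recommended value:

`// Untested sketch: reconnect after a short, fixed delay instead of immediately
function onError(errors)
{
    // Log all the error messages
    foreach (err in errors)
    {
        server.error(err.code + ": " + err.message);
    }

    // Close the stream, then re-open it after a short pause
    twitter.closeStream();
    device.send("reset_count", 1);
    imp.wakeup(5.0, function() {
        twitter.stream(search_string, onTweet, onError);
    });
}`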

Any help appreciated. Be gentle, I’ve only had the imp for a week.

What sort of volume of tweets are you getting? I suspect that if you’re getting too much volume, the data could be truncated with a 23.

Thanks Hugo. It’s about one every second with popular search terms. Is that excessive? It’s a bit odd: Twitter stops responding and then, after roughly 60 seconds, the stream is closed and returns error 23. It happens about once every 5 minutes. Looking at the library code, the resting re-connect timeout is 60 seconds.

No, that doesn’t seem over the top. Error 23 is an error from libcurl (within our system), which implies that there was an issue writing the received data - see http://curl.haxx.se/libcurl/c/libcurl-errors.html

…but it’s also strange that you’re seeing valid JSON in the error message field. Hmm. Will try running your code here and seeing if we can replicate.

Thanks. Here’s the full code, just in case I’m doing something really stupid.
Some redundant/inefficient stuff in here as I’m still learning :slight_smile:
I had a few memory warnings early on when I was logging a lot of tweets - could it be memory related? Thanks for your help.

John (wheable)

`// Use #require to load the Twitter library
#require "Twitter.class.nut:1.2.0"

s1 <- "fred astaire".toupper();
s2 <- "david cameron".toupper();
s3 <- "strictly come dancing".toupper();
s4 <- "taylor swift".toupper();
s5 <- "david beckham".toupper();
s6 <- "mad museum".toupper();

search_string <- s1 + "," + s2 + "," + s3 + "," + s4 + "," + s5 + "," + s6;
local count = 0;

// Instantiate a Twitter object (enter your own Twitter app secrets)
local twitter = Twitter("****", "****", "******", "****");

// Start searching Twitter stream with default string

function onTweet(tweetData)
{
    // Search tweetData for text and flash appropriate lights
    local result = tweetData.text.toupper();
    local leds = 0;

    if (result.find(s1) != null)
    {
        server.log(format("Found: %s ", s1));
        leds = leds | 0x01;
    }
    if (result.find(s2) != null)
    {
        server.log(format("Found: %s ", s2));
        leds = leds | 0x02;
    }
    if (result.find(s3) != null)
    {
        server.log(format("Found: %s ", s3));
        leds = leds | 0x04;
    }
    if (result.find(s4) != null)
    {
        server.log(format("Found: %s ", s4));
        leds = leds | 0x08;
    }
    if (result.find(s5) != null)
    {
        server.log(format("Found: %s ", s5));
        leds = leds | 0x10;
    }
    if (result.find(s6) != null)
    {
        server.log(format("Found: %s ", s6));
        leds = leds | 0x20;
    }

    if (leds != 0)
    {
        device.send("tweet", leds);
        //server.log(format("%s - %s - %s", tweetData.text, tweetData.user.screen_name, tweetData.id_str));
    }
    else
    {
        //server.log(format("Tweet but no match in text : %s - %s - %s", tweetData.text, tweetData.user.screen_name, tweetData.id_str));
    }

    local tweetTable =
    {
        "in_reply_to_status_id" : tweetData.id_str,
        "status" : format("@%s Thanks for saying hello! (tweeted at %i)", tweetData.user.screen_name, time())
    };

    //twitter.tweet(tweetTable);    // disabled for the moment
}

function onError(errors)
{
    // Log all the error messages
    foreach (err in errors)
    {
        server.error(err.code + ":-- " + err.message);
    }

    // Close the stream then re-open it
    twitter.closeStream();
    device.send("reset_count", 1);
    twitter.stream(search_string, onTweet, onError);
}

function myFind(match_string, string)
{
    // Naive substring search: returns true if match_string occurs within string
    local lm = match_string.len();
    local ls = string.len();
    local m = 0;
    local s = 0;
    local match = false;
    while ((s < ls) && (!match))
    {
        // server.log(format(" %s %s ", match_string.slice(m, m + 1), string.slice(s, s + 1)));
        if (match_string.slice(m, m + 1) == string.slice(s, s + 1))
        {
            m++;
            if (m == lm) match = true;
        }
        else
        {
            // On a mismatch, step back to just after where this attempt began
            // and start matching from the first character again
            s = s - m;
            m = 0;
        }
        s++;
    }
    return match;
}
function requestHandler(request, response) {
    server.log("received http request");
    try {
        if ("setting" in request.query) {
            // 'setting' is a URL-encoded parameter, i.e. '/setting=4'
            local settingValue = request.query.setting;
            server.log(request.query.setting);
            // Use the 'response' object to acknowledge reception of the request
            // to the request's source. '200' is the HTTP status code for 'OK'
            response.send(200, "Setting received and applied");
        }
    } catch (error) {
        // Something went wrong; inform the source of the request
        // '500' is the HTTP status code for 'Internal Server Error'
        response.send(500, error);
    }
}

// Register the handler function as a callback
http.onrequest(requestHandler);
twitter.stream(search_string, onTweet, onError);

`

Imp (device) code:

`// Create global variables for the six LED pins
// The <- is Squirrel's way of creating a global variable and assigning its initial value
count <- 0;
led1 <- hardware.pin1;
led2 <- hardware.pin2;
led3 <- hardware.pin5;
led4 <- hardware.pin7;
led5 <- hardware.pin8;
led6 <- hardware.pin9;

// Configure pins

// Configure each 'led' to be a digital output with a starting value of digital 0 (low, 0V)
led1.configure(DIGITAL_OUT, 0);
led2.configure(DIGITAL_OUT, 0);
led3.configure(DIGITAL_OUT, 0);
led4.configure(DIGITAL_OUT, 0);
led5.configure(DIGITAL_OUT, 0);
led6.configure(DIGITAL_OUT, 0);

// Function called to turn the LED on or off
// according to the value passed in (1 = on, 0 = off)
function setLedState(state) {
//server.log("Set LED to state: " + state);
if (state&0x01) led1.write(1);else led1.write(0);
if (state&0x02) led2.write(1);else led2.write(0);
if (state&0x04) led3.write(1);else led3.write(0);
if (state&0x08) led4.write(1);else led4.write(0);
if (state&0x10) led5.write(1);else led5.write(0);
if (state&0x20) led6.write(1);else led6.write(0);
}
function reset_count(parameter)
{
    count = 0;
}

function poll()
{
    // Bump the counter and clear the LEDs
    count++;
    //server.log(count + " seconds since last error");
    setLedState(0);
    // Wake up in 0.5 seconds and do it again
    imp.wakeup(0.5, poll);
}
// Register handlers for incoming "tweet" and "reset_count" messages from the agent
agent.on("tweet", setLedState);
agent.on("reset_count", reset_count);
poll();
`

Hi Again,
I think I may have stumbled on the problem with my code. I think I’m not processing the tweets quickly enough. If I remove all my searches of the tweetData.text string
and just log the fact that a matching tweet arrived, then the 23 error disappears. I get the occasional ‘28’, but I reconnect immediately so that’s not a problem.

I’m trying to set up to filter tweets on six search terms and then determine which of the terms matched in the agent code.

This code is slowing things down:

`local result = tweetData.text.toupper();
if (result.find(s1) != null)
{
    //
}`
I’ll see if I can’t speed things up. If you have any tips …

Regards

John (wheable)
P.S. Didn’t realise that I was talking to the CEO and co-founder. Nice product, great customer service :slight_smile:

Ahhh!! Spoke too soon - the problem is still there. It seems to be less frequent, though.

Shouldn’t be your search - that will be happening very fast (it’s running on an Amazon server!), but with the full code we can take a look.

Note that you can make the device side code a bit more elegant, eg:

`// Create a global array called 'leds' holding the pin objects
// The <- is Squirrel's way of creating a global variable and assigning its initial value
count <- 0;
leds <- [hardware.pin1, hardware.pin2, hardware.pin5, hardware.pin7, hardware.pin8, hardware.pin9];

// Configure pins
foreach(led in leds) led.configure(DIGITAL_OUT, 0);

// Function called to turn each LED on or off
// according to the corresponding bit of the value passed in (1 = on, 0 = off)
function setLedState(state) {
    //server.log("Set LED to state: " + state);
    for(local a = 0; a < leds.len(); a++) leds[a].write((state & (1 << a)) ? 1 : 0);
}`
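
It might also be worth logging the agent’s free memory when tweets and errors arrive, to see whether you really are running short. A quick sketch (logMemory is just an example name):

`// Sketch: log the agent's free memory; call this at the top of onTweet and onError
function logMemory(where) {
    server.log(where + ": " + imp.getmemoryfree() + " bytes free");
}`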

Thanks. Didn’t know I could do that.
Is leds an array of integers?
What’s the best way of implementing an array of variable-length strings? (My individual search terms.) Just being lazy - I’ll figure it out eventually :-)

Regards

John

leds is an array of objects - pin objects. Pretty much everything is an object which means you can put it in an array or table, and iterate over it.

An array of strings is just:

`array <- [ "one", "two", "three".toupper(), "four" ];
`

and so on. You’d then build the search string something like this:

searchstring <- ""; foreach(a in array) searchstring+=a+","; // remove trailing comma searchstring = searchstring.slice(0,-1);

Hi Again,
Still getting ‘23’ errors. The agent stops talking to me for around two and a half minutes now, and then returns with the error. It doesn’t seem related to the hit rate on my searches. I’m getting a few ‘Memory briefly exceeded’ messages as well (around 2500KB, I think).
Any ideas? Can you reproduce the error? It’s really frustrating because it works really well for a while and then stops.

Regards

John

FWIW, I’ve been trying this without error, just using the search string ‘#movember,#starwars’ and a simple error pick-up that just says ‘error’ when one is detected. So far zippo, but I’m going to leave it running overnight to see.

Hmm, I’m getting errors regularly, every two or three minutes.
Let me know if there’s anything I can try this side.
Thanks for looking at this.

Regards

John

I’ve now reproduced the error, @wheable.

Twitter is returning some big block of data that the agent can’t handle (data too large), but why Twitter is returning all this stuff, and not the simple tweet data the class expects, I’m not sure. I think it might be a ‘while you were away’ digest, but I can’t be sure without the complete text of the data returned.

I’m now working with a modified form of the library to trap 23 errors and simply restart the stream in response to one. I’ll let you know how it goes.

Error 23 is, as @hugo says, a libcurl error (‘write error’), so that’s obviously being tripped when all the Twitter data is being written by the agent’s OS code into the agent memory space and the limit is reached.
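
For now, the application-level workaround I’m trying looks something like this - a sketch based on the onError in your agent code, not the library change itself:

`// Sketch: restart the stream when a 23 (or 28) arrives; log anything else
function onError(errors) {
    local restart = false;
    foreach (err in errors) {
        if (err.code == 23 || err.code == 28) {
            // libcurl write error / timeout - just re-open the stream
            restart = true;
        } else {
            server.error(err.code + ": " + err.message);
        }
    }

    if (restart) {
        twitter.closeStream();
        twitter.stream(search_string, onTweet, onError);
    }
}`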

That’s a relief - I was beginning to think it was just me :slight_smile:
I’ve created new Twitter auth tokens, as the old ones were about six months old and well abused. Strangely, I haven’t had a ‘23’ error for 25 minutes (a record). Not sure if this is relevant.

I’m getting excited now :slight_smile:

John

I’m afraid I’m going to describe this as a bug in our Twitter library. Look at this:

`
local data = null;
try {
    data = http.jsondecode(body);
} catch(ex) {
    _buffer += body;
    try {
        data = http.jsondecode(_buffer);
    } catch (ex) {
        return;
    }
}
`

That says: try and decode the current buffer as a single JSON object; if that doesn’t work, append the newly-arrived data to the buffer and then try it again as a single JSON object. But what if it now contains two JSON objects? The http.jsondecode call will fail (it expects one object only), and then every subsequent call will also fail, as more and more incoming data gets whacked into the buffer until the agent’s memory fills up.

According to https://dev.twitter.com/streaming/overview/processing, streaming search results are delimited by "\r\n"… the loop should probably look something like:

`
_buffer += body;
while (1) {
    local pos = _buffer.find("\r\n");
    if (pos == null) {
        break;
    }
    local message = _buffer.slice(0,pos);
    _buffer = _buffer.slice(pos+2);

    local data = null;
    try {
        data = http.jsondecode(message);
    } catch (ex) {
        continue;
    }
    ... process 'data' ...
}`

Peter

Thanks Peter. Is this something that can be fixed easily? I’m happy to test the library to destruction :slight_smile:

Regards

John

Well spotted, @peter - I’ll submit a pull request to the library.

Hmmm. Even with @peter’s code change, I’m still seeing these.

Message relayed. There’s a long pause (i.e. longer than I’d expect given the rate of tweets I’m seeing otherwise) and then the 23 is tripped… or sometimes not. Even when it is, the size of _buffer (see code above) is only 4KB.

i.e. we can deal with 23s politely, but not (yet) prevent them.