If I’ve got some data in the agent that’s compressed with gzip, is it possible to send it to myself (via my own agent endpoint) so that nginx does the work for me? What sort of headers would I need to put on it to “distract” the agent so it doesn’t compress it on the way out, before decompressing it on the way back in again? I’d rather not send it to an external web service, if I can help it.
Would be even better if gzip compression and decompression methods could be exposed in the http object.
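Something like this sketch is what I had in mind (names like `/inflate` and `inflateViaLoopback` are just placeholders I made up). Whether the front end really does leave an already-gzipped body alone on the way out and inflate it on the way back in is exactly the part I'm unsure about, so treat this as an illustration of the idea rather than something I've got working:

```squirrel
// Sketch only: assumes the front end honours Content-Encoding on a request
// sent to the agent's own URL and hands the handler a decompressed body.
inflatedData <- null;  // root-table slot to hold the result

http.onrequest(function(request, response) {
    if (request.path == "/inflate") {
        // If the front end has inflated the gzipped body for us,
        // request.body is now the original, uncompressed data
        inflatedData = request.body;
        response.send(200, "OK");
        return;
    }
    response.send(404, "Not found");
});

function inflateViaLoopback(gzippedBlob) {
    local url = http.agenturl() + "/inflate";
    local headers = {
        "Content-Encoding": "gzip",   // the bit I'm hoping will "distract" the front end
        "Content-Type": "application/octet-stream"
    };
    // sendsync() blocks; sendasync() with a callback would be kinder in real code
    local resp = http.post(url, headers, gzippedBlob.tostring()).sendsync();
    server.log("Loopback returned " + resp.statuscode);
}
```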
There’s nothing that currently exists to help. We’ve – purposefully – taken a very light hand when it comes to intermediate processing between agents and the outside world. This isn’t to say we wouldn’t consider it, but it doesn’t exist at the moment.
It’s an interesting request, though – at the very least the gzip-in-agent idea could be a good way to deal with over-large server.save data. I’ll make sure that @harryreiss sees this (and I just have!).
That’s exactly my problem. I have about 100-200K of JSON that I need to hold between restarts. With gzip, it compresses down to less than 20K, ideal for plonking in the server.save area.
So, interestingly (and this is an implementation detail, subject to change), your server.save() data is converted to JSON before being persisted. This is the cause of the restrictions in the https://electricimp.com/docs/resources/serialisablesquirrel/#json-is-a-blob-free-zone documentation (which casually mentions JSON, without explaining why; now you know).
Ironically, it means that persisting squirrel strings that contain JSON will result in expansion of your data. Obviously not in the realms of 20K versus 100K, though.
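To make the expansion concrete, here's what that extra layer of JSON encoding does to a string that is itself JSON. I'm illustrating it with http.jsonencode(), which follows the same escaping rules; the key name `reading` is just an example:

```squirrel
// A squirrel string that already contains JSON
local inner = "{\"temp\":21.5,\"ok\":true}";

// Persisting it means wrapping it in another layer of JSON,
// so every " becomes \" (and any \ would become \\)
local persisted = http.jsonencode({ reading = inner });
server.log(persisted);
// {"reading":"{\"temp\":21.5,\"ok\":true}"} -- noticeably longer than the original
```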
We’re looking at relaxing this restriction (by changing the underlying storage format), so that you could store squirrel blobs in server.save(). No promises on timeline, however (the change is not as simple as you might think).
Adding explicit gzip support to the agent sounds like a good idea to me, though I don’t have control over that particular part of the roadmap.
I’ll raise this story internally as a candidate for our development backlog (we have fortnightly reviews to prioritise these). Thanks for suggesting the feature!
This was discussed in an imp team review yesterday. Have you considered using MessagePack in place of JSON?
Yes, I had a conversation with Peter about this a while ago. MessagePack has more efficient serialisation than JSON, but it doesn’t compress the data. GZIP’s use of a dictionary and Huffman encoding goes way beyond what MessagePack is capable of. HTML/JavaScript and JSON compress brilliantly with gzip.
Do you know if progress was ever made on a gzip library? We are trying to make some external API calls that are returned gzipped. Currently, we have a small proxy server in place that does the decompression for us, but it would be nice to make the calls directly.
If they’re external requests, can’t you just make sure the response comes back gzip-encoded (Content-Encoding: gzip)? The agent front end should just unzip it for you.
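Something along these lines ought to be all it takes (the URL is a placeholder), assuming the front end really does inflate the response transparently before it reaches squirrel:

```squirrel
// Sketch: ask the remote API for a gzipped response and let the agent's
// front end decompress it before the callback sees it.
local url = "https://api.example.com/big-report";   // placeholder URL
local headers = { "Accept-Encoding": "gzip" };

http.get(url, headers).sendasync(function(resp) {
    if (resp.statuscode == 200) {
        // resp.body should already be plain text/JSON here,
        // assuming the front end has handled the Content-Encoding
        local data = http.jsondecode(resp.body);
        server.log("Got " + resp.body.len() + " bytes after decompression");
    } else {
        server.log("Request failed: " + resp.statuscode);
    }
});
```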
Regarding the feature, it’s (hopefully) still on their backlog.