My Node.js server receives this JSON stream from Twitter and forwards it to the client:
stream.twitter.com/1/statuses/filter.json?track=gadget
The data returned to the client is "chunked" JSON, and both JSON.parse(chunk) and eval("(" + chunk + ")") on the client side lead to parsing errors. Concatenating the pieces and waiting for the "end" event is not a solution either.
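What I suspect is happening, sketched below with illustrative names (onChunk and handleStatus are mine, not from my code): Twitter's streaming API delimits each status with \r\n, but the 'data' chunks can cut an object anywhere, so a single chunk is usually not valid JSON on its own. Buffering and splitting on the delimiter recovers whole objects:

```javascript
// Sketch, assuming the stream separates JSON objects with "\r\n"
// (as Twitter's streaming API does). Names are illustrative.
var buffer = '';

function onChunk(chunk, handleStatus) {
  buffer += chunk;                  // a chunk may end mid-object
  var parts = buffer.split('\r\n');
  buffer = parts.pop();             // keep the trailing partial piece
  parts.forEach(function (line) {
    if (line.length) handleStatus(JSON.parse(line)); // each full line is complete JSON
  });
}

// Example: one object arrives split across two chunks
var seen = [];
onChunk('{"user":{"screen_na', function (s) { seen.push(s); });
onChunk('me":"alice"}}\r\n', function (s) { seen.push(s); });
// seen[0].user.screen_name === 'alice'
```

So neither JSON.parse nor eval can work on a raw chunk; only lines assembled from the buffer parse cleanly.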
I noticed that earlier samples used something like this on the client side, which apparently worked at the time:
socket.onmessage = function(chunk) { data = eval("(" + chunk.data + ")"); alert(data.user.screen_name); };
On the client side I use this, which leads to a parsing error:
var socket = new io.Socket(); socket.on('message', function(chunk) { var data = eval('(' + chunk + ')'); });
I know that it successfully returns a JSON piece with:
var socket = new io.Socket(); socket.on('message', function(chunk) { alert(chunk); });
Server:
response.on('data', function (chunk) { client.each(function(e) { e.send(chunk); }); });
Has something changed or what else am I doing wrong?
UPDATE: The "end" event does not fire, presumably because it's a stream?
http.get({
  headers: { 'content-type': 'application/json' },
  host: 'stream.twitter.com',
  path: '/1/statuses/filter.json?track...'
}, function(res) {
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    client.each(function(e) { e.send(chunk); });
  });
  // does not fire
  res.on('end', function () { });
  ...
I am studying the difference between HTTP/1.0 and HTTP/1.1 with regard to sending data in chunks (chunked transfer encoding).
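As I understand it, with HTTP/1.1 the body can be framed as hex-length-prefixed chunks instead of relying on the connection closing as in HTTP/1.0. A rough sketch of decoding that framing by hand (Node itself already decodes this before emitting 'data', so the chunk boundaries seen there are transport artifacts, unrelated to JSON object boundaries):

```javascript
// Sketch: decoding an HTTP/1.1 chunked body by hand.
// Each chunk is "<hex length>\r\n<data>\r\n"; a zero-length chunk ends the body.
function decodeChunked(body) {
  var out = '', pos = 0;
  for (;;) {
    var lineEnd = body.indexOf('\r\n', pos);
    var size = parseInt(body.slice(pos, lineEnd), 16);
    if (size === 0) break;          // terminating zero-length chunk
    out += body.substr(lineEnd + 2, size);
    pos = lineEnd + 2 + size + 2;   // skip the data and its trailing CRLF
  }
  return out;
}

decodeChunked('5\r\nhello\r\n6\r\n world\r\n0\r\n\r\n');
// → 'hello world'
```

So the transport-level chunking is invisible to my handler; the parsing problem must be solved at the application level, by the \r\n delimiters between statuses.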