Reading a large file and sending it through a WebSocket leads to an OutOfMemoryError

#1

Hi,

I am reading a large 25 MiB file and sending it over the WebSocket. After just a few seconds I get an OutOfMemoryError. Below is my code for reference:

byte[] chunk = new byte[CHUNK_SIZE];
int read;
while ((read = fis.read(chunk)) != -1) {
    data.sendBinaryMessage(chunk);
    // Can't reuse chunk as it's used by the socket
    chunk = new byte[CHUNK_SIZE];
}

I suspect the file read is faster than the socket write. Is there a design pattern I can follow to overcome this situation?

Thanks

Vishal


#2

Hi Vishal,

the correct way of doing such things would be

http://autobahn.ws/python/tutorials/producerconsumer

Producer/consumer pattern and frame-based/streaming APIs are currently
only available in AutobahnPython Client + Server.

Probably this does not help you out of the box, but it points in the
right direction.

Cheers,
Tobias



#3

Thanks Tobias, and I really appreciate your work on Autobahn.



#4

Vishal,

Is it possible you are losing the network connection and the data is piling up on the client side?

Pat



#5

The connection is intact; the problem is that the read is faster than the write! For now I am creating a 4 MB buffer to read chunks, and a response from the server triggers the allocation of the next chunk.
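
Roughly, this is the shape of it (the WebSocketConnection interface and the ack hook below are placeholders for my actual client API, and the ack itself is just an application-level response the server sends back after each chunk):

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Arrays;
import java.util.concurrent.Semaphore;

public class AckThrottledSender {

    static final int CHUNK_SIZE = 4 * 1024 * 1024;  // the 4 MB chunk mentioned above

    private final Semaphore ack = new Semaphore(1); // one permit = one chunk in flight

    // Call this from the WebSocket message handler whenever the server's ack arrives.
    public void onServerAck() {
        ack.release();
    }

    // Stand-in for whatever client class "data" is in the original snippet.
    interface WebSocketConnection {
        void sendBinaryMessage(byte[] payload);
    }

    public void sendFile(FileInputStream fis, WebSocketConnection data)
            throws IOException, InterruptedException {
        byte[] chunk = new byte[CHUNK_SIZE];
        int read;
        while ((read = fis.read(chunk)) != -1) {
            ack.acquire();                                       // wait until the previous chunk is acked
            data.sendBinaryMessage(Arrays.copyOf(chunk, read));  // send only the bytes actually read
        }
        fis.close();
    }
}

With at most one chunk outstanding, memory use stays bounded no matter how slow the connection is.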

