Sending Image Stream over Socket Issue - Android

Posted 2019-03-30 09:16

Question:

I've implemented an application that takes a picture with the smartphone camera and sends it over a socket to the server.

I'm using the following code to read the image file stored locally and send it in successive chunks over the socket:

FileInputStream fileInputStream = new FileInputStream( "my_image_file_path" );
ByteArrayOutputStream buffer = new ByteArrayOutputStream();

int nRead;
byte[] data = new byte[16384];

try {
    while( (nRead = fileInputStream.read(data, 0, data.length)) != -1 ){
        buffer.write(data, 0, nRead);
        networkOutputStream.write( buffer.toByteArray() );
        buffer.flush();
    }
} catch( IOException e ){
    e.printStackTrace();
}

The issue I'm facing is that changing the size of the byte array data[] affects how much of the image actually reaches the server.

The screenshots from the original post (not reproduced here) showed the partially received image for each buffer size: byte[16384], byte[32768], byte[65536], and so on.

As you can imagine, I could find a buffer size that happens to transmit one particular image in full, but such an ad hoc solution is not acceptable, since images of arbitrary size may need to be sent.

It seems to me that the problem lies in the way I am reading the image file into the buffer. Can you help me?

Thanks in advance!

Answer 1:

The ByteArrayOutputStream is redundant: on every iteration you write its entire accumulated contents to the socket, so each new chunk is sent along with a copy of everything read before it. Write each chunk directly to the network stream instead. Change your loop as follows:

int nRead;
byte[] data = new byte[16384];

// try-with-resources closes the file even if an I/O error occurs
try (FileInputStream fileInputStream = new FileInputStream("my_image_file_path")) {
    while ((nRead = fileInputStream.read(data)) != -1) {
        // write only the bytes just read, straight to the socket
        networkOutputStream.write(data, 0, nRead);
    }
    networkOutputStream.flush();
} catch (IOException e) {
    e.printStackTrace();
}
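
Note also that a raw socket stream gives the receiver no way to tell where the image ends unless you close the connection after sending. If the connection must stay open for further traffic, a common approach is to prefix the data with its length. Below is a minimal sketch of that idea; sendFile is a hypothetical helper (not from the original question), and it assumes the server reads the length header first:

import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical helper: length-prefixed framing so the receiver knows
// exactly how many bytes belong to this image.
static void sendFile(File file, OutputStream networkOutputStream) throws IOException {
    DataOutputStream out = new DataOutputStream(networkOutputStream);
    out.writeLong(file.length());          // 8-byte length header
    byte[] data = new byte[16384];
    int nRead;
    try (FileInputStream in = new FileInputStream(file)) {
        while ((nRead = in.read(data)) != -1) {
            out.write(data, 0, nRead);     // raw image bytes follow
        }
    }
    out.flush();
}

On the server side, read the eight-byte length with DataInputStream.readLong() and then read exactly that many bytes; the copy loop itself is the same as in the corrected answer above.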