I'm writing an application where I need to read blocks from a single file; each block is roughly 512 bytes. I also need to write blocks simultaneously.
One of the ideas I had was a BlockReader that implements Runnable, a BlockWriter that implements Runnable, and a BlockManager that manages both the reader and writer.
The problem I am seeing with most examples I have found is locking issues and potential deadlock situations. Any ideas on how to implement this?
I would recommend the book Java Concurrency in Practice, in this case section 5.3 (the producer-consumer pattern).
Your solution could look something like this:
BlockingQueue<Data> queue =
    new LinkedBlockingQueue<Data>(MAX_BLOCKS_IN_QUEUE_UNTIL_BLOCK);

for (int i = 0; i < MAX_DATA_PRODUCERS; i++) {
    new Thread(new DataProducer(queue)).start();
}
new Thread(new DataConsumer(queue)).start();
Obviously DataProducer and DataConsumer are Runnables.
class DataProducer implements Runnable {
    ...
    queue.put(data); // blocks if MAX_BLOCKS_IN_QUEUE_UNTIL_BLOCK blocks
                     // are already waiting to be written.
                     // This prevents an OutOfMemoryError.
    ...
}
class DataConsumer implements Runnable {
    ...
    try {
        while (true) {
            writeData(queue.take()); // blocks until a block of data is available
        }
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
    ...
}
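Putting it together, a minimal end-to-end sketch could look like the following. Note the assumptions: Data, BlockCopy, the file names, and the queue capacity of 64 are all placeholders I made up, the producer reads 512-byte blocks from one file while the consumer writes them to a separate file, and the consumer simply runs until interrupted (a real application would want a clean shutdown signal, e.g. a poison-pill element in the queue).

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BlockCopy {
    static final int BLOCK_SIZE = 512;                      // from the question
    static final int MAX_BLOCKS_IN_QUEUE_UNTIL_BLOCK = 64;  // assumed capacity

    // Assumed value type: one block of bytes plus its offset in the file.
    static class Data {
        final long offset;
        final byte[] bytes;
        Data(long offset, byte[] bytes) { this.offset = offset; this.bytes = bytes; }
    }

    static class DataProducer implements Runnable {
        private final BlockingQueue<Data> queue;
        private final RandomAccessFile in;
        DataProducer(BlockingQueue<Data> queue, RandomAccessFile in) {
            this.queue = queue;
            this.in = in;
        }
        public void run() {
            byte[] buf = new byte[BLOCK_SIZE];
            long offset = 0;
            try {
                int n;
                while ((n = in.read(buf)) != -1) {
                    byte[] block = new byte[n];
                    System.arraycopy(buf, 0, block, 0, n);
                    queue.put(new Data(offset, block)); // blocks when the queue is full
                    offset += n;
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }
    }

    static class DataConsumer implements Runnable {
        private final BlockingQueue<Data> queue;
        private final RandomAccessFile out;
        DataConsumer(BlockingQueue<Data> queue, RandomAccessFile out) {
            this.queue = queue;
            this.out = out;
        }
        public void run() {
            try {
                while (true) {
                    Data d = queue.take(); // blocks until a block is available
                    out.seek(d.offset);
                    out.write(d.bytes);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        BlockingQueue<Data> queue =
            new LinkedBlockingQueue<Data>(MAX_BLOCKS_IN_QUEUE_UNTIL_BLOCK);
        RandomAccessFile in = new RandomAccessFile("blocks.in", "r");   // hypothetical files
        RandomAccessFile out = new RandomAccessFile("blocks.out", "rw");
        new Thread(new DataProducer(queue, in)).start();
        new Thread(new DataConsumer(queue, out)).start();
    }
}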
You can have an array of locks, e.g. 32 of them, and use the index of the block as a hash to determine which lock to obtain, as sketched below. This way you can have concurrent reads and writes (most of the time) and still ensure you won't be reading or writing the same block in multiple threads.
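A minimal lock-striping sketch along those lines; StripedBlockLocks and its methods are hypothetical names, and the block index is assumed to be non-negative:

import java.util.concurrent.locks.ReentrantLock;

// 32 stripes; a block's index picks its stripe, so blocks 32 apart share a lock.
class StripedBlockLocks {
    private static final int STRIPES = 32;
    private final ReentrantLock[] locks = new ReentrantLock[STRIPES];

    StripedBlockLocks() {
        for (int i = 0; i < STRIPES; i++) {
            locks[i] = new ReentrantLock();
        }
    }

    // Map a block index to its stripe (assumes blockIndex >= 0).
    private ReentrantLock lockFor(long blockIndex) {
        return locks[(int) (blockIndex % STRIPES)];
    }

    // Run an action while holding the stripe lock for this block.
    void withBlockLock(long blockIndex, Runnable action) {
        ReentrantLock lock = lockFor(blockIndex);
        lock.lock();
        try {
            action.run();
        } finally {
            lock.unlock();
        }
    }
}

Two threads touching different blocks will almost always land on different stripes and proceed in parallel; two threads touching the same block always hash to the same lock, so they can never read and write that block at the same time.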