I have created a simple Storm topology that reads from a file and writes to another file, and it works fine. What I am unsure about is how to make it a real-time topology: if I modify the source file (say, add a new record), that record should appear in my target file without me redeploying the topology on the cluster. What else do I need to configure to achieve this behavior? Below is the code that submits the topology locally:
Config conf = new Config();
conf.setDebug(false);
conf.put(Config.TOPOLOGY_MAX_SPOUT_PENDING, 1);

TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("file-reader", new FileReaderSpout(args[0]));
builder.setBolt("file-writer", new WriteToFileBolt(args[0])).shuffleGrouping("file-reader");

// submit to an in-process LocalCluster, let it run for 10 seconds, then shut down
LocalCluster cluster = new LocalCluster();
cluster.submitTopology("File-To-File", conf, builder.createTopology());
Thread.sleep(10000);
cluster.shutdown();
Having read your comments in the other answer, you probably need to implement a queueing system before updating the rows in the DB.
I have personally used RabbitMQ with Storm, and I know Kafka is also an option. Specifically, try adding a queue such that one part of your topology (it can be outside Storm too) reads off the queue and updates the DB, while the other part implements the processing logic you want.
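To make that concrete, here is a minimal sketch of a bolt that publishes each record to a RabbitMQ queue instead of touching the DB itself, using the RabbitMQ Java client; a separate consumer (which can live outside Storm) would read from that queue and apply the updates. The broker on localhost, the queue name "db-updates" and the "record" tuple field are assumptions for illustration, not part of your code.

import java.nio.charset.StandardCharsets;
import java.util.Map;

import backtype.storm.task.OutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.base.BaseRichBolt;
import backtype.storm.tuple.Tuple;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class QueuePublisherBolt extends BaseRichBolt {

    private OutputCollector collector;
    private transient Connection connection;
    private transient Channel channel;

    @Override
    public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
        try {
            ConnectionFactory factory = new ConnectionFactory();
            factory.setHost("localhost");                                  // assumed broker location
            connection = factory.newConnection();
            channel = connection.createChannel();
            channel.queueDeclare("db-updates", true, false, false, null); // durable queue, assumed name
        } catch (Exception e) {
            throw new RuntimeException("Could not connect to RabbitMQ", e);
        }
    }

    @Override
    public void execute(Tuple tuple) {
        try {
            // Hand the record off to the queue; a consumer outside Storm updates the DB.
            byte[] body = tuple.getStringByField("record").getBytes(StandardCharsets.UTF_8);
            channel.basicPublish("", "db-updates", null, body);
            collector.ack(tuple);
        } catch (Exception e) {
            collector.fail(tuple);                                         // let Storm replay the tuple
        }
    }

    @Override
    public void cleanup() {
        try {
            channel.close();
            connection.close();
        } catch (Exception ignored) {
        }
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // Terminal bolt: nothing is emitted downstream.
    }
}

The design point is that a tuple is only acked once the message has been handed to the broker; if the publish fails, the tuple is failed and Storm replays it.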
Implementing a trigger to send events to the Storm topology is probably a bad idea, unless you have no other option.
-- Michael
What you can probably do is use a message queue integrated with your Storm cluster. Kafka could be a very good candidate for this. It is basically a publish-subscribe messaging system: producers are responsible for adding messages to the queue, and consumers on the other end retrieve them.
So if you integrate Kafka with Storm, then as soon as your producer sends/publishes a message to the queue it will be available to your Storm topology. There is something called KafkaSpout, which is a normal spout implementation capable of reading from a Kafka queue.
So it goes like this: your topology starts with a KafkaSpout (subscribed to a particular topic) that emits as soon as it receives anything, and then you chain the output to your corresponding bolts.
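For illustration, here is a rough sketch of how the topology from your question could be rewired to start from a KafkaSpout, using the older storm-kafka module that matches the backtype.storm API in your snippet and following the same snippet style. The ZooKeeper address, the topic name "file-records" and the spout id are assumptions you would replace with your own; WriteToFileBolt is the bolt from your question.

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.spout.SchemeAsMultiScheme;
import backtype.storm.topology.TopologyBuilder;
import storm.kafka.BrokerHosts;
import storm.kafka.KafkaSpout;
import storm.kafka.SpoutConfig;
import storm.kafka.StringScheme;
import storm.kafka.ZkHosts;

// ZooKeeper ensemble the Kafka brokers register with (assumed address)
BrokerHosts hosts = new ZkHosts("localhost:2181");

// topic to subscribe to, ZK root for offset storage, and a consumer id (assumed names)
SpoutConfig spoutConfig = new SpoutConfig(hosts, "file-records", "/kafka-spout", "file-to-file");
spoutConfig.scheme = new SchemeAsMultiScheme(new StringScheme()); // deserialize messages as plain strings

TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("kafka-reader", new KafkaSpout(spoutConfig));
builder.setBolt("file-writer", new WriteToFileBolt(args[0])).shuffleGrouping("kafka-reader");

Config conf = new Config();
// on a real cluster, submit with StormSubmitter.submitTopology(...) instead of a time-boxed LocalCluster
new LocalCluster().submitTopology("Kafka-To-File", conf, builder.createTopology());

With this in place, whatever process appends records to your source file just publishes each new record to the topic, and it flows through the running topology immediately, with no redeployment.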
You can also look at Kestrel as an alternative to Kafka. You should choose based on what exactly solves your purpose.