
Running Mule Flows in Parallel

Posted 2019-07-30 04:54

Question:

I have two flows in Mule that I want to run in parallel. The first flow should transfer a file from a remote machine to a local directory over SFTP (it does this non-stop, as long as the file is constantly updated in the remote directory). The second flow must take the data in the file and update/insert it into the database by invoking a Pentaho Kettle transformation/job (also a continuous process, as long as the files keep coming in). However, when I run my application, it somehow bypasses the first flow and only tries to perform the second flow. How can I fix this? Here is my Mule flow:

<flow name="flow1">
  <sftp:inbound-endpoint
    address="sftp://username:password@ip_address:22/path"
    responseTimeout="1000" />
  <echo-component />
  <file:outbound-endpoint path="/path/to/OutputFolder" 
    responseTimeout="10000"/>
</flow>

<flow name="flow2"> 
  <custom-transformer class="org.transformation.kettle.InvokeMain" /> 
</flow> 

Answer 1:

Your second flow has no inbound endpoint, so it never receives a message. Give it a file:inbound-endpoint that picks up the file dropped by the first flow; each flow then polls its own source independently, and the two run in parallel:

<flow name="flow1">
  <sftp:inbound-endpoint
      address="sftp://username:password@ip_address:22/path"
      responseTimeout="1000" />
  <logger level="INFO"
      message="#[message.payloadAs(java.lang.String)]" />
  <file:outbound-endpoint path="/path/to/OutputFolder" />
</flow>

<flow name="flow2"> 
  <file:inbound-endpoint path="/path/to/OutputFolder"
      fileAge="10000" />
  <custom-transformer class="org.transformation.kettle.InvokeMain" /> 
</flow>

Notice that I've:

  • Replaced the super old <echo-component /> with the more modern logger. It logs the message payload, which I think was your intention.
  • Dropped the useless responseTimeout="10000" from the file:outbound-endpoint.
  • Set a fileAge of 10 seconds on the inbound file endpoint to prevent it picking up a file that is still being written by the SFTP inbound endpoint. Tune the value if it proves too big or too small (see the sketch after this list).
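Note that with the config above, a file left in /path/to/OutputFolder is deleted by the file transport once flow2 has consumed it. If you would rather keep processed files around, a minimal sketch of one option is the Mule 3 file transport's moveToDirectory attribute, which archives each file after it is read (the /path/to/Processed directory and the 5-second pollingFrequency here are assumptions for illustration, not part of the original config):

<flow name="flow2">
  <!-- Poll the shared folder every 5 seconds; only pick up files
       untouched for at least 10 seconds, then move them to an
       archive directory (assumed path) instead of deleting them -->
  <file:inbound-endpoint path="/path/to/OutputFolder"
      pollingFrequency="5000"
      fileAge="10000"
      moveToDirectory="/path/to/Processed" />
  <custom-transformer class="org.transformation.kettle.InvokeMain" />
</flow>

This also makes reprocessing easier: anything that failed in the Kettle step can simply be moved back from the archive directory into the polled folder.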