I need to kick off a process every 60 seconds, but I want to make sure the previous run finishes before I start a new one. What would be the best practice to achieve this?
Boomi has native functionality to prevent a process from running simultaneously with itself. Click the Options tab in the process build canvas and make sure the 'Allow Simultaneous Execution' checkbox is unchecked.
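For intuition, here is a conceptual sketch (plain Python, not Boomi code or any Boomi API) of what that unchecked option effectively gives you: a scheduled run that is skipped entirely if the previous run is still in flight. The function and variable names are my own illustration.

```python
import threading
import time

# A non-blocking lock stands in for Boomi's "one execution at a time"
# guard: if the previous run still holds it, this cycle is skipped.
_run_lock = threading.Lock()

def scheduled_run(do_work):
    # Try to take the lock without waiting; a held lock means the
    # previous scheduled run has not finished yet.
    if not _run_lock.acquire(blocking=False):
        return "skipped: previous run still executing"
    try:
        do_work()
        return "completed"
    finally:
        _run_lock.release()
```

The key design point is `blocking=False`: an overlapping run is dropped rather than queued, which matches the "don't start a new one until the previous ends" behavior the question asks about.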
It also sounds like you basically want this thing to run back to back continuously. I'm picturing a recursive layout like this:
The 'Worker Process' executes a 'Worker Executor Process' with the 'Wait for process to complete' and 'Abort if subprocess fails' options unchecked. The 'Worker Executor Process' has a 5-second wait/delay to allow the originating 'Worker Process' to complete, and then executes 'Worker Process' again. Displayed below.
Worker Process (executes 'Worker Executor Process' at the end with wait for process to complete & abort if subprocess fails options unchecked):
Worker Executor Process (waits 5 seconds and executes 'Worker Process' with wait for process to complete & abort if subprocess fails options unchecked):
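The two-process handoff above can be sketched in plain Python (again, not Boomi code — threads stand in for Boomi's "execute without waiting for completion", and the 5-second delay is shortened). All names here are my own illustration:

```python
import threading
import time

def worker(do_work, runs_left, done):
    # 'Worker Process': do the actual work for this cycle.
    do_work()
    if runs_left > 1:
        # Execute the 'Worker Executor Process' without waiting for it
        # to complete, so this run can finish before the next begins.
        threading.Thread(
            target=executor, args=(do_work, runs_left - 1, done)
        ).start()
    else:
        done.set()

def executor(do_work, runs_left, done):
    # 'Worker Executor Process': brief delay so the originating
    # worker run can complete, then kick off the worker again.
    time.sleep(0.05)  # stands in for the 5-second wait
    worker(do_work, runs_left, done)
```

A real Boomi deployment would have no `runs_left` counter — the cycle runs indefinitely — which is exactly what the later replies about threading, resource usage, and stuck updates are reacting to.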
Simply uncheck the "Allow simultaneous executions" option on your process.
While I understand the recursive nature, I would be concerned about threading and resource usage in a constantly recursive main process. I'd also imagine it would be difficult to diagnose any patterns that evolve over time. And a perpetual process may leave pending Boomi updates stuck while they wait for executions to complete.
I agree - there are some quirks. I wouldn't be too worried about resources, though, since the resource usage would be nearly the same as running on a 1-minute schedule.
Not necessarily. In essence, you've got a host thread that stays active, constantly spawning a new thread that recursively keeps this cycle going.
I don't see how the host thread stays active if the sub-process it executes is considered a separate and distinct process execution, which puts an execution record in the execution queue to be run completely separately from the parent executor.
Where would the host process go? It keeps track of the sub-processes to show in process reporting. Agreed that the sub-processes would in fact be separate thread executions, but it isn't a true fire-and-forget.
Yes, I agree with Brian.
The host process keeps track of the sub-processes, waiting for them to finish before it can execute its next run (i.e., it isn't a true fire-and-forget).
The sub-process would have its own separate execution log, as would every sub-process spawned off of it in the above scenario.
I would also suggest against this and lean toward a scheduled event. The worst that happens is a lot of failed executions because the process is already running, which you can easily filter out of the log with the 'hide successful executions with zero documents' option. Otherwise, if you want and need a process that runs more often than once every minute (extended if still running), then you should be using a listener, in my opinion.
Although enabling 'Allow Multiple Executions' would work as a feature, I would not recommend it for batch-based processes. We don't know what step the earlier execution is at when the next execution kicks off, which could cause the same data to be processed twice, or missed, across the two executions. It may also cause other issues, as both processes may try to access the same resources (say, the same file in an SFTP folder).
Here is a design orchestration we have implemented to solve a similar situation with our customers:
1. Build a sub-process that accepts a canonical message in XML or JSON. In this message, include the process component ID, atom ID, and status. In the above example, these would be the component ID of the 'Do work' process, the atom ID of the production atom, and a status of 'complete'.
2. You could pass this message to a queue or a web services listener that would trigger the process on that particular atom.
* You may still implement delay (wait for 5 sec or so) at the web services level before you
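For illustration, the canonical message from step 1 might be built like this. The field names are assumptions of mine for the sketch, not a Boomi-defined schema:

```python
import json

def build_status_message(process_component_id, atom_id, status):
    # Hypothetical canonical message carrying the process component ID,
    # atom ID, and status described in step 1 above.
    return json.dumps({
        "processComponentId": process_component_id,
        "atomId": atom_id,
        "status": status,
    })

# E.g. the 'Do work' process completing on the production atom
# (IDs here are placeholders, not real component/atom IDs):
msg = build_status_message("do-work-component-id", "prod-atom-id", "complete")
```

The receiving listener would parse this payload and use the component and atom IDs to decide which process to trigger, and where.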
However, the 'Allow Multiple Executions' option will work seamlessly for web services, because each request runs as an independent process.
Hope this helps.