On 8/2/2020 11:45 AM, Ted Roche wrote:
This is a classic queue pattern: make a task list of the items to be processed (perhaps breaking each item into multiple tasks), run multiple workers against the list, and have them update the status as they go. As Christof says, ultimately you have to LOCK, UPDATE, UNLOCK and think about REFRESH.
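For what it's worth, here is a rough sketch of what that claim-a-row loop could look like in VFP. The table name (taskq), the field names (id, status, started, done), the status codes, and DoSlowApiCall() are all made up for illustration; treat it as the pattern, not working code for your setup.

    * One worker: claim pending rows, do the slow work outside the lock
    SET REFRESH TO 1, 1          && re-read shared data often (the REFRESH part)
    USE taskq SHARED
    SCAN FOR status = "P"        && "P" = pending
        IF RLOCK()               && LOCK: claim the row; skip it if another worker has it
            IF status = "P"      && the lock re-reads the record, so re-check the status
                REPLACE status WITH "W", started WITH DATETIME()   && UPDATE
                UNLOCK                                             && UNLOCK before the slow work
                =DoSlowApiCall(taskq.id)   && the slow API call stays outside any lock
                IF RLOCK()
                    REPLACE status WITH "D", done WITH DATETIME()
                    UNLOCK
                ENDIF
            ELSE
                UNLOCK           && another worker already took this one
            ENDIF
        ENDIF
    ENDSCAN
    USE IN taskq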
64,000 records should take seconds, except for the slow parts. Focus on those: Is it a slow API? Is it setup and teardown that you could cache?
Think about separating the slow parts from the fast parts, so each queue item might have a slow part 1 and a fast part 2 completed by different executables. Then you could launch a dozen EXEs to work on the slow parts and a few EXEs to do the fast parts, improving throughput.
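If it helps, launching several copies of each worker from a small controller program could be as simple as something like this; SlowWorker.exe and FastWorker.exe are invented names and the counts are arbitrary, so adjust to taste.

    * Launch the worker EXEs without waiting for them to finish
    LOCAL i
    FOR i = 1 TO 12
        RUN /N SlowWorker.exe    && a dozen copies working the slow API step
    ENDFOR
    FOR i = 1 TO 3
        RUN /N FastWorker.exe    && a few copies for the fast local step
    ENDFOR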
Thanks for the ideas. Honestly, it's the API that's the slow part of the whole mix, and that's on the vendor's programmers; it's out of my control.