Queued API calls?

nickCR

New member
Feb 5, 2010
Hello All,

I'm making several API calls, and because the internet is what it is, these calls sometimes fail. Some of the calls are 'critical' because they update data on the ESP, and due to that are important. We also make non-critical calls, such as stats or Rapleaf.

I want to set up all my 'critical' calls to queue on failure. The problem I see is that different functions get called, each with a different number of variables.

- I thought about resolving this with PHP's function capture (closures), but I'm not sure if that makes the most sense? What other ways could this be done?

Also, how do I make sure that on a 5-minute cron I don't get double calls for the same one?

- Would I just mark the ones that I am processing as in-process or something?

Would you put everything in the queue, even the calls that 'completed' without error, and just mark them as such?

Thanks in advance!
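The "different functions, different number of variables" problem above is usually solved by storing the call as data: a task name plus serialized arguments, dispatched through a registry. A minimal sketch (Python for illustration; the handler names and `enqueue`/`dispatch` helpers are made up, and in the real system the row would live in a DB table rather than a dict):

```python
import json

# Hypothetical registry mapping queued task names to handler functions.
HANDLERS = {}

def handler(name):
    """Register a function under a task name so queued rows can find it."""
    def wrap(fn):
        HANDLERS[name] = fn
        return fn
    return wrap

@handler("update_esp_contact")
def update_esp_contact(email, list_id):
    # The real ESP API call would go here.
    return ("esp", email, list_id)

def enqueue(name, *args):
    """Serialize the call; arity no longer matters, args are just JSON."""
    return {"task": name, "args": json.dumps(args), "status": "pending"}

def dispatch(row):
    """Reconstruct and run a queued call, whatever its argument count."""
    fn = HANDLERS[row["task"]]
    return fn(*json.loads(row["args"]))

row = enqueue("update_esp_contact", "a@example.com", 42)
print(dispatch(row))  # -> ('esp', 'a@example.com', 42)
```

The same shape works in PHP with `call_user_func_array()` and `json_decode()`, which avoids trying to serialize closures themselves.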


Would you put everything in the queue, even the calls that 'completed' without error, and just mark them as such?

You're thinking about it backwards, dude. A process doesn't go into the queue on failure (or success); you put it into the queue to be processed, and your application logic determines what happens next depending on failure/success. I.e. if the call fails because a downstream service isn't available, the message is still in the queue to be processed when the service returns. This helps with the scalability and durability of your application.

Look into Amazon's queue service:
Amazon Simple Queue Service (Amazon SQS)

Other cloud platforms have them too, like Microsoft's Azure Service Bus.

Or you can use an on-premises solution like NServiceBus.
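The queue-first flow described above can be sketched like this (Python for illustration; `flaky_call` is a hypothetical stand-in for the downstream service, and the in-memory deque stands in for a durable queue like SQS or a DB table):

```python
from collections import deque

queue = deque()  # stand-in for a durable queue (SQS, a DB table, etc.)

def enqueue(msg):
    """Every call goes into the queue first; success/failure is decided later."""
    queue.append(msg)

def flaky_call(msg):
    # Hypothetical downstream API; raises while the service is unavailable.
    if msg.get("service_down"):
        raise ConnectionError("downstream unavailable")
    return "ok"

def process_once():
    """Pull one message. On failure it goes back on the queue for the
    next run, so nothing is lost while the service is down."""
    if not queue:
        return None
    msg = queue.popleft()
    try:
        return flaky_call(msg)
    except ConnectionError:
        msg["attempts"] = msg.get("attempts", 0) + 1
        queue.append(msg)  # the message survives the failure
        return "requeued"
```

The point is that failure handling is a property of the processing loop, not of the enqueue step: the message stays durable until something actually succeeds.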
I don't really want to use an external service for this, just a table with the fields and a cron job running through it. I understand what you mean about queuing it; that's what some guys I talked to suggested too.

User -> Queue

CRON -> Fail -> Retry
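That table-plus-cron design can be sketched as follows (Python with SQLite for illustration; the table and column names are made up). The key to avoiding double calls from overlapping 5-minute cron runs is claiming a row with a compare-and-swap style UPDATE, which is exactly the "mark it as in-process" idea:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE api_queue (
    id       INTEGER PRIMARY KEY,
    task     TEXT,
    args     TEXT,
    status   TEXT DEFAULT 'pending',  -- pending / processing / done
    attempts INTEGER DEFAULT 0)""")

def enqueue(task, *args):
    conn.execute("INSERT INTO api_queue (task, args) VALUES (?, ?)",
                 (task, json.dumps(args)))

def claim_one():
    """Claim the oldest pending row. The status guard in the UPDATE means
    an overlapping cron run loses the race and skips the row."""
    row = conn.execute(
        "SELECT id, task, args FROM api_queue "
        "WHERE status = 'pending' ORDER BY id LIMIT 1").fetchone()
    if row is None:
        return None
    cur = conn.execute(
        "UPDATE api_queue SET status = 'processing', attempts = attempts + 1 "
        "WHERE id = ? AND status = 'pending'", (row[0],))
    if cur.rowcount == 0:
        return None  # another worker claimed it between SELECT and UPDATE
    return row

def finish(row_id, ok):
    """Mark success as 'done'; put failures back to 'pending' for retry."""
    conn.execute("UPDATE api_queue SET status = ? WHERE id = ?",
                 ("done" if ok else "pending", row_id))
```

Marking completed rows `done` rather than deleting them is one reasonable answer to the earlier question about queuing successful calls too: you get an audit trail and the `attempts` count for free. In production you'd also want a timestamp so rows stuck in `processing` (a crashed cron run) can be reclaimed after a timeout.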