Replies: 3 comments
-
One option to consider is routing the requests through a durable entity that buffers and paces them. The pseudocode could look something like this:

```csharp
public class RateLimiter : IRateLimiter
{
    public int MaxConcurrentRequests { get; set; } = 100;

    [JsonProperty("NextScheduledTime")]
    public DateTime? NextScheduledTime { get; set; }

    [JsonProperty("BufferedRequests")]
    public List<HttpRequestData> BufferedRequests { get; set; } = new List<HttpRequestData>();

    public async Task ScheduleRequests(List<HttpRequestData> newRequests)
    {
        BufferedRequests.AddRange(newRequests);
        if (NextScheduledTime == null)
        {
            await this.ProcessRequests();
        }
    }

    public async Task ProcessRequests()
    {
        var batchToProcess = BufferedRequests.Take(MaxConcurrentRequests);
        // Async code to schedule the HTTP requests and measure how long they take.

        foreach (var response in httpResponses)
        {
            // Code to send a notification back to the orchestration that scheduled
            // the request, saying that the request completed and what the result was.
        }

        // Set the new state.
        BufferedRequests = BufferedRequests.Skip(MaxConcurrentRequests).ToList();
        if (rateLimitHit)
        {
            NextScheduledTime = nextTimeToProcessRequests;
        }
        else
        {
            NextScheduledTime = DateTime.UtcNow;
            await this.ProcessRequests();
        }
    }
}
```

The orchestrator helper function would look something like this:

```csharp
public async Task<List<HttpResponseData>> ScheduleHttpRequests(
    IDurableOrchestrationContext context, List<HttpRequestData> requestsToSend)
{
    // NOTE: Can schedule all requests at once, since the rate limiter will handle pacing.
    await context.CallEntityAsync(new EntityId("RateLimiter", "api/path"), "ScheduleRequests", requestsToSend);
    // Code that waits for events from the entity that contain the responses. Can be one
    // event per HTTP request or batched, depending on the complexity of the rate limiting code.
}
```

Pros:
Cons:
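To make the "waits for events from the entity" step concrete, one way is to have the entity raise an external event back to the calling orchestration, which the orchestration awaits with `WaitForExternalEvent`. This is only a sketch: the event name `"RequestsCompleted"` and the payload shape are assumptions, not part of the pseudocode above.

```csharp
// Hypothetical sketch: block the orchestration until the rate limiter entity signals
// that the batch finished. The event name "RequestsCompleted" is an assumption.
public async Task<List<HttpResponseData>> WaitForRateLimitedResponses(
    IDurableOrchestrationContext context)
{
    // The entity side would raise this event against the orchestration's instance id
    // once it has collected the responses for the batch.
    var responses = await context.WaitForExternalEvent<List<HttpResponseData>>("RequestsCompleted");
    return responses;
}
```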
-
@IanKemp What granularity do the 3rd party endpoints rate limit at, and what is the maximum number of requests in that window? I.e. do they allow max 10 requests per second? 10 requests per minute? Per hour? Per rolling 24-hour period, etc.? Do they allow concurrent requests, and if so, how many? Or is it sequential-only? (You can't start the next request until the current in-flight request completes.)
-
FYI, see also the similar discussion here, which gives sample code for an entity acting as a semaphore: and we have sample code checked into the Netherite repo here, which was used for testing:
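For a rough idea of the pattern (this is only a sketch with illustrative names, not the code from the linked discussion or the Netherite repo), a semaphore-style entity can simply track lease counts and let callers retry when no lease is available:

```csharp
// Minimal sketch of a durable entity acting as a semaphore. All names here
// (SemaphoreEntity, TryAcquire, Release, MaxLeases) are illustrative assumptions.
public class SemaphoreEntity
{
    public int MaxLeases { get; set; } = 5;
    public int ActiveLeases { get; set; }

    // Called by an orchestration before making an HTTP request. Returns true if a
    // lease was granted; on false, the caller waits (e.g. on a durable timer) and retries.
    public bool TryAcquire()
    {
        if (ActiveLeases >= MaxLeases)
        {
            return false;
        }
        ActiveLeases++;
        return true;
    }

    // Called by the orchestration after its request completes.
    public void Release()
    {
        if (ActiveLeases > 0)
        {
            ActiveLeases--;
        }
    }
}
```

Because entity operations execute one at a time, the count can't be corrupted by concurrent callers.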
-
I've been tasked with building an Azure Durable Functions app that is aware of rate limits on HTTP endpoints. Essentially I'm going to have a timer trigger that fires an orchestrator, which fires off an activity that gets a list of data. Using that list, the orchestrator will then fire off a call to an activity for each item in the list; each activity will perform a single HTTP request.
The catch here is that the HTTP endpoints are rate-limited, but don't support status code 429 or the Retry-After header, so I need to be the one limiting my requests to them.
My current solution is for the orchestrator to batch the activities into sets of N items, where N is the maximum number of requests the endpoint allows, and run each batch as a fan-out. After a batch has completed, review the total time it took to complete said batch:
The above seems like it should work, but it feels a little... coarse, and I'm worried there are pitfalls I'm not aware of. Am I overthinking this, or is there a simpler/more effective approach that I'm missing?
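For what it's worth, the batch-and-measure approach described above can be sketched roughly like this (all function and type names are hypothetical, and the endpoint limit of 10 requests per one-minute window is an assumption for illustration):

```csharp
// Rough sketch: fan out one activity per item in batches of N, time each batch,
// and sleep out the remainder of the rate-limit window with a durable timer.
[FunctionName("RateLimitedOrchestrator")]
public static async Task Run([OrchestrationTrigger] IDurableOrchestrationContext context)
{
    var items = await context.CallActivityAsync<List<WorkItem>>("GetWorkItems", null);
    const int maxRequestsPerWindow = 10;   // assumed endpoint limit
    var window = TimeSpan.FromMinutes(1);  // assumed rate-limit window

    foreach (var batch in items.Chunk(maxRequestsPerWindow))
    {
        var started = context.CurrentUtcDateTime;

        // Fan out: one activity (one HTTP request) per item in the batch.
        var tasks = batch.Select(item => context.CallActivityAsync("SendHttpRequest", item));
        await Task.WhenAll(tasks);

        // If the batch finished before the window elapsed, wait out the remainder
        // before starting the next batch. Use a durable timer, never Task.Delay,
        // so the orchestration stays deterministic on replay.
        var elapsed = context.CurrentUtcDateTime - started;
        if (elapsed < window)
        {
            await context.CreateTimer(
                context.CurrentUtcDateTime.Add(window - elapsed), CancellationToken.None);
        }
    }
}
```

Note the use of `context.CurrentUtcDateTime` rather than `DateTime.UtcNow` for timing, which is required for replay-safe orchestrator code.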
I've considered using Polly for this scenario, but I'm not certain if I even need it because I'm trying to control and limit the number of durable functions activities - which will indirectly accomplish rate-limiting the HTTP requests.