AWS Lambda for mobile app and throttling

amazon-web-services aws-lambda

97 views

1 answer


According to the docs, "by default, AWS Lambda limits the total concurrent executions across all functions within a given region to 100."

Consider a simple mobile app using Lambda for back end processing. If I'm understanding the constraint correctly, no more than 100 concurrent executions can happen at one time, meaning that if I have 100 users invoking lambda functions at the same time, there will be throttling constraints?

I understand I can call customer support and increase that limit, but is this the correct interpretation of the constraint? How is this supposed to scale to 1,000, 10,000, or 1,000,000 users?

Author: Jason Strimpel Posted: 18.07.2016 12:03

Answers (1)


0 votes


Solution

Update: Since this answer was written, the default limit for concurrent executions was increased by a factor of 10, from 100 to 1,000. The limit is per account, per region.

"By default, AWS Lambda limits the total concurrent executions across all functions within a given region to 1000."

http://docs.aws.amazon.com/lambda/latest/dg/concurrent-executions.html#concurrent-execution-safety-limit (link visited 2017-05-02)

However, as before, this is a protective control, and AWS support will increase the limit if you present them with your use case and it is approved. There is no charge for opening this type of request in the support center, and no charge for raising your limits.

The Lambda platform may also allow excursions beyond your limit if it deems the action appropriate. The logic behind such a decision isn't documented, but a reasonable assumption is that the platform allows it when the traffic appears to be driven by genuine demand, rather than by a runaway loopback condition in which Lambda functions invoke more Lambda functions, directly or indirectly.

A fun example of a runaway condition might be something like this: a bucket has a create-object event that invokes a Lambda function, which creates 2 objects in the same bucket... which invokes the same Lambda function 2 more times, creating 4 objects... invoking the function 4 times, creating 8 objects... invoking it 8 times, creating 16 objects.

On about the 15th iteration, which would take only a matter of seconds, you would theoretically have 32,768 concurrent invocations trying to create 65,536 objects. Real-world traffic ramps up much more slowly, in most cases.
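The doubling above is easy to verify numerically. This is a minimal sketch of the arithmetic only, not anything AWS-specific; the function name `runaway` is mine:

```python
# Sketch of the runaway-growth arithmetic: each object created in the
# bucket triggers one invocation, and each invocation creates 2 objects.
def runaway(iterations):
    invocations = 1  # the first create-object event fires one invocation
    for _ in range(iterations):
        objects_created = invocations * 2  # each invocation writes 2 objects
        invocations = objects_created      # each new object fires an invocation
    return invocations

# After 15 doublings: 2**15 = 32,768 concurrent invocations,
# which are about to create 65,536 more objects.
print(runaway(15))      # 32768
print(runaway(15) * 2)  # 65536
```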


"if I have 100 users invoking lambda functions at the same time, there will be throttling constraints"

Yes, that's the idea behind "concurrent."

"How is this supposed to scale"

Nobody said it would, with the limit in place.

This limit is a protective control, not a reflection of an actual limitation of the platform.

But also, how likely is it that your users are making concurrent requests to Lambda? Assuming your Lambda function runs for 100ms, you could handle something like 750 invocations per second within a limit of 100 concurrent invocations at a blocking probability of only 0.1%.

(That's an Erlang B calculation, which seems applicable here. Without random arrivals, i.e., with perfectly spaced requests, the "pure" capacity would be 100 × 10 = 1,000 invocations/sec for a 100 ms function.)
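That figure can be reproduced with the standard Erlang B recurrence. A minimal sketch, assuming 100 ms average execution time; the function name `erlang_b` is mine, not from any AWS library:

```python
def erlang_b(servers, offered_load):
    """Erlang B blocking probability via the standard recurrence:
    B(0) = 1;  B(n) = A*B(n-1) / (n + A*B(n-1)), with A = offered load."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# 750 invocations/sec * 0.1 sec per invocation = 75 Erlangs of offered load,
# against a concurrency limit of 100.
blocking = erlang_b(100, 750 * 0.100)
print(blocking)  # roughly 0.001, i.e. ~0.1% of requests throttled
```

Equivalently, by Little's law the steady-state concurrency is only 750/sec × 0.1 sec = 75, comfortably under the limit; the Erlang B term accounts for the randomness of arrivals.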

Author: Michael - sqlbot Posted: 18.07.2016 09:12