λ Serverless Benchmark

An independent and continuous benchmark of serverless providers including AWS Lambda, Google Cloud Functions, Azure Functions, IBM Cloud Functions and Cloudflare Workers

Intro

Help wanted! GitHub Repository

All workloads are Node.js functions. A note on the definition of concurrency: if concurrency is set to 50, the benchmark server starts 50 requests at once, and as soon as it receives a response it starts the next request. However, if responses arrive faster than the server can fire new requests, the actual concurrency may be lower than 50. This will be examined soon.
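For illustration, here is a minimal sketch of that scheme (not the benchmark's actual code): a pool of workers keeps up to `concurrency` requests in flight against a hypothetical endpoint URL, and each worker fires its next request only after its previous response arrives.

```js
// Minimal sketch (not the benchmark's actual code): keep up to
// `concurrency` requests in flight against a hypothetical endpoint.
const https = require('https');

function fireRequest(url) {
  return new Promise((resolve, reject) => {
    const start = Date.now();
    https.get(url, (res) => {
      res.resume();                                   // drain the response body
      res.on('end', () => resolve(Date.now() - start));
    }).on('error', reject);
  });
}

async function runBenchmark(url, concurrency, total) {
  let started = 0;
  const durations = [];
  // Each worker fires its next request as soon as the previous one returns,
  // so at most `concurrency` requests are in flight. If responses arrive
  // faster than new requests can be started, the number of requests actually
  // in flight drops below `concurrency`.
  const worker = async () => {
    while (started < total) {
      started++;
      durations.push(await fireRequest(url));
    }
  };
  await Promise.all(Array.from({ length: concurrency }, worker));
  return durations;
}

// Example: runBenchmark('https://example.com/hello', 50, 500).then(console.log);
```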

I had to cut the query down to the last 3 days because of performance issues. I will fix this as soon as possible.

Overhead

Overhead is defined as the time from request to response, minus the time the function itself took to execute.
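As an illustration of that definition: assuming the deployed function reports its own execution time in the response body (the `executionTime` field below is a hypothetical name, not any provider's API), overhead is the measured round trip minus that reported time.

```js
// Sketch only: compute overhead as round-trip time minus the function's
// self-reported execution time. The `executionTime` field in the response
// body is a hypothetical example, not part of any provider's API.
async function measureOverhead(url) {
  const start = Date.now();
  const res = await fetch(url);            // global fetch (Node.js 18+)
  const body = await res.json();
  const roundTrip = Date.now() - start;
  return roundTrip - body.executionTime;   // time spent outside the function
}
```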

Average

Concurrency: 50, hot invocations only, last 3 days

  • AWS Lambda: 86 ms avg (32,400 data points)
  • Google Cloud Functions: 642 ms avg (32,400 data points)
  • IBM Cloud Functions: 136 ms avg (32,400 data points)
  • Azure Functions: 760 ms avg (15,750 data points)
  • Cloudflare Workers: 70 ms avg (32,400 data points)

Percentiles

Concurrency: 50, hot invocations only, last 3 days

  • AWS Lambda: 75 ms median, 102 ms 90th percentile, 138 ms 99th percentile, 15,338 ms max
  • Google Cloud Functions: 679 ms median, 754 ms 90th percentile, 1,062 ms 99th percentile, 3,078 ms max
  • IBM Cloud Functions: 127 ms median, 190 ms 90th percentile, 265 ms 99th percentile, 8,308 ms max
  • Azure Functions: 704 ms median, 985 ms 90th percentile, 3,240 ms 99th percentile, 12,755 ms max
  • Cloudflare Workers: 60 ms median, 128 ms 90th percentile, 222 ms 99th percentile, 603 ms max
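For reference, the reported median, 90th and 99th percentile values can be derived from the raw overhead samples with a simple nearest-rank calculation; this is only a sketch, and the benchmark may use a different method.

```js
// Sketch: nearest-rank percentile over a list of overhead samples (ms).
// The benchmark itself may use a different interpolation method.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}

// Example: percentile(overheads, 50) is the median, percentile(overheads, 99) the 99th percentile.
```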

Coldstart

The functions are called every 3 hours. With some providers this does not necessarily lead to an actual cold start; such cases are counted in the provider's favor in the percentile metrics. More data soon!
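One common way to tell an actual cold start from a warm invocation is a module-level flag inside the function itself; this is only a sketch of the idea, shown with an AWS Lambda-style handler, and not necessarily how this benchmark detects it.

```js
// Sketch of warm/cold detection inside the function: module-level state
// survives across warm invocations of the same instance, so the flag is
// false exactly once per actual cold start. AWS Lambda-style handler shown;
// other providers have analogous entry points.
let isWarm = false;

exports.handler = async () => {
  const cold = !isWarm;
  isWarm = true;
  return {
    statusCode: 200,
    body: JSON.stringify({ cold }), // lets the caller count actual cold vs. warm
  };
};
```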

Overhead Percentiles

Concurrency: 10

  • AWS Lambda: 589 ms median, 4,621 ms max (1,374 actual cold, 6 warm invocations)
  • Google Cloud Functions: 168 ms median, 6,490 ms max (247 actual cold, 1,133 warm invocations)
  • IBM Cloud Functions: 2,103 ms median, 5,285 ms max (1,356 actual cold, 24 warm invocations)
  • Azure Functions: 5,907 ms median, 71,809 ms max (478 actual cold, 722 warm invocations)
  • Cloudflare Workers: 76 ms median, 200 ms max (192 actual cold, 1,188 warm invocations)

Roadmap

I work on this project in my free time. If you want to support the development, consider becoming a sponsor in exchange for a place on this page to showcase your product.

  • Offer a mobile-friendly version.
  • Working on: open-source the code.
  • Fix query performance.
  • Use the highest Node.js version for all providers.
  • Include coldstart metrics.
  • Include computation speed metrics.
  • Integrate the information into this site.
  • Show real concurrency.
  • Add zeit.co serverless.
  • Add tracking and disclaimer.
  • Add the resource configuration to the info section.
  • Add Kubernetes-based offerings.

Missing something? I'm open to your suggestions! Send me an email or a tweet.

Info / Disclaimer

The #1 rule for this benchmark is neutrality and the use of rigorous methods.

I'm not affiliated with any of the providers and will never accept any compensation in return for improving their data; however, I do accept sponsoring and might show related adverts. I want to be open about the methods used to obtain the data; if you're interested in this information, please read the Medium article. If you are concerned about the practices and metrics used in this benchmark, feel free to contact me via mail or Twitter.

If you refer to this data, please include this project as the source.