I wanted to check the temperature of this project, so I ran a quick, very simple benchmark with wrk and the default example:
```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}
```
Everything default with wrk, regular Ubuntu Linux, Python 3.8.2, latest FastAPI as of now.
```
wrk http://localhost:8000
```
Uvicorn with logging disabled (obviously), as per the README:
```
python3 -m uvicorn fast:app --log-level critical
```
I get very poor performance, way worse than Node.js and really, really far from Golang:
```
Running 10s test @ http://localhost:8000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.83ms  365.59us   3.90ms   75.34%
    Req/Sec     2.74k   116.21     2.98k    65.00%
  54447 requests in 10.00s, 7.37MB read
Requests/sec:   5442.89
Transfer/sec:    754.78KB
```
This machine can do 400k req/sec on a single thread using other software, so 5k is not fast at all. Even Node.js does 20-30k on this machine, so this does not align at all with the README:
> The key features are:
>
> **Fast**: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). One of the fastest Python frameworks available.
Where do you post benchmarks? How did you come to that conclusion? I cannot see that you have posted any benchmarks at all.
Please fix the marketing; it is not at all true.
If you read the README further...
> Independent TechEmpower benchmarks show FastAPI applications running under Uvicorn as one of the fastest Python frameworks available, only below Starlette and Uvicorn themselves (used internally by FastAPI). (*)
>
> To understand more about it, see the section Benchmarks.
Okay, so taking the source you gave me (entirely disregarding my own test), I can read the following:
| Rank | Framework | Responses/sec | % of best |
| -- | -- | -- | -- |
| 268 | fastapi | 159,445 | 2.2% |
| 199 | uvicorn | 382,930 | 5.2% |
| 124 | nodejs | 884,444 | 12.0% |
| 27 | fasthttp | 5,962,266 | 81.2% |
Which is in very stark contrast with the README:

> Very high performance, on par with NodeJS and Go
https://www.collinsdictionary.com/dictionary/english/on-a-par-with
2.2% is not "on par with" 12%. It's like comparing wine with light beer - they are entirely disjoint, you cannot possibly claim light beer gets you as hammered as wine?
And the golang thing.... jeeeez!
@alexhultman Your point might be valid, but I think you might be oversimplifying your tests here. Benchmarks are a tricky thing, but it's important to know what it is that you are comparing.
FastAPI is a web application framework that provides quite a bit more than a bare application server.
So if you are comparing FastAPI to, say, NodeJS, then the test should be done against a web application framework such as NestJS or similar.
Same thing with Golang: the comparison should be against Revel or something like it.
In @tiangolo's documentation on benchmarks you can read:
> If you didn't use FastAPI and used Starlette directly (or another tool, like Sanic, Flask, Responder, etc) you would have to implement all the data validation and serialization yourself. So, your final application would still have the same overhead as if it was built using FastAPI. And in many cases, this data validation and serialization is the biggest amount of code written in applications.
https://fastapi.tiangolo.com/benchmarks/
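To make that point concrete, here is a rough, stdlib-only sketch of the kind of hand-written validation and serialization the quote is talking about. The schema (`name`, `price`) and helper names are invented for illustration; this is roughly the work Pydantic does for you on every request:

```python
import json

def parse_item(payload: str) -> dict:
    # Hand-rolled validation of a hypothetical {"name": str, "price": number} schema.
    data = json.loads(payload)
    if not isinstance(data.get("name"), str):
        raise ValueError("'name' must be a string")
    price = data.get("price")
    if isinstance(price, bool) or not isinstance(price, (int, float)):
        raise ValueError("'price' must be a number")
    return {"name": data["name"], "price": float(price)}

def serialize_item(item: dict) -> str:
    # Hand-rolled serialization back to JSON.
    return json.dumps(item)

print(serialize_item(parse_item('{"name": "hammer", "price": 9.5}')))
# -> {"name": "hammer", "price": 9.5}
```

Multiply this by every endpoint and every nested model, and the "same overhead either way" argument becomes clearer.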
I believe that when the developers say:

> Very high performance, on par with NodeJS and Go

they mean a full application on Golang or NodeJS (on some framework) vs a full application on FastAPI.
I must agree with @alexhultman that the performance claims are misleading... I learned it the hard way too. Performance is not really what it claims to be.
Another example, the one just serving a chunk of text, tells the same story.
To boldly state as the first feature "Fast: Very high performance, on par with NodeJS and Go" is, well... I guess I don't have to say it. It leads to disappointment down the road when you discover the truth.
Probably it would be better to just keep "Among the fastest Python frameworks available" and emphasize the other good features.
There seem to be two intertwined discussions here that I think we can address separately.

### The NodeJS and Go comparison

There is definitely contention around the phrase "on par with NodeJS and Go" in the documentation. I believe the purpose of that phrase was to be encouraging so that people will try out the framework for their purpose instead of just assuming "it's Python, it'll be too slow". However, clearly the phrase can also spawn anger and be off-putting, which would be the opposite of what we're trying to achieve here.

I believe if the comparison is causing bad feelings toward FastAPI that it should simply be removed. We can claim FastAPI is fast without specifically calling out other languages (which almost always leads to defensiveness). Obviously this is up to @tiangolo and we'll need his input here when he gets to this issue.

### FastAPI's Performance

If you ask "is it fast" about anything, there will be evidence both for and against. I think the point of linking to TechEmpower instead of listing numbers directly is so that people can explore on their own and see if FastAPI makes sense for _their_ workloads. However, we may be able to do a better job of guiding people about what is "fast" about FastAPI.

For the numbers I'm about to share, I'm using TechEmpower's "Round 19", looking at only the "micro" classification (which is what FastAPI falls under) for "go", "javascript", "typescript", and "python". I don't use Go or NodeJS in production, so I'm picking some popular frameworks which appear under this micro category to compare: "ExpressJS" (javascript), "NestJS" (typescript), and "Gin" (golang). I don't know how their feature sets compare to FastAPI.

#### Plain Text

I believe this is what most of the comparisons above me are using. FastAPI is much slower than Nest/Express, which is much slower than Gin. Exactly what people are saying above. If your primary workload is serving text, go with Go.

#### Data Updates

Requests must fetch data from a database, update, and commit it back, then serialize and return the result to the caller. Here FastAPI is much faster than NestJS/Express, which are much faster than Gin.

#### Fortunes

This test uses an ORM and HTML templating. Here all the frameworks are very close to each other but, in order from fastest to slowest, were Gin, NestJS, FastAPI, Express.

#### Multiple Queries

This is just fetching multiple rows from the database and serializing the results. Here, FastAPI slightly edges out Gin. Express and NestJS are much slower in this test.

#### Single Query

A single row is fetched and serialized. Gin is much faster than the rest, which are, in order, FastAPI, NestJS, and Express.

#### JSON Serialization

No database activity, just serializing some JSON. Gin blows away the competition. Express, then Nest, then FastAPI follow.

So the general theme of all the tests combined seems to be that if you're working with large amounts of data from the database, FastAPI is the fastest of the bunch. The less database activity (I/O bound), the further FastAPI falls and Gin rises. The real takeaway here is that the answer to "is it fast" is always "it depends". However, we can probably do a better job of pointing out FastAPI's database strengths in the sections talking about speed.
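As a toy illustration (invented here, not TechEmpower code) of why an async framework gains ground as the number of database round trips per request grows, compare 20 simulated queries run concurrently versus sequentially:

```python
import asyncio
import time

async def fake_query(delay: float = 0.005) -> int:
    # Stand-in for one database round trip (network wait, no CPU work).
    await asyncio.sleep(delay)
    return 1

async def twenty_queries_concurrent() -> int:
    # An async handler can overlap all 20 waits, as in the 20-query test.
    results = await asyncio.gather(*(fake_query() for _ in range(20)))
    return sum(results)

def twenty_queries_sequential() -> int:
    # A purely synchronous handler pays for each round trip in turn.
    total = 0
    for _ in range(20):
        time.sleep(0.005)
        total += 1
    return total

start = time.perf_counter()
concurrent_total = asyncio.run(twenty_queries_concurrent())
concurrent_time = time.perf_counter() - start

start = time.perf_counter()
sequential_total = twenty_queries_sequential()
sequential_time = time.perf_counter() - start

# The concurrent version finishes in roughly one round-trip time,
# the sequential one in roughly twenty.
print(concurrent_time < sequential_time)  # True
```

The more of each request that is spent waiting on the database, the more this overlap dominates the interpreter overhead, which matches the pattern in the test results above.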
> if you're working with large amounts of data from the database, FastAPI is the fastest of the bunch
>
> [...]
>
> we can probably do a better job of pointing out FastAPI's database strengths in the sections talking about speed.
Here is a short lesson in critical thinking:
But yes, I guess we should attribute this victory to FastAPI. Because the fact it used PostgreSQL in a test that clearly favors PostgreSQL has nothing at all to do with the outcome. Nothing at all 🎶 😉
And the fact FastAPI scores last in every single test that does not involve the variability of database selection, that is just random coincidence. 🎵 🎹
@alexhultman If you are not happy about the different DB choices of TechEmpower, you can probably raise an issue there (e.g. https://github.com/TechEmpower/FrameworkBenchmarks/issues/2845 - that repo is open to contributions), or pick another comprehensive benchmark you prefer, which we can all benefit from when choosing a framework.
Also, please be reminded that so far everyone replying to you in this thread is a community member only; we are not maintainers of FastAPI. If you want to know who wrote that claim, please use git blame. Please be kind to the people who are trying to have a discussion here.
I think this issue has gone as deep as it goes already, nothing can be said that hasn't already been. Alright, thank you and have a nice day.
I just ran an all-defaults comparison of FastAPI vs ExpressJS:
```
$ wrk http://localhost:8000
Running 10s test @ http://localhost:8000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     3.80ms    0.86ms  21.87ms   94.90%
    Req/Sec     1.33k   145.34     1.44k    90.00%
  26495 requests in 10.01s, 3.59MB read
Requests/sec:   2647.61
Transfer/sec:    367.16KB
```
```
$ wrk http://localhost:3000
Running 10s test @ http://localhost:3000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.43ms  567.31us  16.51ms   91.17%
    Req/Sec     3.58k   583.68     4.10k    88.00%
  71280 requests in 10.00s, 16.25MB read
Requests/sec:   7125.65
Transfer/sec:      1.62MB
```
I love the syntax and ease of use of FastAPI, but it's disappointing to see misleading claims about its speed. 367KB/s is NOT "on par" with 1.62MB/s; Express pushed more than 4× the throughput (and about 2.7× the requests/sec) of "Fast"API.
but it is about twice as fast as Flask:
```
$ wrk http://localhost:5000
Running 10s test @ http://localhost:5000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     7.47ms    2.38ms  27.67ms   82.41%
    Req/Sec    580.47   142.46   848.00     58.50%
  11568 requests in 10.01s, 1.82MB read
Requests/sec:   1155.08
Transfer/sec:    186.12KB
```
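For what it's worth, the requests-per-second figures from the three wrk runs above work out to these ratios (plain arithmetic, numbers copied from the outputs):

```python
# req/sec from the wrk runs above: Express, FastAPI, Flask
express, fastapi, flask = 7125.65, 2647.61, 1155.08

print(round(express / fastapi, 1))  # Express vs FastAPI -> 2.7
print(round(fastapi / flask, 1))    # FastAPI vs Flask   -> 2.3
```

So on this hello-world test, Express does roughly 2.7× the requests of FastAPI, which in turn does roughly 2.3× the requests of Flask.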
How are you running fastapi to ensure that your benchmark is valid?
> How are you running fastapi to ensure that your benchmark is valid?

```
python3 -m uvicorn fast:app --log-level critical
```
I don't know how you did the benchmarks, but from TechEmpower benchmarks, this is the result.
In real-world scenarios like Data Updates, 1-20 Queries, etc., FastAPI is much faster.
| Framework | JSON | 1-query | 20-query | Fortunes | Updates | Plaintext |
| -- | -- | -- | -- | -- | -- | -- |
| fastapi | 171,055 | 66,185 | 13,022 | 52,080 | 5,926 | 159,445 |
| express | 246,627 | 57,588 | 4,261 | 44,166 | 2,075 | 369,533 |
> I don't know how you did the benchmarks, but from TechEmpower benchmarks, this is the result.
> In real-world scenarios like Data Updates, 1-20 Queries, etc., FastAPI is much faster.
>
> | Framework | JSON | 1-query | 20-query | Fortunes | Updates | Plaintext |
> | -- | -- | -- | -- | -- | -- | -- |
> | fastapi | 171,055 | 66,185 | 13,022 | 52,080 | 5,926 | 159,445 |
> | express | 246,627 | 57,588 | 4,261 | 44,166 | 2,075 | 369,533 |
I did exactly what @alexhultman did: created a "hello world" application in both FastAPI and ExpressJS, using all defaults. I didn't optimize anything, then ran the wrk commands as shown in my comment.
I also ran a bare uvicorn server (with hello world app):
```
$ wrk http://localhost:8000
Running 10s test @ http://localhost:8000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.70ms  246.37us   6.30ms   87.68%
    Req/Sec     1.86k    58.33     1.96k    81.50%
  37026 requests in 10.00s, 5.30MB read
Requests/sec:   3701.70
Transfer/sec:    542.27KB
```
and already it's slower than expressjs
@andreixk Please check the benchmarks that I sent; if you think they are inaccurate, please open an issue in TechEmpower's GitHub repository.