FastAPI vs Werkzeug

At the heart of Frappe, and hence ERPNext, is the Werkzeug framework. It is the workhorse that handles requests and responses. Frappe pairs Werkzeug with the Gunicorn server for production setups.

The problem with Werkzeug and Gunicorn is that they are synchronous and WSGI-based, which feels very Python 2. The world has moved on to async and ASGI. In fact, Werkzeug has not received a major release since early 2020.

May I suggest that we look into FastAPI and Uvicorn as replacements for Werkzeug and Gunicorn respectively.

FastAPI works with Jinja2 and similar tools. And it comes with support for additional API styles like GraphQL.

Frappe and ERPNext on FastAPI and Uvicorn may be the way to go for the future.

5 Likes

The FastAPI docs say,

  • Fast: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic).

Starlette is what does the magic.

Example code Applications - Starlette

The scope seems to be about 15 files in the Frappe framework.

I don’t know how sync vs async plays out in code.

WebSockets are a plus. Everything becomes Python.

I’ll try a few things.

4 Likes

Sync is what causes HTTP timeout problems. It also limits the number of connections and requests a server can serve.

Async is the new way. It lets the server take a request, promise a response later, and move on to serve another connection. So it can serve far more connections simultaneously.

To be fair, almost everything used to be written in a synchronous style in the past - including Node (JavaScript callbacks), Java, and Python (WSGI). Now async/await - and ASGI in Python 3 in particular - has become mature.
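To make the WSGI/ASGI contrast concrete, here is a minimal sketch of the two callable shapes. This is illustrative only, not Frappe code; the names wsgi_app/asgi_app are invented:

```python
# Minimal WSGI app (synchronous): one worker thread is tied up
# for the full duration of each request.
def wsgi_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from WSGI"]


# Minimal ASGI app (asynchronous): the server can switch to other
# connections whenever this coroutine is awaiting I/O.
async def asgi_app(scope, receive, send):
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"Hello from ASGI"})
```

Gunicorn speaks the first protocol; Uvicorn speaks the second.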

As an illustration:

Imagine you are at a burger shop like McDonalds or Jollibee (hehehe Filipino).

For Synchronous - WSGI mode:

  1. Customers line up in front of the Cashier.
  2. Front Customer gives the Cashier his or her order.
  3. Cashier takes the order.
  4. The Cashier gives the order to the kitchen.
  5. The Cashier and the Customer wait for the order to come out of the kitchen.
  6. Kitchen gives the Cashier the Order.
  7. Cashier gives the Order to the Customer.
  8. Customer leaves the line.
  9. Cashier serves the next customer.

Note:

  1. If the Kitchen is not able to serve the Order within the timeout limit, the Cashier tells the Customer: oops, sorry, time is up (HTTP timeout).
  2. The Cashier cannot serve the next Customer until the kitchen comes back with the first Customer’s order.
  3. The Customer cannot do anything else while waiting (sync).
  4. The Kitchen can serve one order at a time (sync).

For Async - ASGI mode:

  1. Customers line up in front of the Cashier.
  2. Customer gives the Cashier his or her Order.
  3. Cashier takes the Order.
  4. Cashier gives the Customer a Number which will be called when the Order is done (the “promise”).
  5. Cashier gives the kitchen the order.
  6. Customer leaves the line with the Number and “awaits” the order.
  7. Cashier serves the next Customer.
  8. When the Kitchen staff come out with the order(s), these are delivered to the “await”-ing customers.

Note:

  1. The Cashier can serve the next customer right after taking an order and passing it to the kitchen.
  2. Kitchen staff can work on several orders concurrently (“async”).
  3. Customers can do other things while “await”-ing.
  4. No timeout issue.
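The ASGI flow above can be sketched with asyncio. This is a toy model of the analogy, not real server code; kitchen and cashier are invented names:

```python
import asyncio


async def kitchen(order: str) -> str:
    # Cooking time: an await point, so the cashier is not blocked.
    await asyncio.sleep(0.01)
    return f"{order} ready"


async def cashier(orders):
    # Take every order first, handing each customer a "number"
    # (a task), then let all customers await their food concurrently.
    numbers = [asyncio.create_task(kitchen(o)) for o in orders]
    return await asyncio.gather(*numbers)


results = asyncio.run(cashier(["burger", "fries", "sundae"]))
print(results)  # ['burger ready', 'fries ready', 'sundae ready']
```

All three orders cook during a single 0.01s wait, instead of three sequential waits as in the WSGI version.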
4 Likes

FastAPI is awesome. It would be an awesome replacement, and it is very straightforward.

You can also run gunicorn with async workers.

Maybe try this out in production for a while. Fix issues you face and your proposal will have more weight.

Reference: Settings — Gunicorn 20.1.0 documentation
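For reference, switching worker classes is a command-line change. A sketch, assuming the dotted app paths shown (adjust module paths and the extra pip installs to your setup):

```shell
# Run the existing WSGI app with gevent workers
# (assumes: pip install "gunicorn[gevent]"):
gunicorn --worker-class gevent --workers 4 frappe.app:application

# Or serve an ASGI app through gunicorn using uvicorn's worker class
# (assumes: pip install uvicorn, and app:application is an ASGI callable):
gunicorn --worker-class uvicorn.workers.UvicornWorker --workers 4 app:application
```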

Sorry gevent is still synchronous.

gevent is a coroutine-based Python networking library that uses greenlet to provide a high-level synchronous API on top of the libev or libuv event loop.

It is about keeping Frappe relevant and up to date.
Async technology has matured to the point that everyone and every framework (React, Angular, Java Spring Boot) is rushing to adopt it.

If Frappe does not keep up, it will not be able to compete with those that use async technology.

Also, gevent is related to Gunicorn, not Werkzeug. Frappe can exist without Gunicorn. Tweaking Gunicorn is quite simple, and you don’t even have to go down to gevent’s level. A simple change to the gunicorn command-line parameters (--threads) will do.

This is about the foundational technology on which Frappe stands. It is like having a car with an old engine. Suddenly a new, faster engine is available. Would you continue to tinker with the old engine? Or is it worthwhile to try out the new one?

Revant has shown this can be done with his work on Docker and Kubernetes - which is monumental.

Frappe will benefit a lot from async - await - promises technology available in Python 3 and frameworks like FastAPI (which brings in other modern technologies like GraphQL as a bonus).

Just to clarify.

Starlette, the underlying toolkit used by FastAPI, is what can replace Werkzeug.
FastAPI seems to be opinionated and Flask-like. (Flask uses Werkzeug; FastAPI uses Starlette.)

Frappe Framework has its own opinions so starlette.io gives more control, just like Werkzeug.

The following app.py (adapted from the Starlette docs) looks similar to frappe/app.py.

from starlette.applications import Starlette
from starlette.responses import (
    PlainTextResponse,
    JSONResponse,
)
from starlette.routing import (
    Route,
    Mount,
    WebSocketRoute,
)
from starlette.staticfiles import StaticFiles


def homepage(request):
    return PlainTextResponse("Hello, world!")


def user_me(request):
    username = "starlette"
    return PlainTextResponse("Hello, {}!".format(username))


def user(request):
    username = request.path_params["username"]
    return PlainTextResponse("Hello, %s!" % username)


async def websocket_endpoint(websocket):
    await websocket.accept()
    await websocket.send_text("Hello, websocket!")
    await websocket.close()


def startup():
    print("Ready to go")


async def server_error(request, exc):
    print({"exc": exc})
    return JSONResponse(
        content={"error": exc.detail},
        status_code=exc.status_code,
    )


exception_handlers = {
    404: server_error,
    500: server_error,
}

routes = [
    Route("/", homepage),
    Route("/user/me", user_me),
    Route("/user/{username}", user),
    WebSocketRoute("/ws", websocket_endpoint),
    Mount("/static", StaticFiles(directory="static")),
]

application = Starlette(
    debug=True,
    routes=routes,
    on_startup=[startup],
    exception_handlers=exception_handlers,
)

In the app.py file we have the application object that gunicorn/uvicorn serves.

Edit:

What happens to “from werkzeug.local import Local, release_local”, which stores globals? Many things depend on frappe.local.*

Need to understand starlette and global variables, found this What is the best way to store globally accessible "heavy" objects? · Issue #374 · encode/starlette · GitHub
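One async-friendly replacement for werkzeug.local-style globals is the standard library’s contextvars module: each asyncio task runs in its own copy of the context, so per-request state stays isolated, much as werkzeug.local isolates state per thread. A minimal sketch (current_user is an invented example, not a Frappe or Starlette API):

```python
import asyncio
from contextvars import ContextVar

# A hypothetical stand-in for one frappe.local.* value.
current_user: ContextVar[str] = ContextVar("current_user", default="guest")


async def handle_request(user: str) -> str:
    # Each task sets its own copy; other tasks never see this value.
    current_user.set(user)
    await asyncio.sleep(0)  # yield to other tasks; state stays isolated
    return current_user.get()


async def main():
    # Two "requests" handled concurrently do not clobber each other.
    return await asyncio.gather(handle_request("alice"), handle_request("bob"))


results = asyncio.run(main())
print(results)  # ['alice', 'bob']
```

asyncio.gather wraps each coroutine in a Task, and each Task copies the current context, which is what keeps the two values from leaking into each other.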

4 Likes

Yes. Werkzeug does request-response, and more.

Frappe also uses Werkzeug to manage the “state” - meaning values.

gevent is now available in the Docker images.

Whoever is willing to experiment with the gevent worker class can try the WORKER_CLASS environment variable: https://github.com/frappe/frappe_docker/blob/develop/docs/environment-variables.md#frappe-worker-and-erpnext-worker

2 Likes

Use Frappe Framework with FastAPI Serverless function using Frappe Framework

2 Likes

Hi, I’m new, but as I understand it, werkzeug.local is basically a singleton class that stores objects common to all the instances?
And as I understand it, Starlette handles this in another way because it is async.

Sorry if this sounds stupid, but it helps me understand it better.

I found this library - @revant_one, do you think it can help with this?
https://starlette-context.readthedocs.io/en/latest/quickstart.html

The only thing I’m in doubt about is whether in Werkzeug the context locals are global, or tied to an app lifecycle.

Cheers

Werkzeug and Quart are merging, meaning that out-of-the-box async will be available with an existing dependency. It is noteworthy that this (or any other ASGI implementation) doesn’t provide a lot that actually helps. Django published a roadmap for integrating Python’s async/await three-plus years ago and has now largely delivered on it.

What does this actually solve:

  • The ExpressJS dependency and stack can go away, meaning that all server-side execution is in Python. That said, it isn’t broken and isn’t a point of friction either.
  • Maybe there are some other things

What this doesn’t solve is:

  • The Frappe ORM and document model is synchronous and every call to the database is blocking. Generally this is what is intended. You wouldn’t want the on_submit hook to complete before the validate hook, for example. The document model orchestrates this and deviating from it would not be helpful in most cases.
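For what it’s worth, an async rewrite would not by itself break hook ordering - sequential awaits still run in order; the real cost is that every call in the chain has to become awaitable. A toy illustration (validate/on_submit here are invented stand-ins, not the real Frappe hooks):

```python
import asyncio

order = []


async def validate(doc):
    await asyncio.sleep(0)  # pretend DB round-trip
    order.append("validate")


async def on_submit(doc):
    await asyncio.sleep(0)
    order.append("on_submit")


async def submit(doc):
    # Sequential awaits preserve hook ordering even though each call
    # yields the event loop; async does not have to mean out-of-order.
    await validate(doc)
    await on_submit(doc)


asyncio.run(submit({}))
print(order)  # ['validate', 'on_submit']
```

The orchestration stays the same; only the call signatures change, which is exactly the pervasive rewrite the post describes.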

Trade-offs:

  • I believe the only meaningful gain that an async implementation might deliver on is CPU resource utilization. This means rewriting major portions of Frappe to be async, threaded (details in the Werkzeug release notes) or both. So a single machine could support more concurrent users, which theoretically turns this into a memory-bound or network-bound problem. It doesn’t solve the individual user’s database-bound performance problem, especially on writes. CPU cycles are pretty cheap, VMs with lots of memory raise the cost. Both of these gains pale in comparison to the added complexity of the codebase and the additional ongoing developer time.

Ultimately, I think that using or adding an async python framework in Frappe is not a good use of contributor resources. If the project were starting from scratch, it might be a different story.

5 Likes