
Flask with multiprocessing/threading

From: Martin Piper
Date: 2014-06-28 @ 14:36
Running with Python 2.7.3 on Ubuntu 12.04.3.
cc, gcc, easy_install, pip, Flask, celery, etc. are all freshly installed on a
new Ubuntu installation.

Given this simple code saved to foo.py:

from flask import Flask
import time

app = Flask('foo')

@app.route("/")
def hello():
    print "foo1"
    time.sleep(10)
    print "foo2"
    return "Hello World!"

if __name__ == "__main__":
    app.run()



Using the browser to connect to http://localhost:5000/, I see "Hello World!"
after about 10 seconds. Very good.

Obviously the time.sleep() causes each web request to take just over 10
seconds, and each request waits for the previous one to finish ("foo2" is
printed) before the next one starts ("foo1" is printed). In other words, every
request is handled synchronously.
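
To make the serialization concrete: firing two requests at the same time with
a throwaway script along these lines (check_concurrency.py is just an
illustrative name, not part of foo.py) shows the second request taking roughly
20 seconds instead of 10:

import threading
import time
import urllib2

def fetch(name):
    # time one request against the running foo.py server
    start = time.time()
    urllib2.urlopen("http://localhost:5000/").read()
    print "%s finished after %.1f seconds" % (name, time.time() - start)

threads = [threading.Thread(target=fetch, args=("request-%d" % i,))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()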

It would be really great to find out the minimal set of changes needed to make
each request get processed asynchronously.

I tried app.run(processes=10) and I tried app.run(threaded=True).

Neither seems to enable asynchronous processing in the sample code; the
connections are still processed synchronously.
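
For reference, the only change from the original foo.py in those attempts was
the run line, roughly:

if __name__ == "__main__":
    # one attempt: a pool of worker processes
    app.run(processes=10)
    # the other attempt, instead of the line above: a thread per request
    # app.run(threaded=True)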


So after a little bit more digging...
I looked at http://flask.pocoo.org/docs/patterns/celery/
I followed the sparse instructions and ended up with this code:

from flask import Flask
import time

app = Flask('foo')

@app.route("/")
def hello():
    print "foo1"
    time.sleep(10)
    print "foo2"
    return "Hello World!"

from celery import Celery

def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task
    class ContextTask(TaskBase):
        abstract = True
        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)
    celery.Task = ContextTask
    return celery

from flask import Flask

flask_app = Flask(__name__)
flask_app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)
celery = make_celery(flask_app)

if __name__ == "__main__":
    app.run()



Then running:
celery -A foo.celery worker

I see this warning:
CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']

Then I see lots of:
[2014-06-28 22:26:40,545: ERROR/MainProcess] consumer: Cannot connect to
redis://localhost:6379//: Error 111 connecting to localhost:6379.
Connection refused..
Trying again in 2.00 seconds...


Obviously trying http://localhost:5000/ doesn't work; the connection is
refused.

Why?

Is there a really simple, minimal example with the latest Flask and the latest
Celery (3.1.12) that actually works?
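
To be concrete about what I am hoping for, I imagine the working version would
look roughly like this (only a sketch: it assumes a Redis server really is
running on localhost:6379, and slow_work is just a name I made up for a task
that takes over the 10 second job from the route):

from flask import Flask
from celery import Celery
import time

flask_app = Flask(__name__)
flask_app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)

celery = Celery(flask_app.import_name, broker=flask_app.config['CELERY_BROKER_URL'])
celery.conf.update(flask_app.config)

@celery.task
def slow_work():
    # the 10 second job runs in the celery worker, not in the web process
    time.sleep(10)
    return "done"

@flask_app.route("/")
def hello():
    slow_work.delay()  # queue the job and return immediately
    return "Hello World!"

if __name__ == "__main__":
    flask_app.run()

with the worker started as before (celery -A foo.celery worker) and the web
server started separately with python foo.py.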