librelist archives

Parallel execution with Sqlite access ?

From:
Christophe Meessen
Date:
2013-03-18 @ 10:21
Hello,

I have a web application using jQuery Mobile and a web server in Flask.
In Flask I have a class wrapping execute() calls on an SQLite database.

Most of the web application works well except for one feature. It shows a
list of items where the user can select one and, with up and down buttons,
move the item up or down in the list.

If I click the up or down button at a slow rate (>1s), everything works
fine. If I click faster, I get two types of problems.

What I would like to know is how Flask behaves with respect to concurrent
execution, especially when accessing a database like SQLite, which is not
designed for concurrent access.

May multiple POST requests sent to Flask be executed by different
threads? What about SQLite database access?
Are these concurrent, competing database operations? If the db connect
could fail for some requests, it would explain why some operations are
dropped.

I wish I could pipeline the ops and possibly pack them if they apply to
the same item.
Should I use another database that properly handles concurrent access
and operations?

Re: [flask] Parallel execution with Sqlite access ?

From:
Philip Goh
Date:
2013-03-18 @ 10:30
> May multiple POST requests sent to Flask be executed by different
> threads?

How are you running Flask? If it's just the development server, it runs
in a single thread, so you shouldn't have issues with concurrent access.
If you're running multiple threads/processes, then there is no guarantee
that requests will be handled by the same thread.

> What about SQLite database access?
> Are these concurrent, competing database operations? If the db connect
> could fail for some requests, it would explain why some operations are
> dropped.
> Should I use another database that properly handles concurrent access
> and operations?
>

Use an appropriate database. SQLite is brilliant when you can guarantee
that you'll only ever have one process writing to the database. It sounds
like you're after concurrent writes, so switching to a different database
(PostgreSQL gets my vote) will be a lot easier than writing your own
middleware in an attempt to pipeline writes.

Cheers,
Phil

Re: [flask] Parallel execution with Sqlite access ?

From:
Christophe Meessen
Date:
2013-03-18 @ 17:01
I use nginx with uWSGI, for which I see multiple processes.
Previously I used the Flask server and never saw such problems. So it
could be that POST messages are processed in parallel, and connecting to
the db or performing ops on it fails.
Indeed, moving to PostgreSQL could solve this problem.

What is the overhead of establishing a db connection on each request?
Couldn't this be optimized?


--
Ch.Meessen


Re: [flask] Parallel execution with Sqlite access ?

From:
Juan-Pablo Scaletti
Date:
2013-03-18 @ 17:21
Yes, production servers process requests in parallel. You want this, or
your application would be unbearably slow with many connected users.

As has been said, you should use another database, like PostgreSQL. The
usual approach is to have a pool of connections to reuse, but don't worry
about that: use an ORM like SQLAlchemy to manage those details for you
automatically.

JP

Re: [flask] Parallel execution with Sqlite access ?

From:
Christophe Meessen
Date:
2013-03-19 @ 07:11
I prefer to avoid SQLAlchemy.

How could I create my own pool of connections and use my existing class to
do the operations on the database? I have my own class encapsulating a
persistent connection and all the ops on my database.

Later I might need to connect to other services through SSL connections,
and a pool would be very handy.

I guess each process has its own Flask app instance and these may execute
in parallel. What happens inside one of these processes? Are the requests
processed sequentially, or are they executed in parallel by threads?


--
Ch.Meessen


Re: [flask] Parallel execution with Sqlite access ?

From:
Philip Goh
Date:
2013-03-19 @ 08:08
What you are saying is ringing alarm bells.

Can you elaborate on why you are keen to avoid SQLAlchemy and a proper
RDBMS in favor of rolling your own connection pool and sticking with
SQLite? What problem are you attempting to solve?

Kind regards,
Phil

Re: [flask] Parallel execution with Sqlite access ?

From:
Steven Kryskalla
Date:
2013-03-19 @ 08:41
On Tue, Mar 19, 2013 at 12:11 AM, Christophe Meessen
<christophe@meessen.net> wrote:
> I prefer to avoid SQLAlchemy.
>
> How could I create my own pool of connections and use my existing class to
> do the operations on the database? I have my own class encapsulating a
> persistent connection and all the ops on my database.
>
> Later I might need to connect to other services through SSL connections,
> and a pool would be very handy.
>
> I guess each process has its own Flask app instance and these may execute
> in parallel. What happens inside one of these processes? Are the requests
> processed sequentially, or are they executed in parallel by threads?

There isn't much benefit to having a pool of SQLite connections,
because SQLite is just reading and writing a file on disk.

Yes, each process or thread will execute in parallel; that's why
SQLite can't be used where there are multiple concurrent writers. Only
one connection can write at a time; all the others are locked out. A
connection pool could actually make things worse if you're re-using
locked connections.

http://sqlite.org/faq.html#q5
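
The contention described above is easy to reproduce with two connections
to the same database file. A sketch using the stdlib sqlite3 module
(isolation_level=None puts the connections in autocommit mode, so the
BEGINs below are fully explicit):

```python
import os
import sqlite3
import tempfile

# Two connections to the same database file; timeout=0 makes a busy
# connection fail immediately instead of waiting for the lock.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
a = sqlite3.connect(path, timeout=0, isolation_level=None)
b = sqlite3.connect(path, timeout=0, isolation_level=None)

a.execute("CREATE TABLE items (id INTEGER)")
a.execute("BEGIN IMMEDIATE")             # a takes the write lock
a.execute("INSERT INTO items VALUES (1)")
try:
    b.execute("BEGIN IMMEDIATE")         # b cannot acquire it
    locked = False
except sqlite3.OperationalError:         # "database is locked"
    locked = True
a.execute("COMMIT")                      # releasing the lock frees b
```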

If you're going to try using SQLite with multiple concurrent writers,
you should recreate the connection on every request, add retry logic,
and make the errors very visible when they occur. Doing this is
extremely error-prone, which is why people are recommending a real
RDBMS. But if you have a low number of concurrent writes and program
it properly, SQLite can work.

-Steve

Re: [flask] Parallel execution with Sqlite access ?

From:
Christophe Meessen
Date:
2013-03-19 @ 10:49
Sorry for the misunderstanding, I forgot to clarify that I will of
course switch to PostgreSQL because SQLite won't cut it.

Since I already have my class encapsulating all database ops, I would
like to use it instead of going for SQLAlchemy. I also already have the
database created and populated with data.

What I currently do is shown in the following code snippet, where ShopDB
is a stateless interface class (model) to the database, apart from the
connection.

@app.before_request
def before_request():
     g.db = ShopDB(app.config['SQLIGHT_DATABASE'])

@app.teardown_request
def teardown_request(exception):
     g.db.close()

Are these handlers executed before and after each request?
If yes, this is where to add the code to acquire a db connection from
the pool and release it.
Could there be another place?

I guess the object managing the connection pool is to be instantiated
next to the app instantiation. Right?
So my code would be something like this:

# create Flask application and load configuration
app = Flask(__name__)
app.config.from_pyfile('app.cfg')
sdb = ShopDBPool(app.config['SQLIGHT_DATABASE'])

@app.before_request
def before_request():
     g.db = sdb.connect()   # return a ShopDB object from the pool

@app.teardown_request
def teardown_request(exception):
     sdb.release(g.db)  # return the ShopDB object to the pool
     g.db = None        # make sure the ShopDB instance is not used afterwards

Of course connect() and release() must be thread safe.
Would this code do what I want?
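
A minimal thread-safe pool along those lines can be built on the stdlib
queue.Queue (this ShopDBPool is only a sketch; `factory` stands in for
whatever creates a ShopDB instance):

```python
import queue

class ShopDBPool:
    """Fixed-size, thread-safe pool. connect() blocks until an object
    is free; release() hands it back. queue.Queue does the locking."""
    def __init__(self, factory, size=4):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())

    def connect(self):
        return self._pool.get()   # blocks when the pool is empty

    def release(self, db):
        self._pool.put(db)
```

With this shape, the pool would be created with a factory callable rather
than a path, and the before/teardown handlers above would work unchanged.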



Re: [flask] Parallel execution with Sqlite access ?

From:
Philip Goh
Date:
2013-03-19 @ 11:17
On 19 Mar 2013, at 10:49, Christophe Meessen <christophe@meessen.net> wrote:

> I guess the object managing the connection pool is to be instantiated 
> next to instantiating app. Right ?
> So my code would be something like this:
> 
> # create Flask application and load configuration
> app = Flask(__name__)
> app.config.from_pyfile('app.cfg')
> sdb = ShopDBPool(app.config['SQLIGHT_DATABASE'])
> 
> @app.before_request
> def before_request():
>     g.db = sdb.connect()  # Return a ShopDB object from the pool
> 
> @app.teardown_request
> def teardown_request(exception):
>     sdb.release( g.db ) # return the ShopDB object to the pool
>     g.db = None          # make sure ShopDB instance is not used after
> 
> Of course connect() and release() must be thread safe.
> Would this code do what I want ?

This looks correct. You retrieve the connection from the pool before the
request and then release it on termination. You don't need g.db = None,
but that's a stylistic issue.

As for being thread safe, it's hard to tell, as you are accessing app and
sdb without locking them. However, before worrying about that, ask
yourself whether they need to be thread safe at all. Are you running
multiple threads? From what you described before, where you have nginx
and multiple uWSGI *processes*, you only ever have a single thread
running per process, *unless* you went out of your way to explicitly
spawn more threads. If you didn't spawn more threads, you should be fine
as it is. No need to worry about thread safety.

Kind regards,
Phil

Re: [flask] Parallel execution with Sqlite access ?

From:
Christophe Meessen
Date:
2013-03-19 @ 16:24
Thank you very much Philip. I thought that Flask was multithreaded and
that the handlers might be executed in parallel by concurrent threads. So
if the app is single-threaded, I don't even need a connection pool. My
ShopDB class has no state besides the connection, so the changes could be
even simpler than the proposed code.

--
Ch.Meessen


Re: [flask] Parallel execution with Sqlite access ?

From:
Lucas Vickers
Date:
2013-04-01 @ 14:46
In Python there is really no such thing as concurrent threads within one
GIL. My setup for production deployment (and I assume others') is to use
multiple instances of uWSGI to handle concurrent incoming requests. In
that situation you'll have multiple instances of Python running and no
concurrency issues.

Someone correct me if I'm wrong!
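
For reference, a deployment like that is typically configured along these
lines (a hypothetical uwsgi.ini; `myapp:app` is a placeholder for the
actual module and callable):

```ini
[uwsgi]
; each worker is a separate Python interpreter with its own GIL
module = myapp:app
master = true
processes = 4
threads = 1
socket = /tmp/myapp.sock
```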



Re: [flask] Parallel execution with Sqlite access ?

From:
Olav Grønås Gjerde
Date:
2013-04-15 @ 07:53
Have you tried enabling WAL for SQLite?
https://www.sqlite.org/wal.html
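
For anyone trying it, WAL is a per-database setting enabled with a
pragma. A sketch with the stdlib sqlite3 module ("app.db" is a
placeholder path); in WAL mode readers proceed while one writer commits,
though writes are still serialized:

```python
import sqlite3

conn = sqlite3.connect("app.db")         # "app.db" is a placeholder path
conn.execute("PRAGMA journal_mode=WAL")  # persists in the database file
mode = conn.execute("PRAGMA journal_mode").fetchone()[0]
# once enabled, mode is "wal": readers no longer block on the writer,
# but concurrent writes are still serialized one at a time
```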



Re: [flask] Parallel execution with Sqlite access ?

From:
bruce bushby
Date:
2013-03-19 @ 11:07
Perhaps "pgbouncer" is the Postgres connection pool you're after?

Some more Postgres examples here:
https://github.com/sean-/flask-skeleton


Re: [flask] Parallel execution with Sqlite access ?

From:
Owein Reese
Date:
2013-03-19 @ 11:50
I second using an already established connection pool. There are some
great libraries out there for working with Postgres. Don't forget to
check out psycogreen if you're using greenlets via gevent or eventlet.
On Mar 19, 2013 7:08 AM, "bruce bushby" <bruce.bushby@gmail.com> wrote:

>
> Perhaps "pgbouncer" is the Postgres connection pool you're after?
>
> Some more Postgres examples here:
> https://github.com/sean-/flask-skeleton
>
>
>
>
> On Tue, Mar 19, 2013 at 10:49 AM, Christophe Meessen <
> christophe@meessen.net> wrote:
>
>> Sorry for the misunderstanding, I forgot to clarify that I will of
>> course switch to PostgreSQL because sqlite won't cut it.
>>
>> Since I already have my class encapsulating all database ops I would
>> like to use it instead of going for SQLAchemy. I also already have the
>> database created with data instead.
>>
>> What I currently do is shown in the following code snippet, where ShopDB
>> is a stateless interface class (model) to the database apart for the
>> connection.
>>
>> @app.before_request
>> def before_request():
>>      g.db = ShopDB(app.config['SQLIGHT_DATABASE'])
>>
>> @app.teardown_request
>> def teardown_request(exception):
>>      g.db.close()
>>
>> Are these handlers execute before and after each request ?
>> If yes, this is where to add the code to extract a db connection from
>> the pool and release it.
>> Could there be another place ?
>>
>> I guess the object managing the connection pool is to be instantiated
>> next to instantiating app. Right ?
>> So my code would be something like this:
>>
>> # create Flask application and load configuration
>> app = Flask(__name__)
>> app.config.from_pyfile('app.cfg')
>> sdb = ShopDBPool(app.config['SQLIGHT_DATABASE'])
>>
>> @app.before_request
>> def before_request():
>>      g.db = sbb.connect()  # Return a ShopDB object from the pool
>>
>> @app.teardown_request
>> def teardown_request(exception):
>>      sdb.release( g.db ) # return the ShopDB object to the pool
>>      g.db = None          # make sure ShopDB instance is not used after
>>
>> Of course connect() and release() must be thread safe.
>> Would this code do what I want ?
>>
>>
>> Le 19/03/2013 09:41, Steven Kryskalla a écrit :
>> > On Tue, Mar 19, 2013 at 12:11 AM, Christophe Meessen
>> > <christophe@meessen.net>  wrote:
>> >> I prefer avoiding SQLAlchemy.
>> >>
>> >> How could I create my own pool of connections and use my existing
>> class to
>> >> do the operations on the database ? I have my own class encapsulating a
>> >> persistent connection and all the ops on my database.
>> >>
>> >> Later I might need to connect to other services through ssl connections
>> and a
>> >> pool would be very handy.
>> >>
>> >> I guess each process has its Flask app instance and these may be
>> executed
>> >> in parallel. What happens in one of these processes ? Are the requests
>> >> processed sequentially or are they executed in parallel by threads ?
>> > There isn't much benefit to having a pool of sqlite connections,
>> > because sqlite is just reading & writing to a file on disk.
>> >
>> > Yes, each process or thread will execute in parallel, that's why
>> > sqlite can't be used in a situation where there are multiple
>> > concurrent writers. Only one connection can write at a time, all
>> > others will be locked. A connection pool could actually make things
>> > worse, if you're re-using locked connections.
>> >
>> > http://sqlite.org/faq.html#q5
>> >
>> > If you're going to try using sqlite with multiple concurrent writers,
>> > you should recreate the connection on every request, add retry logic,
>> > and make the errors very visible when they occur. Doing this is
>> > extremely error prone, which is why people are recommending a real
>> > RDBMS. But if you have a low number of concurrent writes, and program
>> > it properly, sqlite can work.
>> >
>> > -Steve
>>
>>
>
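
The pool Christophe sketches could be filled in with only the standard
library. This is a minimal sketch under his naming (ShopDBPool comes from his
snippet; the internals here are an assumption, and a plain sqlite3 connection
stands in for the ShopDB wrapper):

```python
import queue
import sqlite3


class ShopDBPool:
    """Minimal thread-safe connection pool (sketch only).

    queue.Queue does the locking, so connect() and release() are
    thread safe; connect() blocks until a connection is free.
    """

    def __init__(self, path, size=4):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            # check_same_thread=False lets a pooled connection be
            # handed to whichever worker thread serves the request.
            self._pool.put(sqlite3.connect(path, check_same_thread=False))

    def connect(self):
        # Blocks if all connections are checked out.
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)
```

The before_request/teardown_request handlers would then call connect() and
release() exactly as in the snippet above.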

Re: [flask] Parallel execution with Sqlite access ?

From:
cbrueggenolte
Date:
2013-03-19 @ 09:18
Hello everyone,

following this discussion, I am asking myself if SQLite is still the right
tool to work with. And some of you are suggesting to use an RDBMS like
SQLAlchemy. I am new to Python, what RDBMS could you recommend?

Thanks.

@TOPIC:
In the past we had some problems with sqlite and an application which was
used by 10 users. It caused problems when people tried to save things at
the same time. So having a management layer is good.


 --
Carsten Brueggenolte
http://cbrueggenolte.de



2013/3/19 Steven Kryskalla <skryskalla@gmail.com>

> On Tue, Mar 19, 2013 at 12:11 AM, Christophe Meessen
> <christophe@meessen.net> wrote:
> > I prefer avoiding SQLAlchemy.
> >
> > How could I create my own pool of connections and use my existing class
> to
> > do the operations on the database ? I have my own class encapsulating a
> > persistent connection and all the ops on my database.
> >
> > Later I might need to connect to other services through ssl connections
> and a
> > pool would be very handy.
> >
> > I guess each process has its Flask app instance and these may be
> executed
> > in parallel. What happens in one of these processes ? Are the requests
> > processed sequentially or are they executed in parallel by threads ?
>
> There isn't much benefit to having a pool of sqlite connections,
> because sqlite is just reading & writing to a file on disk.
>
> Yes, each process or thread will execute in parallel, that's why
> sqlite can't be used in a situation where there are multiple
> concurrent writers. Only one connection can write at a time, all
> others will be locked. A connection pool could actually make things
> worse, if you're re-using locked connections.
>
> http://sqlite.org/faq.html#q5
>
> If you're going to try using sqlite with multiple concurrent writers,
> you should recreate the connection on every request, add retry logic,
> and make the errors very visible when they occur. Doing this is
> extremely error prone, which is why people are recommending a real
> RDBMS. But if you have a low number of concurrent writes, and program
> it properly, sqlite can work.
>
> -Steve
>
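
Steve's recipe (fresh connection per request, retry logic, loud errors) could
be sketched like this; the function name and back-off policy are assumptions,
not something from the thread:

```python
import sqlite3
import time


def execute_with_retry(path, sql, params=(), retries=5, delay=0.1):
    """Run one write, reconnecting and retrying while the db is locked."""
    for attempt in range(retries):
        # A fresh connection per attempt, as suggested above.
        conn = sqlite3.connect(path, timeout=1.0)
        try:
            with conn:  # commits on success, rolls back on error
                conn.execute(sql, params)
            return
        except sqlite3.OperationalError as exc:
            if "locked" not in str(exc) or attempt == retries - 1:
                raise  # surface the error loudly instead of dropping it
            time.sleep(delay * (attempt + 1))  # linear back-off
        finally:
            conn.close()
```

With few concurrent writers the first attempt almost always succeeds; the
retries only matter when two requests race for the write lock.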

Re: [flask] Parallel execution with Sqlite access ?

From:
Philip Goh
Date:
2013-03-19 @ 10:21
On 19 March 2013 09:18, cbrueggenolte <cbrueggenolte@gmail.com> wrote:

> Hello everyone,
>
> following this discussion, I am asking myself if SQLite is still the
> right tool to work with. And some of you are suggesting to use an RDBMS like
> SQLAlchemy. I am new to Python, what RDBMS could you recommend?
>

SQLAlchemy is an ORM that makes working with the database easier in that it
allows you to work with objects instead of dealing with SQL directly
(though that option is still available). An RDBMS would be a database like
PostgreSQL or MySQL. I'm personally working with PostgreSQL and find it a
capable database.

@TOPIC:
> In the past we had some problems with sqlite and an application which was
> used by 10 users. It caused problems when people tried to save things at
> the same time. So having a management layer is good.
>

It sounds like you hit the limit with SQLite. A decent database will handle
concurrent writes for you.

Kind regards,
Phil

Re: [flask] Parallel execution with Sqlite access ?

From:
Markus Unterwaditzer
Date:
2013-03-19 @ 09:41
cbrueggenolte <cbrueggenolte@gmail.com> wrote:

>Hello everyone,
>
>following this discussion, I am asking myself if SQLite is still the
>right tool to work with. And some of you are suggesting to use an RDBMS
>like SQLAlchemy. I am new to Python, what RDBMS could you recommend?

The ones I know are SQLAlchemy and peewee. Both have Flask extensions that
are supposed to make things easier.

-- Markus (from phone)

Re: [flask] Parallel execution with Sqlite access ?

From:
Jose Ayerdis
Date:
2013-03-19 @ 10:30
First, neither SQLAlchemy nor peewee is an RDBMS. That being said, personally
I think *SQLAlchemy* is a good tool; I have tested it in other frameworks, and
the plugin is very simple: basically it's a wrapper that exposes some of the
toolkit's functionality and makes your life easier.

Sincerely yours,

[Jose Luis Ayerdis Espinoza]
Necronet.info |
LinkedIn<http://www.linkedin.com/pub/jose-luis-ayerdis-espinoza/28/7b4/704>|
Careers
StackOverflow <http://careers.stackoverflow.com/necronet>


2013/3/19 Markus Unterwaditzer <markus@unterwaditzer.net>

> The ones i know are sqlalcheny and peewee. Both have Flask extensions that
> are supposed to make stuff easier.
>
> -- Markus (from phone)
>

Re: [flask] Parallel execution with Sqlite access ?

From:
Markus Unterwaditzer
Date:
2013-03-19 @ 10:50
Jose Ayerdis <joseayerdis@gmail.com> wrote:

>First neither sqlalchemy nor peewee are RDBMS, that been said
>personally I
>think *SQLALchemy* is a good tool, tested in other framework and the
>plugin
>is very simple basically it's a wrapper exposing and making your life
>easier some of the Toolkit functionality.

Yeah sorry. I wrote that with ORMs in mind.

-- Markus (from phone)

Re: [flask] Parallel execution with Sqlite access ?

From:
Jose Ayerdis
Date:
2013-03-19 @ 11:09
Don't connect and disconnect per request; in general, connecting is an
expensive operation. Try to keep the connection alive for the lifetime of
the application.

http://www.youtube.com/watch?v=KOvgfbBFZxk&t=13m10s
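
One stdlib way to express "connect once per worker, not per request" is to
cache the connection on the thread. This sketch uses sqlite3 only as a
stand-in for whatever driver you actually use (e.g. psycopg2 for PostgreSQL);
the helper name is hypothetical:

```python
import sqlite3  # stand-in; with PostgreSQL you would use its driver here
import threading

_local = threading.local()


def get_connection(dsn):
    """Return this worker thread's long-lived connection.

    The expensive connect happens once per thread, not once per
    request; subsequent calls reuse the cached connection.
    """
    conn = getattr(_local, "conn", None)
    if conn is None:
        conn = sqlite3.connect(dsn)
        _local.conn = conn
    return conn
```

Request handlers then call get_connection() instead of opening and closing a
connection in before_request/teardown_request.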



Sincerely yours,

[Jose Luis Ayerdis Espinoza]
Necronet.info |
LinkedIn<http://www.linkedin.com/pub/jose-luis-ayerdis-espinoza/28/7b4/704>|
Careers
StackOverflow <http://careers.stackoverflow.com/necronet>


2013/3/19 Markus Unterwaditzer <markus@unterwaditzer.net>

> Yeah sorry. I wrote that with ORMs in my mind.
>
> -- Markus (from phone)
>

[flask] Parallel execution with Sqlite access ?

From:
Qing Yan
Date:
2013-03-19 @ 07:48
Hello Christophe,
I think sqlite is just a simple db which is mainly used for a single
connection. If you'd like to use multiple connections, MySQL should be
better.
Kevin
在 2013年3月19日星期二,Christophe Meessen 写道:

> I prefer avoiding SQLAlchemy.
>
> How could I create my own pool of connections and use my existing class to
> do the operations on the database ? I have my own class encapsulating a
> persistent connection and all the ops on my database.
>
> Later I might need to connect to other services through ssl connections and
> a pool would be very handy.
>
> I guess each process has its Flask app instance and these may be executed
> in parallel. What happens in one of these processes ? Are the requests
> processed sequentially or are they executed in parallel by threads ?
>
>
> --
> Ch.Meessen
>
> Le 18 mars 2013 à 18:21, Juan-Pablo Scaletti <juanpablo@jpscaletti.com> a
> écrit :
>
> Yes, production servers process requests in parallel. You want this, or
> your application would be unbearably slow with many connected users.
>
> As has been said, you must use another database, like PostgreSQL. The
> usual approach is to have a pool of connections to reuse, but don't worry
> about that; use an ORM like SQLAlchemy to manage those details for you
> automatically.
>
> JP
>
> On 18/03/2013, at 10:01 a.m., Christophe Meessen <christophe@meessen.net>
> wrote:
>
> I use nginx with uWSGI, for which I see multiple processes.
> Previously I used the Flask server and never saw such problems. So it
> could be that POST messages are processed in parallel and connecting to the
> db or performing ops on it fails.
> Indeed moving to PostgreSQL could solve this problem.
>
> What is the overhead of establishing a db connection on each request ?
> Couldn't this be optimized ?
>
>
> --
> Ch.Meessen
>
> Le 18 mars 2013 à 11:30, Philip Goh <philip.wj.goh@gmail.com> a écrit :
>
>
> May multiple POST requests send to Flask be executed by different
>> threads ?
>
> How are you running Flask? If it's just running as the development server,
> it's going to be running in a single thread and so you shouldn't have
> issues with concurrent access. If you're running on multiple
> threads/processes, then there is no guarantee that requests will be handled
> by the same thread.
>
> What about sqlite database access ?
>> Are these concurrent competing database operations ? If the db connect
>> could fail for some requests it could explain why some operations are
>> dropped.
>> Should I use another database that properly handles concurrent access
>> and operations ?
>>
>
> Use an appropriate database. SQLite is brilliant when you can guarantee
> that you'll only ever have one process writing to the database. It sounds
> like you're after concurrent writes, so switching to a different database
> (PostgreSQL gets my vote) will be a lot easier than writing your own
> middleware in an attempt to pipeline writes.
>
> Cheers,
> Phil
>
>