blueprint state and datastores
- Struggling Developer
- 2012-11-21 @ 18:23
I am trying to figure out the most elegant and efficient way of
creating a modular blueprint (i.e., one that can be used with
different apps) that supports different datastores (e.g., SQLAlchemy,
MongoEngine, etc.). The blueprint itself will provide a default
datastore implementation, in this case using SQLAlchemy, but that is
not relevant to my question.
The first problem is that in order to define models in the blueprint,
one would need access to the db object defined in the app. If
modularity is not an issue, the usual solution is to do
"from app import db" inside myblueprint/models.py. In my case, I do
not have a models.py anymore since I am following a datastore pattern.
Instead I have a myblueprint/datastore.py with a function called
sqlalchemy_datastore(db) that returns a datastore object, which I
hope to later use in my views.
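To make that concrete, here is a sketch of what I mean by
myblueprint/datastore.py. Everything in it is illustrative
(find_post_by_email mirrors the view code further down; the model
declaration is commented out so the sketch stays dependency-free):

```python
# myblueprint/datastore.py -- a sketch, not the actual module.

class SQLAlchemyDatastore:
    """Adapts a Flask-SQLAlchemy ``db`` object to the query interface
    the blueprint's views will use."""

    def __init__(self, db):
        self.db = db
        # In the real blueprint the models would be declared here
        # against db.Model, for example:
        #
        #     class Post(db.Model):
        #         id = db.Column(db.Integer, primary_key=True)
        #         email = db.Column(db.String(120), index=True)
        #
        #     self.Post = Post

    def find_post_by_email(self, email):
        # Assumes a Post model declared as sketched in __init__.
        return self.db.session.query(self.Post).filter_by(email=email).first()


def sqlalchemy_datastore(db):
    """The factory the registration code below would call."""
    return SQLAlchemyDatastore(db)
```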
The most natural way of getting this object instantiated in the first
place is to have it be a parameter received by the blueprint when it
is registered with the application:
app = Flask(__name__)
db = SQLAlchemy(app)
from myblueprint import module
from myblueprint.datastore import sqlalchemy_datastore
app.register_blueprint(module, datastore=sqlalchemy_datastore(db))
Then, in the blueprint's record callback, I would check that this was
the case:
if 'datastore' not in state.options:
    raise Exception('When registering this blueprint you must '
                    'provide a datastore.')
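For reference, here is a minimal sketch of how that check sits inside
the record callback. The 'myblueprint_datastore' extensions key is my
own naming, not an established convention:

```python
# myblueprint/__init__.py -- sketch of the record callback that
# validates the registration options.
from flask import Blueprint

module = Blueprint('myblueprint', __name__)

@module.record
def init_blueprint(state):
    # state.options holds the extra keyword arguments passed to
    # app.register_blueprint().
    if 'datastore' not in state.options:
        raise Exception('When registering this blueprint you must '
                        'provide a datastore.')
    # Stash the datastore somewhere app-scoped for later use.
    state.app.extensions['myblueprint_datastore'] = state.options['datastore']
```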
Now, the part that I can't figure out is the best way to access my
datastore inside the views. Ideally, I would like the code in my
views to look like this:
post = datastore.find_post_by_email(email)
# do more stuff with the post object
All the solutions I came up with seem a bit hacky (e.g., modifying
the app object in the record function), and I would like to ask the
Flask community for suggestions.
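For concreteness, this is roughly what the "modify the app object in
the record function" variant looks like: the record callback stores
the datastore on the app, and a LocalProxy lets views use it as a
plain name. All identifiers here are illustrative:

```python
# Sketch of the record-function variant: stash the datastore on the
# app at registration time, read it back through a LocalProxy in views.
from flask import Blueprint, current_app
from werkzeug.local import LocalProxy

module = Blueprint('myblueprint', __name__)

# The proxy resolves against whichever app is handling the current
# request, so the blueprint stays modular.
datastore = LocalProxy(
    lambda: current_app.extensions['myblueprint_datastore'])

@module.record
def configure(state):
    # 'myblueprint_datastore' is my own key, not a Flask convention.
    state.app.extensions['myblueprint_datastore'] = state.options['datastore']

@module.route('/posts/<email>')
def posts(email):
    # This reads exactly like the ideal view code above.
    post = datastore.find_post_by_email(email)
    return repr(post)
```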