A project I’m working on stores JSON objects in Postgres using its native JSON datatype. For frontend development we’d rather not run Postgres locally, but some components still require a database to exist. To deal with this, we’re using SQLite for any local development that doesn’t modify the schema.
The problem with this setup is that SQLite doesn’t natively support the
JSON datatype. It does have the json1 extension, but SQLAlchemy’s
types.JSON doesn’t work with it. To solve this, I’ve decided to simply
serialize the JSON and store it as TEXT when SQLite is used.
The following code snippet shows how this works.
```python
import json

import sqlalchemy.types as types


class StringyJSON(types.TypeDecorator):
    """Stores and retrieves JSON as TEXT."""

    impl = types.TEXT

    def process_bind_param(self, value, dialect):
        if value is not None:
            value = json.dumps(value)
        return value

    def process_result_value(self, value, dialect):
        if value is not None:
            value = json.loads(value)
        return value


# TypeEngine.with_variant says "use StringyJSON instead when
# connecting to 'sqlite'"
MagicJSON = types.JSON().with_variant(StringyJSON, 'sqlite')
```
MagicJSON is the datatype you specify when defining your
columns. It’s important to note that this new type is immutable, but that is
fine for my specific use case. For more information, see SQLAlchemy’s Custom
Types documentation; it also includes instructions on how to make the type
mutable.
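As a minimal sketch of what using it looks like in practice (the Document model and its payload column are made up for illustration), a column declared with MagicJSON round-trips Python dicts through an in-memory SQLite database:

```python
import json

import sqlalchemy.types as types
from sqlalchemy import Column, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base


class StringyJSON(types.TypeDecorator):
    """Stores and retrieves JSON as TEXT (as defined above)."""

    impl = types.TEXT
    cache_ok = True  # expected by SQLAlchemy 1.4+ for custom types

    def process_bind_param(self, value, dialect):
        return json.dumps(value) if value is not None else None

    def process_result_value(self, value, dialect):
        return json.loads(value) if value is not None else None


MagicJSON = types.JSON().with_variant(StringyJSON, "sqlite")

Base = declarative_base()


class Document(Base):  # hypothetical model, just for this sketch
    __tablename__ = "documents"
    id = Column(Integer, primary_key=True)
    payload = Column(MagicJSON)


engine = create_engine("sqlite://")  # in-memory SQLite
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Document(payload={"title": "hello", "tags": ["a", "b"]}))
    session.commit()

with Session(engine) as session:
    fetched = session.get(Document, 1)
    # payload comes back as a plain dict, not a JSON string
```

Note the immutability caveat above: mutating fetched.payload in place won’t be detected by the session; you’d need SQLAlchemy’s mutation-tracking extension for that.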
Some nice future work would be to create an
sqlite.JSON type that
implements SQLite’s JSON support from json1 and works with SQLAlchemy’s types.JSON.
Hopefully I’ll have some time to look into this over the coming weeks.
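In the meantime, because StringyJSON stores valid JSON text, json1’s functions can already query those columns. A quick sketch with the sqlite3 stdlib module (this assumes your SQLite build has json1 compiled in, which is the default in modern builds; the table and keys are made up):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute(
    "INSERT INTO documents (payload) VALUES (?)",
    (json.dumps({"title": "hello", "tags": ["a", "b"]}),),
)

# json_extract is a json1 function; it reads a value out of the stored text
title = conn.execute(
    "SELECT json_extract(payload, '$.title') FROM documents"
).fetchone()[0]
# title == "hello"
conn.close()
```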