python - SqlAlchemy/Sqlite: InterfaceError - is there a data size limit?
I am trying to store an (admittedly large) blob in an SQLite database using SQLAlchemy.
For an MCVE I use ubuntu-14.04.2-desktop-amd64.iso as the blob I want to store. Its size:
    $ ls -lh ubuntu-14.04.2-desktop-amd64.iso
    ... 996M ... ubuntu-14.04.2-desktop-amd64.iso

The code:
    from pathlib import Path
    from sqlalchemy import (Column, Integer, String, BLOB, create_engine)
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker
    from sqlite3 import dbapi2 as sqlite

    sa_base = declarative_base()


    class DbPath(sa_base):
        __tablename__ = 'file'
        pk_path = Column(Integer, primary_key=True)
        path = Column(String)
        data = Column(BLOB, default=None)


    def create_session(db_path):
        db_url = 'sqlite+pysqlite:///{}'.format(db_path)
        engine = create_engine(db_url, module=sqlite)
        sa_base.metadata.create_all(engine)
        session = sessionmaker(bind=engine)
        return session()


    if __name__ == '__main__':
        pth = Path('/home/user/downloads/iso/ubuntu-14.04.2-desktop-amd64.iso')
        with pth.open('rb') as file_pointer:
            iso_data = file_pointer.read()
        db_pth = DbPath(path=str(pth), data=iso_data)
        db_session = create_session('test.sqlite')
        db_session.add(db_pth)
        db_session.commit()

Running this raises the following error:
    InterfaceError: (InterfaceError) Error binding parameter 1 - unsupported type.
    'INSERT INTO file (path, data) VALUES (?, ?)'
    ('/home/user/downloads/iso/ubuntu-14.04.2-desktop-amd64.iso', <memory at 0x7faf37cc18e0>)

I looked at the SQLite limitations and found nothing that should prevent me from doing this. Does SQLAlchemy have a limitation of its own?
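To narrow down whether the limitation sits in SQLAlchemy or in the underlying driver, here is a minimal sketch that attempts the same insert through the sqlite3 DB-API directly; the file path and the hand-written table layout just mirror the example above, so they are illustrative:

    import sqlite3

    # Try the same insert through the sqlite3 DB-API directly, bypassing
    # SQLAlchemy, to see whether the driver itself rejects a parameter of
    # this size. Path and table layout mirror the example above.
    iso_path = '/home/user/downloads/iso/ubuntu-14.04.2-desktop-amd64.iso'
    with open(iso_path, 'rb') as file_pointer:
        iso_data = file_pointer.read()

    conn = sqlite3.connect('raw_test.sqlite')
    conn.execute('CREATE TABLE IF NOT EXISTS file '
                 '(pk_path INTEGER PRIMARY KEY, path TEXT, data BLOB)')
    conn.execute('INSERT INTO file (path, data) VALUES (?, ?)',
                 (iso_path, sqlite3.Binary(iso_data)))
    conn.commit()
    conn.close()

If this raises the same InterfaceError, the limit would seem to sit in the driver or in SQLite itself rather than in SQLAlchemy.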
Everything works fine with this file:
    $ ls -lh ubuntu-14.04.2-server-amd64.iso
    ... 595M ... ubuntu-14.04.2-server-amd64.iso

Is there a data size limit? Or do I have to do this differently once the file size surpasses some limit (and where would that limit be)?
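Since ls -lh rounds to mebibytes, here is a small sketch to get the exact byte counts of the two files for comparison against any suspected limit; the directory is the one assumed in the example above:

    from pathlib import Path

    # Print the exact size in bytes of both ISOs; ls -lh rounds, and a
    # byte-exact number is what matters when comparing against a limit.
    # The directory is the one assumed in the example above.
    iso_dir = Path('/home/user/downloads/iso')
    for name in ('ubuntu-14.04.2-desktop-amd64.iso',
                 'ubuntu-14.04.2-server-amd64.iso'):
        print(name, (iso_dir / name).stat().st_size)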
And whatever the answer regarding the limit is: I am interested in how I can store files of this size in SQLite using SQLAlchemy.
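In case single blobs of this size are simply not possible, one direction I could imagine is splitting the data across several rows. A rough sketch of what I mean follows; the names CHUNK_SIZE, DbFile, DbChunk and store_file, as well as the chunk size itself, are purely illustrative and not part of the code above:

    from sqlalchemy import (Column, Integer, String, BLOB, ForeignKey,
                            create_engine)
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import sessionmaker, relationship

    # Rough sketch of a possible workaround: split the blob into fixed-size
    # chunks stored as separate rows, so no single blob gets too large.
    CHUNK_SIZE = 256 * 1024 * 1024  # 256 MiB per row, chosen arbitrarily

    Base = declarative_base()


    class DbFile(Base):
        __tablename__ = 'file'
        pk_path = Column(Integer, primary_key=True)
        path = Column(String)
        chunks = relationship('DbChunk', order_by='DbChunk.seq')


    class DbChunk(Base):
        __tablename__ = 'chunk'
        pk_chunk = Column(Integer, primary_key=True)
        fk_file = Column(Integer, ForeignKey('file.pk_path'))
        seq = Column(Integer)  # position of the chunk within the file
        data = Column(BLOB)


    def store_file(session, path):
        # Read the file in CHUNK_SIZE pieces and store each piece as its
        # own row, linked to one DbFile record.
        db_file = DbFile(path=str(path))
        with open(path, 'rb') as fp:
            seq = 0
            while True:
                piece = fp.read(CHUNK_SIZE)
                if not piece:
                    break
                db_file.chunks.append(DbChunk(seq=seq, data=piece))
                seq += 1
        session.add(db_file)
        session.commit()

Is something like this necessary, or is there a way to store the file in one piece?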