python - read data in chunks and insert the chunks into SQLite using a transaction -


My issue refers to an old post about importing a data set into SQLite in chunks using a transaction (SQLite transaction for CSV importing):

import csv, sqlite3, time

def chunks(data, rows=10000):
    for i in range(0, len(data), rows):
        yield data[i:i+rows]

if __name__ == "__main__":
    t = time.time()

    con = sqlite3.connect('test.db')
    cur = con.cursor()
    cur.execute("drop table if exists sensor;")
    cur.execute("create table sensor(key int, reading real);")

    filename = 'dummy.csv'
    reader = csv.reader(open(filename, "r"))
    divdata = chunks(list(reader))

    for chunk in divdata:
        cur.execute('begin transaction')
        for col1, col2 in chunk:
            cur.execute('insert into sensor (key, reading) values (?, ?)', (col1, col2))
        con.commit()

Since csv.reader reads the whole file into memory and the file only gets chopped up afterwards by the chunks function, I am looking for a solution that reads the file in chunks (of, say, 10k rows) and inserts each chunk into the SQLite table as above, until the whole file has been inserted. How do I modify the above?

divdata = chunks(list(reader)) 

list(reader) will iterate through the entire CSV file and store the results in a list. You don't want that, because the CSV file is huge.
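If you do want to keep the question's chunk-at-a-time structure, one option (a sketch, not the only way) is to pull a bounded number of rows from the reader with itertools.islice and hand each batch to executemany. This assumes the sensor table from the question already exists and that dummy.csv has exactly two columns; the 10000-row batch size is carried over from the question:

import csv
import sqlite3
from itertools import islice

con = sqlite3.connect('test.db')
cur = con.cursor()

with open('dummy.csv', newline='') as f:
    reader = csv.reader(f)
    while True:
        batch = list(islice(reader, 10000))   # at most 10000 rows in memory at a time
        if not batch:
            break
        cur.executemany('insert into sensor (key, reading) values (?, ?)', batch)
        con.commit()                           # one transaction per batch

con.close()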

Also, you don't want to run cur.execute("begin transaction;"); the sqlite3 module does that for you.
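To illustrate what the module does for you: with the default connection settings, sqlite3 opens a transaction implicitly before the first insert and keeps it open until you call con.commit(). A small self-contained check, using a throwaway in-memory database and the connection's in_transaction flag:

import sqlite3

con = sqlite3.connect(':memory:')            # throwaway database just for this demo
cur = con.cursor()
cur.execute('create table sensor(key int, reading real)')

cur.execute('insert into sensor (key, reading) values (?, ?)', (1, 0.5))
print(con.in_transaction)   # True: sqlite3 opened a transaction implicitly before the insert
con.commit()                # this commit ends the implicit transaction
print(con.in_transaction)   # False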

Keep a counter while you're iterating through the csv reader. Check the counter and commit whenever it reaches a multiple of your chunk size.

for counter, row in enumerate(reader):
    if counter % chunk_size == 0:
        con.commit()          # commit() lives on the connection, not the cursor
    cur.execute("insert ...")

con.commit()

(Note: I don't think it's a good idea to chunk transactions like this. Maybe it helps, but you'd have to profile it first.)
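Putting the counter idea together with the question's table, a minimal end-to-end sketch could look like this. The two-column layout of dummy.csv and the 10000-row chunk size are assumptions carried over from the question:

import csv
import sqlite3

chunk_size = 10000

con = sqlite3.connect('test.db')
cur = con.cursor()
cur.execute("drop table if exists sensor;")
cur.execute("create table sensor(key int, reading real);")

with open('dummy.csv', newline='') as f:
    reader = csv.reader(f)
    for counter, (col1, col2) in enumerate(reader):
        # commit every chunk_size rows; on the very first row this is a no-op,
        # and sqlite3 starts the next transaction implicitly on the following insert
        if counter % chunk_size == 0:
            con.commit()
        cur.execute('insert into sensor (key, reading) values (?, ?)', (col1, col2))

con.commit()   # commit the final partial chunk
con.close()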

