python - How do I lazily pass csv rows to executemany()?


I'm using MySQL Connector/Python 1.0.11 with Python 3.3 and MySQL 5.6.12 to load a CSV file into a table via cursor.executemany('insert ... values'). With sqlite3 or psycopg2 I can pass a _csv.reader object as seq_of_parameters, but MySQL Connector/Python fails with:

mysql.connector.errors.ProgrammingError: Parameters for query must be list or tuple.
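For reference, here is a minimal sketch of the kind of call that triggers the error. The connection settings, table name (mytable) and columns (a, b, c) are hypothetical placeholders, not from the original question:

    import csv
    import mysql.connector

    # Hypothetical connection settings -- adjust to your own database.
    conn = mysql.connector.connect(user='user', password='pw', database='test')
    cur = conn.cursor()

    with open('data.csv', newline='') as f:
        reader = csv.reader(f)
        # Passing the reader directly raises ProgrammingError: executemany()
        # wants a list or tuple, and _csv.reader is only an iterator.
        cur.executemany('INSERT INTO mytable (a, b, c) VALUES (%s, %s, %s)', reader)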

Fair enough: the docs say it must be a sequence, and a _csv.reader object is an iterator (it defines __iter__ and __next__), not a sequence. I could pass list(my_csv_reader), but I'm pretty sure that isn't lazy, and these files can have 10^6+ rows. Is there a way to pass the rows lazily? Or am I wasting my time because executemany() will expand the list before performing the insert anyway? (A hint from cursor.py: "INSERT statements are optimized for batching data, using the MySQL multiple rows syntax.") Maybe wrapping the iterator in a generator, à la http://www.logarithmic.net/pfh/blog/01193268742 ? I'm looking forward to your thoughts!
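One possible workaround, sketched under the same assumptions as above (hypothetical table mytable, columns a/b/c, placeholder connection settings, and a hand-rolled chunks() helper): batch the reader into fixed-size lists with itertools.islice and call executemany() once per batch. Each call satisfies the list-or-tuple requirement, but only one batch of rows is ever held in memory:

    import csv
    import itertools
    import mysql.connector

    def chunks(iterable, size):
        """Yield successive lists of at most `size` items from an iterator."""
        it = iter(iterable)
        while True:
            chunk = list(itertools.islice(it, size))
            if not chunk:
                return
            yield chunk

    # Hypothetical connection settings -- adjust to your own database.
    conn = mysql.connector.connect(user='user', password='pw', database='test')
    cur = conn.cursor()

    with open('data.csv', newline='') as f:
        reader = csv.reader(f)
        for batch in chunks(reader, 1000):
            # Each batch is a real list, so executemany() accepts it, and the
            # connector can still use its multi-row INSERT optimization per call.
            cur.executemany('INSERT INTO mytable (a, b, c) VALUES (%s, %s, %s)', batch)

    conn.commit()
    cur.close()
    conn.close()

The batch size of 1000 is arbitrary; tune it against your row width and max_allowed_packet.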

