So I saved a copy of the thing as a plain text file with the MySQL tags trimmed off and tried a few options. A posting on the UK Python list reminded me that the csv module in Python can take arbitrary characters as delimiters, not just commas, which sounded useful; after all, I couldn't just split it at the commas, because the contents are basically a lot of tuples, like this:
(data, data, data),(data, data, data)

And I need them in groups.
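To see why a straight split won't do, here's roughly what splitting on commas does to a row of that shape (the values are made up for illustration):

>>> row = "(1, 'Heathrow', 'LHR'),(2, 'Gatwick', 'LGW')"
>>> row.split(',')
["(1", " 'Heathrow'", " 'LHR')", "(2", " 'Gatwick'", " 'LGW')"]

It splits inside each tuple as well as between them, so the record boundaries are lost.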
I thought I was being clever when I did a global find/replace, swapping every ),( for \t, since the csv module doesn't support multiple characters as delimiters (there's a sketch of doing the same replacement in Python below the listing); then I wrote this script:
#!/usr/bin/env python
import csv
import sqlite3

f = open('/home/yorksranter/Documents/Geekery/airports.txt')
csv.field_size_limit(100000)  # see below!
input = csv.reader(f, delimiter='\t')
conn = sqlite3.connect('/home/yorksranter/Desktop/airport.sql')
curse = conn.cursor()
curse.execute('''CREATE TABLE airports ('id', 'country', 'latitude', 'longitude', 'name', 'timezone', 'shortname')''')
for item in input:
    curse.execute('INSERT INTO airports VALUES (%s)' % item)
item should be a tuple of seven values, in the same order they were in the original DB, so this ought to recreate the data in an SQLite 3 file.
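Incidentally, the ),( swap needn't be done by hand in an editor; a couple of lines of Python would do the same job. A rough sketch, with the raw-dump filename made up for the example:

raw = open('/home/yorksranter/Documents/Geekery/airports_raw.txt').read()
# swap the record separator for a tab and trim the outermost brackets
cleaned = raw.strip().lstrip('(').rstrip(');').replace('),(', '\t')
open('/home/yorksranter/Documents/Geekery/airports.txt', 'w').write(cleaned)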
Then my problems began; I got the following error message:
_csv.error: field larger than field limit

Google found me this and this; as far as I can make out, the fields in the dump are bigger than the csv module's default limit, but going by the module source code there does seem to be a way of altering it.
Thoughts?

Update: There is: csv.field_size_limit(). I altered it until the thing ran properly; but there's still no data in the db!
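One thing that might be worth checking (a sketch, not tested against the real dump): sqlite3 doesn't actually write anything to disk until you commit, and it's happier with its own ? placeholders than with % string formatting. Assuming item really does come back as seven values, the end of the script could be rewritten like this:

for item in input:
    # one ? per column; sqlite3 fills them in and does the quoting itself
    curse.execute('INSERT INTO airports VALUES (?, ?, ?, ?, ?, ?, ?)', item)
conn.commit()  # nothing is saved until this runs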