This script parses the SQL files exported by `sqlite3 .dump` and makes them compatible with MySQL import.
```
sqlite3 <database_file> .dump > dump.sql
python parse_sqlite_sql.py dump.sql
```
Two files will be generated: `dump.sql.schema.sql` and `dump.sql.data.sql`. One holds the DB schema and the other the DB data, both adjusted for MySQL import.
After a final manual review, the following commands import the database:
```
mysql -u <user_name> -p <database_name> --default-character-set=utf8 < dump.sql.schema.sql
mysql -u <user_name> -p <database_name> --default-character-set=utf8 < dump.sql.data.sql
```
It is strongly advised that you further modify the DB schema for your own purposes, especially:

- Replace some `text` fields with `varchar(255)` for better performance
- Replace some `integer` fields with `bigint`
- Add backquotes around tables named after reserved keywords
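As a sketch of what those manual schema tweaks amount to, the generated schema file could be post-processed like this (the table name `order` and the blanket type substitutions are purely illustrative; real schemas need case-by-case judgment):

```python
import re

def tune_schema(sql: str) -> str:
    """Illustrative post-processing of the generated schema:
    swap `text` for `varchar(255)`, `integer` for `bigint`, and
    backquote a table named after a reserved keyword ("order"
    here is only an example)."""
    sql = re.sub(r"\btext\b", "varchar(255)", sql)
    sql = re.sub(r"\binteger\b", "bigint", sql)
    sql = re.sub(r"\bCREATE TABLE order\b", "CREATE TABLE `order`", sql)
    return sql

print(tune_schema("CREATE TABLE order (id integer, note text);"))
# CREATE TABLE `order` (id bigint, note varchar(255));
```

A blind regex pass like this would also rewrite type names occurring inside string data, which is exactly why the script's literal-aware parsing (described below in this README) matters.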
Note also that this script replaces every value of `t` with `1` and every value of `f` with `0`, to adapt to the boolean field change. If you really need a literal `t` there, you will have to change it back manually.
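A minimal sketch of that boolean rewrite (simplified, not the script's actual implementation: here only standalone quoted `'t'`/`'f'` values in an INSERT line are touched, so longer strings survive):

```python
import re

def fix_booleans(line: str) -> str:
    """Replace standalone 't'/'f' literals with 1/0 in an INSERT line.
    The lookbehind/lookahead require the literal to sit between value
    delimiters, so a string such as 'left' is left intact."""
    line = re.sub(r"(?<=[,(])'t'(?=[,)])", "1", line)
    line = re.sub(r"(?<=[,(])'f'(?=[,)])", "0", line)
    return line

print(fix_booleans("INSERT INTO users VALUES(1,'t','f','left');"))
# INSERT INTO users VALUES(1,1,0,'left');
```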
Unlike most other line-based parsers, this parser treats literal strings and non-literal strings separately. So even if your table data contains special statements like `CREATE TABLE`, `INSERT VALUE` or `AUTOINCREMENT`, they will not be modified.
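The idea can be illustrated with a tiny quote-aware scanner (a simplified sketch, not the script's actual code, and it ignores escaped quotes): replacements are applied only to text outside single-quoted SQL literals.

```python
def replace_outside_literals(line: str, old: str, new: str) -> str:
    """Apply a replacement only to the parts of `line` that are not
    inside single-quoted SQL string literals. Simplified: does not
    handle the '' escape for a quote inside a literal."""
    parts = line.split("'")
    # Even-indexed chunks are outside literals, odd-indexed are inside.
    for i in range(0, len(parts), 2):
        parts[i] = parts[i].replace(old, new)
    return "'".join(parts)

sql = "INSERT INTO t VALUES('AUTOINCREMENT'); -- AUTOINCREMENT"
print(replace_outside_literals(sql, "AUTOINCREMENT", "AUTO_INCREMENT"))
# INSERT INTO t VALUES('AUTOINCREMENT'); -- AUTO_INCREMENT
```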
It is very slow: parsing a SQL file of 100,000 lines takes about 2 seconds, while other Perl- or Python-based scripts can do it in under 0.1 second.
The following methods are the most likely targets for further customization:

- `process_literal`
- `process_schema`