I have a relatively extensive SQLite database that I'd like to import into my Google App Engine Python app.
I've created my models using the App Engine API; they are close to, but not quite identical to, the existing schema. I've written an import script that loads the data from SQLite and creates/saves new App Engine objects, but the App Engine environment blocks me from accessing the sqlite library. The script is only meant to run on my local App Engine instance, and from there I hope to push the data to Google.
Am I approaching this problem the wrong way, or is there a way to import the sqlite library while running in the local instance's environment?
If you need to access your datastore outside of the App Engine environment (for example, to use libraries that aren't available in App Engine, or to do other things with the datastore that App Engine doesn't support), then the best option is the Remote API.
There is an excellent tutorial on that here: http://code.google.com/appengine/articles/remote_api.html
Essentially, you import the remote_api module, authenticate with Google to get access to your datastore, then run your data access commands (query, update, delete, etc.) exactly as you would inside App Engine.
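Here's a minimal sketch of what that setup can look like with the Python 2 SDK's remote_api_stub; the app id, the remote_api path, and the Greeting model are placeholders you'd replace with your own:

```python
# Minimal remote_api setup sketch (Python 2 App Engine SDK assumed).
# 'your-app-id' and the Greeting model are placeholders.
import getpass

from google.appengine.ext import db
from google.appengine.ext.remote_api import remote_api_stub


class Greeting(db.Model):
    content = db.StringProperty()


def auth_func():
    # Prompt for the Google account credentials used to reach the datastore.
    return raw_input('Email: '), getpass.getpass('Password: ')


remote_api_stub.ConfigureRemoteApi(
    None,                  # app id is read back from the server
    '/_ah/remote_api',     # path where remote_api is mapped in app.yaml
    auth_func,
    'your-app-id.appspot.com')

# From here on, datastore calls behave as they would inside App Engine.
Greeting(content='imported row').put()
print Greeting.all().count()
```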
I have not had any trouble importing pysqlite2, reading data, then transforming it and writing it to App Engine using the remote_api.
What error are you seeing?
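For illustration, a sketch of that read-transform-write loop, assuming remote_api is already configured (as in the other answer); the SQLite table, columns, and the Person model are hypothetical stand-ins for your own schema:

```python
# Read rows from SQLite, transform them into datastore entities, and
# write them in batches over the remote API. Table/columns and the
# Person model are placeholders.
from pysqlite2 import dbapi2 as sqlite3  # plain `import sqlite3` also works on Python 2.5+

from google.appengine.ext import db


class Person(db.Model):
    name = db.StringProperty()
    email = db.EmailProperty()


conn = sqlite3.connect('legacy.db')
rows = conn.execute('SELECT name, email FROM people')

batch = []
for name, email in rows:
    batch.append(Person(name=name, email=email))
    if len(batch) >= 100:
        db.put(batch)   # batched puts are much faster over the remote API
        batch = []
if batch:
    db.put(batch)
conn.close()
```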
According to Google, you're doing it backwards: the app should be pulling the data from your side, where you have more flexibility in converting it to the new model anyway.
I would generate suitable CSV files from the SQLite data in a separate script, then use the bulk loader to push the data from the CSV files up to App Engine.
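The export step can be as simple as the sketch below; the table, columns, and file names are hypothetical. The resulting CSV can then be uploaded with App Engine's bulk loading tools and a matching loader definition.

```python
# Dump a SQLite table to a CSV file for bulk loading.
# Table, columns, and file names are placeholders.
import csv
import sqlite3

conn = sqlite3.connect('legacy.db')
rows = conn.execute('SELECT name, email FROM people')

with open('people.csv', 'wb') as out:   # 'wb' for the csv module on Python 2
    writer = csv.writer(out)
    for row in rows:
        writer.writerow(row)

conn.close()
```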