Question: Why can't I open the database?
Info: I'm working on a project using sqlite3
database. I wrote a test program that runs and passes, given the database location:
/tmp/cer/could.db
The unit test program can make the db
without any problem. But when I actually use the program, passing the same location to it, I get the error below:
OperationalError: unable to open database file
I've tried doing it with:
1) an empty database.
2) the database the unit test left behind.
3) no database at all.
In all three cases, I got the above error. The most frustrating part has to be the fact that the unit test
can do it just fine, but the actual program can't.
Any clues as to what on earth is going on?
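For what it's worth, one common cause of this exact error is that the parent directory of the database path does not exist: sqlite3 will create a missing database file, but not a missing directory. A minimal sketch reproducing that failure mode (the directory name here is made up for illustration):

```python
import os
import sqlite3
import tempfile

# sqlite3 can create a missing database *file*, but not a missing
# parent *directory*. Connecting to a path whose directory does not
# exist raises "OperationalError: unable to open database file".
missing_dir = os.path.join(tempfile.mkdtemp(), "no-such-subdir")
db_path = os.path.join(missing_dir, "could.db")

try:
    sqlite3.connect(db_path)
    error_message = None
except sqlite3.OperationalError as exc:
    error_message = str(exc)
```

So the first thing to check is whether /tmp/cer actually exists when the real program runs, and whether the process has permission to write to it.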
On Unix I got that error when using the
~
shortcut for the user directory. Changing it to /home/user
resolved the error.

This is definitely a permissions issue. If you're getting this error on Linux, make sure you're running the command with
sudo
as the file most likely is owned by root. Hope that helps!

In my case, the solution was to use an absolute path, to find an existing file:
I don't know why this fix works: the path only contained ASCII characters and no spaces. Still it made the difference.
For reference: Windows 7, Python 3.6.5 (64-bit).
I was not able to reproduce the issue on another machine (also Windows 7, Python 3.6.4 64-bit), so I have no idea why this fix works.
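The original code for this answer isn't shown above, but a sketch of the kind of fix it describes is converting a relative database path to an absolute one before connecting (the filename app.db is a placeholder, not from the answer):

```python
import os
import sqlite3

# Resolve the path to an absolute one before connecting, so the result
# no longer depends on the current working directory. expanduser() also
# takes care of "~", which sqlite3 does not expand by itself.
db_path = os.path.abspath(os.path.expanduser("app.db"))

conn = sqlite3.connect(db_path)
conn.close()
```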
This worked for me:
Note: double slashes in the full path.

Using Python v2.7 on Win 7 Enterprise and Win XP Pro.
Hope this helps someone.
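On Windows, the usual reading of "double slashes" is doubled backslashes in the string literal: a single backslash can form an escape sequence such as "\t" and silently corrupt the path. A sketch of the two safe spellings (the folder names are hypothetical):

```python
# In an ordinary string literal, "\t" is a tab character, so a path
# written as "C:\temp\app.db" would not be what it looks like. Either
# double the backslashes or use a raw string.
escaped = "C:\\Users\\me\\data\\app.db"  # doubled backslashes
raw = r"C:\Users\me\data\app.db"         # raw string, same path

assert escaped == raw
```

Forward slashes ("C:/Users/me/data/app.db") also work for sqlite3 on Windows and avoid the escaping issue entirely.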
Use the fully qualified name of the database file:
/home/ankit/Desktop/DS/Week-7-MachineLearning/Week-7-MachineLearning/soccer/database.sqlite
instead of a relative path.
For anyone who has a problem with Airflow linked to this issue:

In my case, I had initialized Airflow in /root/airflow and ran its scheduler as root. I used the run_as_user parameter to impersonate the web user while running task instances. However, Airflow always failed to trigger my DAG, with the following errors in the logs:

I also found that once I triggered a DAG manually, a new Airflow resource directory was automatically created under /home/web. I'm not clear about this behavior, but I made it work by removing the entire Airflow resources from /root, reinitializing the Airflow database under /home/web, and running the scheduler as the web user under /home/web.

If you want to try this approach, you may need to back up your data before doing anything.