I have read this question, but I am using something different: How can I log into a website using python?
I am using Scrapy for web crawling and I don't want to use Mechanize.
What I want to do is log in to my website with Python and then submit forms using data from my database.
Assuming all the data are valid, how can I do that with Scrapy? Or do I need to use other libraries as well?
There is an example in the Scrapy docs of logging into a website before starting your crawl.
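For reference, here is a minimal sketch along the lines of that documented pattern, using FormRequest.from_response; the URLs, form field names and failure check are placeholders you would swap for your site's values:

```python
import scrapy


class LoginSpider(scrapy.Spider):
    # Hypothetical spider: name, URLs and form fields are placeholders.
    name = "login_example"
    start_urls = ["http://example.com/users/login"]

    def parse(self, response):
        # Pre-fill the login form found on the page and submit it.
        return scrapy.FormRequest.from_response(
            response,
            formdata={"username": "john", "password": "secret"},
            callback=self.after_login,
        )

    def after_login(self, response):
        # Abort if the site echoed an authentication failure message.
        if b"authentication failed" in response.body:
            self.logger.error("Login failed")
            return
        # The session cookie is now handled by Scrapy's cookie middleware,
        # so subsequent requests from this spider are authenticated.
        yield scrapy.Request(
            "http://example.com/dashboard", callback=self.parse_dashboard
        )

    def parse_dashboard(self, response):
        self.logger.info("Logged-in page: %s", response.url)
```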
You can send a request with your user data directly to the authentication controller. For example, if your site's login form submits a POST request to some endpoint, you can send that POST request to the endpoint yourself.
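A rough sketch of that approach with Scrapy's FormRequest; the /auth/login endpoint and the field names are assumptions about your site:

```python
import scrapy


class DirectPostSpider(scrapy.Spider):
    # Hypothetical spider; the endpoint and field names depend on your site.
    name = "direct_post_login"

    def start_requests(self):
        # POST the credentials straight to the form's action URL,
        # skipping the page that renders the form.
        yield scrapy.FormRequest(
            url="http://example.com/auth/login",
            formdata={"username": "john", "password": "secret"},
            callback=self.after_login,
        )

    def after_login(self, response):
        # Continue crawling with the authenticated session cookies.
        yield scrapy.Request(
            "http://example.com/profile", callback=self.parse_profile
        )

    def parse_profile(self, response):
        self.logger.info("Fetched %s after login", response.url)
```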
Check out a similar question here: How to use Python to login to a webpage and retrieve cookies for later usage?
Check out urllib: http://docs.python.org/2/library/urllib.html
That should answer your question, but remember that there are authentication steps you have to satisfy to log into some websites. I suggest looking into OpenID and the Python libraries relating to it.
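As a rough illustration (Python 2, to match the linked docs), a cookie-aware opener can perform the login and keep the session cookie for later requests; the URL and field names are placeholders:

```python
import urllib
import urllib2
import cookielib

# A CookieJar keeps the session cookie the site sets after a successful login.
cookie_jar = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookie_jar))

# POST the login form; the keys must match the form's input names.
login_data = urllib.urlencode({"username": "john", "password": "secret"})
opener.open("http://example.com/login", login_data)

# Later requests through the same opener reuse the stored cookies.
response = opener.open("http://example.com/protected-page")
print response.read()
```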
You can use a Python library like requests, or you can simply use urllib2 and cookielib; here is a reference.
In this case you can write a method that handles the login; before submitting the form you can load your data using the Django ORM, and then submit the form.
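Something along these lines, as a sketch with requests: the FormEntry model, the URLs and the field names are hypothetical, and it assumes your Django settings are already configured so the ORM can run outside the web process.

```python
import requests

from myapp.models import FormEntry  # hypothetical Django model


def login(session):
    # Authenticate once; the session keeps the cookies for later requests.
    session.post(
        "http://example.com/login",
        data={"username": "john", "password": "secret"},
    )


def submit_forms():
    session = requests.Session()
    login(session)

    # Load the rows to submit via the Django ORM, then post each one.
    for entry in FormEntry.objects.filter(is_valid=True):
        session.post(
            "http://example.com/submit",
            data={"title": entry.title, "body": entry.body},
        )


if __name__ == "__main__":
    submit_forms()
```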