I need to scrape data from a table on a web page. I'd then like to store this data in an array, so that I can later store it in a database. I'm very unfamiliar with this functionality, so I'd like to use the simplest method possible.
Which should I use: file_get_contents, file_get_html, or cURL?
You can use curl() or file_get_contents() to get the contents of the page, then use preg_match() to extract the data you need. You can use the crontab command (Linux: crontab -e) to make the PHP script execute automatically. My English is poor, so I hope this opinion is still useful. Thanks!
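A minimal sketch of that approach, using preg_match_all() to pull every table cell into an array. The URL and table markup here are placeholders; the real pattern has to match the actual page's HTML:

```php
<?php
// In practice you would fetch the page, e.g.:
// $html = file_get_contents('https://example.com/teams'); // placeholder URL
// For illustration, a hardcoded snippet of table markup:
$html = '<table><tr><td>Team A</td></tr><tr><td>Team B</td></tr></table>';

// Capture the text of every <td> cell into $matches[1].
preg_match_all('/<td>(.*?)<\/td>/s', $html, $matches);
$teams = $matches[1];

print_r($teams); // Array ( [0] => Team A [1] => Team B )
```

Note that regexes are brittle against real-world HTML (attributes, nesting, whitespace), which is why a DOM parser is usually the safer choice.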
I prefer PHP Simple HTML DOM Parser:
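The fetch-and-parse step looks roughly like this (assuming you've downloaded simple_html_dom.php next to your script; the URL is a placeholder):

```php
<?php
// Requires the Simple HTML DOM library file in the same directory.
include 'simple_html_dom.php';

// Load the remote page into a DOM object (placeholder URL).
$html = file_get_html('https://example.com/teams');
```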
You can then loop through certain elements with their selector syntax. For example, to get the names of all the teams on the link you sent over, save them to an array, and then do a MySQL insert statement, you'd do something like this:
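A hedged sketch of that loop. The 'td.teamname' selector is hypothetical, since the real class names depend on the page's markup:

```php
<?php
include 'simple_html_dom.php';

$html = file_get_html('https://example.com/teams'); // placeholder URL

$teams = [];
// find() takes CSS-like selectors; adjust this to the actual table markup.
foreach ($html->find('table tr td.teamname') as $cell) {
    $teams[] = trim($cell->plaintext);
}
```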
Then prepare a MySQL statement:
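One way to do this is with PDO prepared statements; the DSN, credentials, and table/column names below are placeholders:

```php
<?php
// Example data as produced by the scraping step (hypothetical values).
$teams = ['Team A', 'Team B'];

// Placeholder connection details; adjust to your database.
$pdo  = new PDO('mysql:host=localhost;dbname=scraper', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO teams (name) VALUES (?)');

foreach ($teams as $team) {
    $stmt->execute([$team]); // parameterized insert, no manual escaping
}
```

The prepared statement is compiled once and executed per row, and binding the value as a parameter avoids SQL injection from scraped content.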
You could always create a multidimensional array with all the elements you'd like, pull them from the array as you loop through it, and insert multiple values with every query.
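For example, each row of the multidimensional array could hold a team name plus another scraped field, with both bound per query (the 'score' column is hypothetical):

```php
<?php
// Hypothetical scraped rows: name plus one extra field per team.
$rows = [
    ['name' => 'Team A', 'score' => 10],
    ['name' => 'Team B', 'score' => 7],
];

// Placeholder connection details and schema.
$pdo  = new PDO('mysql:host=localhost;dbname=scraper', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO teams (name, score) VALUES (?, ?)');

foreach ($rows as $row) {
    $stmt->execute([$row['name'], $row['score']]);
}
```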