Hi, I need to create something automatic that connects to another server using a PHP library and then loads the data into a MySQL database. So far only the first file gets loaded, but a new file is uploaded every day. The problem is how to keep loading each day's file into the database. I am almost there.
Here is the code:
<?php
include 'core/init.php';
include 'includes/overall/header.php';
//connection to linux server
$conn = ssh2_connect('xxx.xxx.xx.xxx', 22);
$destinationPath = '/path/to/destination/path/'; //remote path on the Linux server
$localPath = 'C:\\path\to\local\path\\'; //a trailing backslash must be doubled in a single-quoted string, or it escapes the closing quote
//checks if the connection is successful or not
if(ssh2_auth_password($conn, 'username', 'password')){
echo '<script type="text/javascript">alert("Authentication was successful"); </script>'; //javascript pop up when successful
}else{
die("Authentication failed");
}
if(ssh2_scp_recv($conn, $destinationPath, $localPath)){
echo '<h2>Today\'s file received</h2>'; //if the file was received from the server, it has been saved to $localPath
}else{ //if the file was not received, send an email asking for the radar file to be uploaded
$to = 'testemail@yahoo.co.uk';
$subject = 'the subject';
$message = 'hello';
$headers = "From: The Sender Name <senderEmail@yahoo.co.uk>\r\n";
$headers .= "Reply-To: senderEmail@yahoo.com\r\n";
$headers .= "Content-type: text/html\r\n";
mail($to, $subject, $message, $headers);
}
$string = file_get_contents('http://localhost/Prototype/core/edit.txt'); //fetch the file contents over HTTP; no mode argument is needed
$myFile = 'C:\\wamp\\www\\Prototype\\core\\edit.txt'; //local file path
$fh = fopen($myFile, 'w') or die("Could not open file"); //open the file for writing (mysql_error() does not apply to fopen failures)
fwrite($fh, $string);
fclose($fh);
//note the space before INTO: without it the concatenated SQL reads '$myFile'INTO and fails
//(the mysql_* extension is deprecated; mysqli or PDO is recommended for new code)
$result = mysql_query("LOAD DATA LOCAL INFILE '$myFile' INTO TABLE `restartdata` FIELDS TERMINATED BY ','");
if (!$result) {
die("Could not load." . mysql_error());
}else{
echo 'data loaded into the database';
}
Definitely not with PHP, and definitely nothing home-made. There is a built-in mechanism for this: replication. It has been tried and tested for more than 15 years and is used on thousands of installations.
Doing this in PHP would mean dumping the entire database daily or hourly, the site would be unresponsive while the dump is being made, and then you would have to transfer the whole database over HTTP.
Last but not least, the PHP approach does not allow continuous archiving. If you archive once a day, what happens if the system fails 23 hours and 50 minutes after the last backup was made?
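The replication mentioned above is MySQL's built-in binary-log replication. A minimal sketch of the setup, assuming hypothetical hostnames and credentials; the exact log file name and position come from running SHOW MASTER STATUS on the master:

```sql
-- On the master (my.cnf / my.ini): enable the binary log and set a server id.
--   [mysqld]
--   log-bin=mysql-bin
--   server-id=1

-- On the master: create a dedicated replication account.
CREATE USER 'repl'@'%' IDENTIFIED BY 'repl_password';
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

-- On the slave (server-id=2 in its my.cnf): point it at the master and start.
CHANGE MASTER TO
  MASTER_HOST='master.example.com',
  MASTER_USER='repl',
  MASTER_PASSWORD='repl_password',
  MASTER_LOG_FILE='mysql-bin.000001',  -- from SHOW MASTER STATUS
  MASTER_LOG_POS=4;                    -- from SHOW MASTER STATUS
START SLAVE;
```

Once running, every change on the master is streamed to the slave continuously, which avoids both the daily-dump downtime and the gap-between-backups problem described above.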