How to transfer data from one server to another?

There is a working server that performs complex calculations.

There is also a hosted website that uses this data.

Once a day I need to upload the data from one table on the local server to the website.

What is the best way to implement this?

I use PHP and MySQL.
March 23rd 20 at 19:01
5 answers
March 23rd 20 at 19:03
scp will help you.

How to do import/export from the database I think you know.

In general, it depends on the size of the data. If there isn't much, you can write a small script that connects to the remote server and sends the data in any convenient way.
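A minimal sketch of the scp approach, assuming a database `mydb`, a table `results`, and SSH access to the web host as `user@web.example.com` (all names and paths below are placeholders):

```shell
#!/bin/sh
# Dump one table, compress it, and copy it to the web server over scp.
# mydb, results, user@web.example.com and the paths are placeholders.
DB="mydb"
TABLE="results"
REMOTE="user@web.example.com:/var/data/"
DUMP="/tmp/${TABLE}.sql.gz"

# --single-transaction gives a consistent snapshot for InnoDB tables
mysqldump --single-transaction "$DB" "$TABLE" | gzip > "$DUMP"
scp "$DUMP" "$REMOTE"
```

Run it from cron once a day; on the web host the dump can then be restored with something like `gunzip -c results.sql.gz | mysql mydb`.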
March 23rd 20 at 19:05
What is the best way to implement this?
Alternatively, you can try to configure replication. As I recall, MySQL can replicate individual tables, among other things. The only catch is that they will be replicated immediately, not once a day...
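For the replication route, MySQL lets you restrict a replica to specific tables with a replication filter in the replica's my.cnf; a sketch, assuming the table is `mydb.results` (a placeholder for your schema.table):

```ini
# Replica-side my.cnf fragment: replicate only one table.
# mydb.results is a placeholder for your schema.table.
[mysqld]
replicate-do-table = mydb.results
```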

As for "once a day"... offhand, I would probably do something like this:
0. Set up sync folders on both servers and keep the table dump in them via rsync.
1. A cron job on the source server that dumps the table (via mysqldump).
2. The dump goes out to the remote server via rsync.
3. On the remote server, after the file is updated, a trigger loads the dump into the database. You can try to tie the trigger either to rsync itself or to inotify. I can't say more than that; I haven't tested this personally, but in theory it should work.
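Steps 1-2 above might look like this on the source server. A sketch with placeholder names throughout, untested like the original suggestion:

```shell
#!/bin/sh
# Dump the table with mysqldump, then push the file with rsync.
# Run from cron, e.g.:  0 3 * * * /usr/local/bin/sync_table.sh
# mydb, results, user@web.example.com and the paths are placeholders.
DB="mydb"
TABLE="results"
SYNC_DIR="/var/sync"
REMOTE="user@web.example.com:/var/sync/"

mysqldump --single-transaction "$DB" "$TABLE" > "$SYNC_DIR/$TABLE.sql"
# -a preserves file attributes, -z compresses during transfer
rsync -az "$SYNC_DIR/$TABLE.sql" "$REMOTE"
```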

As an alternative to step 3, as mentioned in the previous answer, you can simply host the file on the source server (ideally a gzip/bzip2-compressed dump of the table), download it in the usual way, unpack it, and load it into the database. To fetch the file from the source server, curl is one option.
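That alternative, seen from the web server's side, could be sketched as follows (the URL, database, and file names are placeholders):

```shell
#!/bin/sh
# On the web server: fetch the gzipped dump over HTTP and load it.
# The URL, database name, and file path are placeholders.
URL="https://source.example.com/dumps/results.sql.gz"
DB="mydb"
FILE="/tmp/results.sql.gz"

# -f: fail on HTTP errors, -sS: quiet but still report errors
curl -fsS -o "$FILE" "$URL"
gunzip -c "$FILE" | mysql "$DB"
```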
March 23rd 20 at 19:07
Couldn't you just save the result in a database and give both servers access to it?
March 23rd 20 at 19:09
On the local server, create an upload script that takes the result from the database, wraps it in JSON, and preferably base64-encodes it, then sends it in a POST request via file_get_contents to a receiver script on the site. The receiver decodes the data and stores the result in its database.

i.e. pull the data from the database and pack it into an object:
$data = base64_encode( json_encode($rows) ); // $rows: the data fetched from the database
$hash = hash_hmac("sha256", $data, "blablabla");

$post_params = array('data' => $data, 'hash' => $hash);
$post_vars = http_build_query($post_params);
$post_options = array('http' => array(
    'method' => 'POST',
    'header' => 'Content-type: application/x-www-form-urlencoded',
    'content' => $post_vars,
));
$post_context = stream_context_create($post_options);
// the first argument is the URL of the receiver script on the website
$result = file_get_contents('https://example.com/receive.php', false, $post_context);

On the website, the receiver script:
if( !isset($_POST["data"]) || !isset($_POST["hash"]) ) exit();
$data = $_POST["data"];
$hash = $_POST["hash"];
$real_hash = hash_hmac("sha256", $data, "blablabla");
if( !hash_equals($real_hash, $hash) ) exit(); // hash_equals() compares in constant time
$data = json_decode( base64_decode($data) );
// ...then insert $data into the website's database
Run the upload script from cron once a day, or immediately after the computation, as soon as the data is updated. - Oscar.Fadel51 commented on March 23rd 20 at 19:12
March 23rd 20 at 19:11
Thank you all for the answers!
I decided to dump the table on the local server and upload it to the remote server with scp.
I'll try that for now; if something doesn't work, I'll try the other suggested options.
