How to speed up updating 700K records from a file?

There is a file with about 700,000 records. Periodically I need to check each record against the database: if the record already exists, I update it; if not, I insert it. The process needs to be faster, and I'm stuck at this point. Please don't judge too strictly, I'm a beginner. I hope the code explains it better.
DB::table('resources')->select('hash')->orderBy('id')->chunk(50000,
    function ($resources) use ($file, $updatedDate) {
        $lineCount = 1;
        echo 1 . '<>'; // debug output

        while (!feof($file)) {
            // the file is in cp1251, convert each line to UTF-8 before parsing
            $line = iconv('cp1251', 'utf-8', fgets($file));
            $csv = str_getcsv($line, ';');

            if (count($csv) === 6) {
                $ipPool = explode('|', $csv[0]);
                foreach ($ipPool as $ip) {
                    $date = new \DateTime($csv[5]);
                    $hash = md5($csv[1] . $csv[2]);

                    // compare the current line against every row of the 50 000-row chunk
                    foreach ($resources as $resource) {
                        if ($hash === $resource->hash) {
                            DB::table('resources')->where('hash', $hash)->update([
                                'version_date' => $updatedDate,
                            ]);
                            echo $lineCount++ . "<br>";
                        }
                        // here I need to insert the row if there is no such record
                    }
                }
            }

            $lineCount++;
        }
    });
March 23rd 20 at 19:36
2 answers
March 23rd 20 at 19:38
Solution
First, why pull records 50,000 at a time? That figure feels arbitrary.

A very short optimization that is guaranteed to help significantly: fetch all records from the database once and collect them into a hash map; also drop the orderBy, you don't need it.
$map = [];
DB::table('resources')->select('hash')->chunk(50000,
    function ($resources) use (&$map) {
        foreach ($resources as $resource) {
            // you can put only the data you really need into the map, so as not to clutter memory
            $map[$resource->hash] = $resource;
        }
    });
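As a side note, if only an existence check on the hash is needed, the whole map can be built in one call with the query builder's pluck(). A minimal sketch, assuming the table has an id column (the original query orders by id, so presumably it does):
$map = DB::table('resources')->pluck('id', 'hash')->all();
// keys of $map are the hashes, values are the row ids (any cheap column works,
// since only isset($map[$hash]) is used later)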


Then run through the entire file:
while (!feof($file)) {
    $line = iconv('cp1251', 'utf-8', fgets($file));
    $csv = str_getcsv($line, ';');

    if (count($csv) === 6) {
        $ipPool = explode('|', $csv[0]);
        foreach ($ipPool as $ip) {
            $date = new \DateTime($csv[5]);
            $hash = md5($csv[1] . $csv[2]);

            if (isset($map[$hash])) {
                // the record exists, do an update
            } else {
                // no such record, insert it into your table
            }
        }
    }
}


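To keep that loop fast, the update/insert branches can simply collect keys and rows, and the actual queries can be issued in batches after the loop. A rough sketch of that idea; the batch size of 1000 and the extra columns are assumptions, only hash, version_date and $updatedDate come from the question:
$hashesToUpdate = [];   // hashes of records that already exist
$rowsToInsert   = [];   // new rows to be added

// ... inside the loop above:
//   existing record:  $hashesToUpdate[] = $hash;
//   missing record:   $rowsToInsert[] = ['hash' => $hash, 'version_date' => $updatedDate /*, other columns */];

// after the loop: one UPDATE per 1000 hashes instead of one per line
foreach (array_chunk($hashesToUpdate, 1000) as $chunk) {
    DB::table('resources')
        ->whereIn('hash', $chunk)
        ->update(['version_date' => $updatedDate]);
}

// and bulk INSERTs instead of inserting row by row
foreach (array_chunk($rowsToInsert, 1000) as $chunk) {
    DB::table('resources')->insert($chunk);
}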
Just bear in mind that this assumes a perfect hash function with no collisions...
March 23rd 20 at 19:40
"first the file is downloaded from another source, then I need to check the records in the file against my own records"

this is the key part!
If I understand your rather chaotic description correctly, you are comparing two files?? Ohhh.
WHY?
It is faster, easier, and much better suited to automation to do the comparison in the database!
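For what it's worth, one way to read "compare in the database" with PostgreSQL is to load the parsed file into a staging table and let SQL do the matching. A rough sketch only; the table resources_staging, its columns, and $parsedRows are purely hypothetical:
// load the parsed CSV rows into a (hypothetical) staging table in batches
foreach (array_chunk($parsedRows, 1000) as $chunk) {
    DB::table('resources_staging')->insert($chunk); // each row: ['hash' => ..., 'version_date' => ...]
}

// update the records that already exist, in one statement
DB::statement('
    UPDATE resources r
    SET version_date = s.version_date
    FROM resources_staging s
    WHERE r.hash = s.hash
');

// insert the records that do not exist yet
DB::statement('
    INSERT INTO resources (hash, version_date)
    SELECT s.hash, s.version_date
    FROM resources_staging s
    LEFT JOIN resources r ON r.hash = s.hash
    WHERE r.hash IS NULL
');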
@Bret.Armstrong no, I'm not comparing two files, I compare the contents of the downloaded file with the records in my database - lue_Lehner commented on March 23rd 20 at 19:43

Tags: PostgreSQL, PHP