How to quickly load data into a database while checking each record for uniqueness?


On the website I need to upload data from a CSV file into the database, so each row of the CSV file has to be checked against the database.

Rows that already exist in the database should not be inserted again; only the new data should be loaded.

How can I implement this?

Here is my current implementation, but I realize it is wrong: the upload is very slow, and if the database holds a lot of data the request ends with a 502 error.

$wordsFromFile = array_map(function ($data) {
    return str_getcsv($data, ";");
}, file($request->file('text_file')));

// All words from the database
$myWords = Word::all();
$dbWords = [];

foreach ($myWords as $myWord) {
    $dbWords[] = $myWord->word;
}

$wordsResult = [];
foreach ($wordsFromFile as $key => $data) {
    if ($key == 0) { continue; } // skip the CSV header row

    $word        = isset($data[0]) ? mb_strtolower($data[0]) : "empty_field";
    $translation = isset($data[1]) ? $data[1] : "empty_field";
    $state       = isset($data[2]) ? $data[2] : 0;

    // Only queue rows whose word is not already in the database
    if (!in_array($word, $dbWords)) {
        $wordsResult[] = [
            'word'        => $word, // already lower-cased above
            'translation' => mb_strtolower($translation),
            'state'       => $state,
        ];
    }
}
April 19th 20 at 12:34
1 answer
April 19th 20 at 12:36
When you use INSERT IGNORE, a row will only be added if it does not violate a unique key.
Riley? - bertrand_Powlows commented on April 19th 20 at 12:39
@bertrand_Powlows, Shura!

Normally, INSERT stops and rolls back when it encounters an error.
With the IGNORE keyword, all errors are converted to warnings, which do not stop the insertion of the remaining rows.

Until MariaDB 5.5.28
MySQL, and MariaDB before 5.5.28, didn't give warnings for duplicate key errors when using IGNORE. You can get the old behaviour if you set OLD_MODE to NO_DUP_KEY_WARNINGS_WITH_IGNORE.
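A minimal illustration of the idea (the table and column names are taken from the question; the unique index is an assumption — you have to create it yourself, otherwise IGNORE has no duplicate-key error to suppress):

```sql
-- One-time setup: make `word` unique so MySQL can detect duplicates.
ALTER TABLE words ADD UNIQUE INDEX words_word_unique (word);

-- Rows with an already-existing word are now skipped
-- instead of aborting the whole insert.
INSERT IGNORE INTO words (word, translation, state)
VALUES ('cat', 'translation of cat', 0),
       ('dog', 'translation of dog', 0);
```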

Not unique? - Jackeline_Zboncak commented on April 19th 20 at 12:42
@Jackeline_Zboncak, not unique means a duplicate key, which is exactly the case here. - bertrand_Powlows commented on April 19th 20 at 12:45
Well, you need to define a unique key in the database across all three fields, and that's it. Duplicate data will then be ignored.

In general, I would try INSERT IGNORE or INSERT ... ON DUPLICATE KEY UPDATE. I.e., don't do the check in application code; push all the suffering onto MySQL. - Jackeline_Zboncak commented on April 19th 20 at 12:48
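The ON DUPLICATE KEY UPDATE variant differs from INSERT IGNORE in that, instead of skipping a duplicate row, it updates the existing row's other columns. A sketch, again assuming a unique index on `word` and the question's table layout:

```sql
-- If 'cat' is new it is inserted; if it already exists,
-- its translation and state are refreshed instead.
INSERT INTO words (word, translation, state)
VALUES ('cat', 'translation of cat', 1)
ON DUPLICATE KEY UPDATE
    translation = VALUES(translation),
    state       = VALUES(state);
```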
@Jackeline_Zboncak, right. I simply should have mentioned the unique key, and not just INSERT IGNORE. - bertrand_Powlows commented on April 19th 20 at 12:51
@bertrand_Powlows, I wish my answer was suggestive to the enquiring went to Google boom , read stuff, and even then would receive not only a specific recipe , but maybe some additional knowledge. But, alas, happens very rarely )))) - Jackeline_Zboncak commented on April 19th 20 at 12:54
