How can I process a large file in parts?

There is a file with URLs written one per line. I need to somehow take the first 100 URLs from the file for further processing, and then work through the entire file in the same way. How can this be implemented?
April 3rd 20 at 18:52
2 answers
April 3rd 20 at 18:54
@davion_Smith
$filename = 'text.txt';

$total = 0;
$buffer = [];

$h = fopen($filename, 'r');

while (($line = fgets($h)) !== false) {
    $buffer[] = $line;
    $total++;

    if ($total >= 100) {
        // var_export($buffer); // Display or process every 100 lines here
        $total = 0;
        $buffer = [];
    }
}

fclose($h);

if (! empty($buffer)) {
    // var_export($buffer); // Display or process the remaining lines here
}
April 3rd 20 at 18:56
Write a JS script that pulls a PHP page via Ajax and passes it the parameters N and M.
In PHP, skip N * M rows, process the next M, and return OK.
Catch the response in JS, increase N, and start the next cycle.
It will work correctly and be easy to follow.
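
For illustration, a minimal sketch of the PHP side of that idea (the parameter names n and m, the file name text.txt, and the OK/DONE responses are assumptions for the example, not part of the answer):

$n = (int) ($_GET['n'] ?? 0);   // which chunk the JS side is asking for
$m = (int) ($_GET['m'] ?? 100); // how many lines per chunk

$h = fopen('text.txt', 'r');

// Skip the N * M lines already handled by previous requests.
for ($i = 0; $i < $n * $m && fgets($h) !== false; $i++) {
}

// Process the next M lines.
$processed = 0;
while ($processed < $m && ($line = fgets($h)) !== false) {
    $url = trim($line);
    // ... fetch/parse $url here ...
    $processed++;
}

fclose($h);

// Tell the JS side whether there is anything left to request.
echo $processed > 0 ? 'OK' : 'DONE';

The JS side only has to remember N, so an interrupted run can resume from the last confirmed chunk.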
Why would he need JS? - joshua.Durgan commented on April 3rd 20 at 18:59
@joshua.Durgan, because the task will still most likely be solved on his web server, and this way he avoids lockups, timeouts, and idly waiting for the end of a script in which something silently went wrong. - savion.Funk commented on April 3rd 20 at 19:02
@savion.Funk, is the CLI no longer fashionable? - joshua.Durgan commented on April 3rd 20 at 19:05
@joshua.Durgan, for a beginner, wouldn't a script in JavaScript be easier than a bash script? - savion.Funk commented on April 3rd 20 at 19:08
@savion.Funk, why bash? I'm talking about php-cli, where there are no time limits and everything stays purely on the back end - joshua.Durgan commented on April 3rd 20 at 19:11
@joshua.Durgan, the man asked how to eat the elephant in pieces. It is logical to assume that the elephant is big enough that you want meaningful feedback between those pieces and the ability to interrupt processing while noting the position with at least some accuracy. In the CLI there may be no time limits, but memory will leak, for example - and then you either have to guess where it happened or spend time on logging.
In general, I don't see any advantage, except the feeling that you're cool for doing it without the browser, which is at hand anyway. - savion.Funk commented on April 3rd 20 at 19:14
@savion.Funk, and will the friend get a watermelon for that? Is it all that dramatic? I'm not saying the elephant shouldn't be eaten in pieces; it's just that in this case no output beyond an error alert is implied, it's more about preparing data before saving it. Something like a parser. For example: one function that fetches lines given an offset, a second function that takes a batch of links to process, and a handler for a specific link with logging - joshua.Durgan commented on April 3rd 20 at 19:17
Something like that can also be done using SPL:
https://www.php.net/manual/ru/splfileobject.seek.php - joshua.Durgan commented on April 3rd 20 at 19:20
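
For reference, a rough sketch of what the SplFileObject::seek() variant could look like (the file name, the hard-coded $offset, and the chunk size of 100 are assumptions for the example; in practice the offset would be passed in or stored between runs):

$offset = 0;   // line number to start from (assumed here for the example)
$chunk  = 100;

$file = new SplFileObject('text.txt', 'r');
$file->seek($offset); // jump straight to the line with the given number

$buffer = [];
while (! $file->eof() && count($buffer) < $chunk) {
    $line = trim($file->fgets());
    if ($line !== '') {
        $buffer[] = $line; // collect the next batch of URLs
    }
}

// var_export($buffer); // hand the batch over to the actual URL handler here

Since seek() takes a line number, the calling code only has to remember how many lines have already been processed to resume from the right place.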
