A cron job runs once a minute, queries the database for the last entry with pid = 0, and, if it finds one, parses that entry's JSON and distributes the parsed data across several database tables.
Recently I noticed a problem. For example, the parsed JSON contains a block with a value of 150: each object to be processed appears exactly once, and its com_total is written to a separate table. Yet in the end I see duplicates: some of the 150 records end up in the table two, three, or more times, even though each occurs only once in the JSON (and not all records get duplicated).
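To illustrate, here is a minimal sketch of the workflow as I understand it, using sqlite as a stand-in database (the table and column names are hypothetical, not my real schema). Fetching the same unclaimed entry twice before it is marked as taken reproduces exactly the duplicate inserts I am seeing:

```python
import json
import sqlite3

# Stand-in database; "queue" holds raw JSON payloads, "com_total" the parsed rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE queue (id INTEGER PRIMARY KEY, pid INTEGER, payload TEXT)")
conn.execute("CREATE TABLE com_total (queue_id INTEGER, value INTEGER)")

# One queued entry whose JSON holds a single block with com_total = 150.
payload = json.dumps({"blocks": [{"com_total": 150}]})
conn.execute("INSERT INTO queue (pid, payload) VALUES (0, ?)", (payload,))

def fetch_last_unclaimed():
    # What the cron job does first: grab the newest entry with pid = 0.
    return conn.execute(
        "SELECT id, payload FROM queue WHERE pid = 0 ORDER BY id DESC LIMIT 1"
    ).fetchone()

def process(row, worker_pid):
    # Parse the JSON and write each block's com_total to its own table.
    for block in json.loads(row[1])["blocks"]:
        conn.execute(
            "INSERT INTO com_total (queue_id, value) VALUES (?, ?)",
            (row[0], block["com_total"]),
        )
    # The entry is only marked as claimed after processing finishes.
    conn.execute("UPDATE queue SET pid = ? WHERE id = ?", (worker_pid, row[0]))

# If a second cron run fetches the entry before the first run marks it,
# both runs process the same JSON and every block is inserted twice:
row_a = fetch_last_unclaimed()  # run 1 starts on a heavy payload
row_b = fetch_last_unclaimed()  # run 2 fires one minute later
process(row_a, 101)
process(row_b, 102)
print(conn.execute("SELECT COUNT(*) FROM com_total").fetchone()[0])  # prints 2
```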
I have also noticed a pattern: the "heavier" the JSON, i.e. the more effort it takes to process, the more likely the duplication. With ~1000-1500 blocks everything is fine; at 2000, some operations execute twice; at 4000, three times; and so on.
If the cause were a shortage of some resource, I would expect the script to simply crash. But no: it keeps running, just in this very tricky way.
Can anyone tell me what the reason could be?