Background task over a large table with a CURSOR and UPDATEs: should I open two sessions?


There is a PHP CLI utility that walks through the entire table (potentially unbounded in size) and, under certain conditions, updates some of the rows. The scan runs slowly; speed is not important, and it runs whenever an admin needs it.

I decided to do the pass with a cursor. Since the cursor runs inside a transaction (WITHOUT HOLD), all UPDATEs made during the table scan accumulate in that transaction and are applied only on COMMIT.

So I see two options:

1) Let the UPDATEs pile up in the transaction. But how well does Postgres handle a transaction over that much data? There could be thousands of UPDATEs, and later, under some conditions, tens of thousands.

2) Open two sessions: one to hold the cursor, and a second through which to run the UPDATEs (the cursor does not need to see the updates made within its own transaction anyway).

So the actual question: should I worry about the size of the transaction in this situation and open two database sessions, or is there nothing to worry about at all?
July 9th 19 at 11:21
1 answer
July 9th 19 at 11:23
In fact, there is nothing to worry about. But there is one small "but". I don't remember exactly how Postgres does it, but by analogy with other MVCC databases (and Postgres is one, like Oracle), rows modified within a transaction are copied as new versions, and after the commit the old versions simply remain in place. A rollback of such a transaction, in particular, can be very painful.
In general, you can run into only two limits: lack of memory and lack of disk. If it comes to that, you can select rows in chunks of 1000 or so, process and commit each chunk, then select the next one. That is also easier both for the database and for tracking progress.
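The chunked approach from the answer can be sketched as keyset pagination with one commit per chunk. Again this is only an illustration (Python/sqlite3 instead of PHP/Postgres, made-up table and condition); in Postgres the same SELECT ... WHERE id > ? ORDER BY id LIMIT ? pattern applies.

```python
# Sketch of the batched approach: fetch rows in chunks by primary key,
# apply the updates, commit, and move to the next chunk. No single
# transaction ever grows beyond one chunk.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, flag INTEGER)")
conn.executemany("INSERT INTO items VALUES (?, 0)",
                 [(i,) for i in range(1, 10001)])
conn.commit()

CHUNK = 1000
last_id = 0
commits = 0
while True:
    # Keyset pagination: resume after the last id seen, no open cursor
    # and no transaction spanning the whole scan.
    rows = conn.execute(
        "SELECT id FROM items WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, CHUNK)).fetchall()
    if not rows:
        break
    for (row_id,) in rows:
        if row_id % 2 == 0:               # the update condition
            conn.execute("UPDATE items SET flag = 1 WHERE id = ?", (row_id,))
    conn.commit()                          # one commit per chunk
    commits += 1
    last_id = rows[-1][0]

print(commits)
```

Resuming by key (rather than OFFSET) also means an interrupted run can pick up where it left off, which matches the "easier for tracking purposes" remark.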

Find more questions by tags PostgreSQL