What is the limit on the amount of data that can be transferred in Electron via ipcRenderer?

My Electron app becomes unresponsive as soon as it starts passing an array of about 1.5 MB through ipcRenderer. The app simply hangs, debugging in Visual Studio Code freezes as well, and no exception is thrown.

I have read the documentation on ipcMain, ipcRenderer and EventEmitter; nowhere does it say anything about a limit. Googling did not help either. Is this a bug, or is the limitation actually documented somewhere?

-----

I will probably have to cut the array into chunks and send them separately, but then the total size of each "package" will vary from run to run, and on top of that ipcRenderer itself serializes everything into a JSON string whose final length is unpredictable. Roughly the chunked sending I have in mind, see the sketch below.
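A sketch of the renderer side, not tested; the chunk size, the function name and the channel names are made up and only illustrate the idea:

const { ipcRenderer } = require('electron')

// split the array into fixed-size chunks and send them one by one,
// then send a separate "end" message so the receiver knows the transfer is complete
function sendInChunks(channel, items, chunkSize = 10000) {
  for (let i = 0; i < items.length; i += chunkSize) {
    ipcRenderer.send(channel, items.slice(i, i + chunkSize))
  }
  ipcRenderer.send(channel + '-end')
}

// sendInChunks('records-chunk', bigArray)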

UPD.
After running into similar problems with plain arrays, I did some testing. As can be seen, the stack overflow is caused not by the total amount of data but by the number of elements.

// itemsCount - number of elements, itemSize - size of each element
function test(itemsCount, itemSize) {
  const str = '1'.repeat(itemSize)

  let arr2 = []

  let i = itemsCount
  while (i--) {
    arr2.push(str)
  }

  let arr1 = []
  Array.prototype.push.apply(arr1, arr2)
}

// test(1000000000, 100) // 100.000.000.000 - 1 billion * 100 - just closes without errors
// test(100000000, 100) // 10.000.000.000 - 100 million * 100 - freezes then just closes without errors
// test(10000000, 100) // 1.000.000.000 - 10 million * 100 - crashes the window on stack overflow
// test(1000000, 100) // 100.000.000 - 1 million * 100 - crashes the window on stack overflow
// test(100000, 100) // 10.000.000 - 100 thousand * 100 - ok
// test(10000, 1000) // 10.000.000 - 10 thousand * 1000 - ok
// test(100000, 1000) // 100.000.000 - 100 thousand * 1000 - ok !
// test(100000, 10000) // 1.000.000.000 - 100 thousand * 10.000 - ok !
// test(100000, 100000) // 10.000.000.000 - 100 thousand * 100.000 - ok !
// test(100000, 1000000) // 100.000.000.000 - 100 thousand * 1.000.000 - ok !
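My explanation (I may be wrong): Array.prototype.push.apply passes every element of arr2 as a separate call argument, and JavaScript engines cap how many arguments a single call may receive, which would match the ~124,500 threshold from my comments below. Pushing in smaller slices avoids the overflow; a sketch (function name and slice size are arbitrary):

// merge large arrays by pushing in slices, so that no single call
// receives more arguments than the engine allows
function safeMerge(target, source, sliceSize = 50000) {
  for (let i = 0; i < source.length; i += sliceSize) {
    Array.prototype.push.apply(target, source.slice(i, i + sliceSize))
  }
  return target
}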
June 26th 19 at 14:21
1 answer
June 26th 19 at 14:23
I don't know of any; there shouldn't be any restrictions.
Try running it without the debugger.
If nothing helps, set export NODE_ENV=production.
It may not fix the freeze completely, but it becomes 5-10 times shorter.
I noticed the problem when I started passing 13 thousand records, objects of roughly 128 bytes each. By trial and error I found that it would swallow 10 thousand records only after a long pause of several seconds. I then broke them into blocks of 1000 and everything started running faster than sending the whole 10 thousand at once. The receiving side can look roughly like the sketch below.
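A rough sketch of the main-process side when the renderer sends in blocks; the channel names are made up and just have to match whatever the sender uses:

const { ipcMain } = require('electron')

// accumulate incoming blocks and use the full array only after
// the sender signals the end of the transfer
let received = []

ipcMain.on('records-chunk', (event, chunk) => {
  Array.prototype.push.apply(received, chunk) // blocks are small, so apply is safe here
})

ipcMain.on('records-chunk-end', () => {
  console.log('received', received.length, 'records')
  received = []
})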

I suspect a bug in how the data is wrapped into a JSON string inside ipcRenderer.

Without the debugger it still can't get the job done, but at least it stays alive. It reports that Node is hanging (1000 ticks of silence or something like that), and that's it. - royce commented on June 26th 19 at 14:26
: as far as I remember, child processes are used there, maybe the same limitation applies. - sally_Carro commented on June 26th 19 at 14:29
: I had a similar problem when spawning a subprocess: I passed it a very large amount of data and it crashed with errors. Not exactly your case, more of an analogy. - sally_Carro commented on June 26th 19 at 14:32
: In the VSC debugger I enabled "All exceptions" before sending the data. No exception pops up. I have the feeling that something simply goes into an infinite loop. The debugger says much the same when it sees no ticks from Node. - royce commented on June 26th 19 at 14:35
: A similar problem surfaced with plain arrays. I could not merge two arrays if the length of the one being appended was more than about 124,500: it crashes with a stack overflow, and under the debugger it simply hangs. I think the roots of the problem are the same. - royce commented on June 26th 19 at 14:38
: I think it's not the length but the volume of data; does it crash with a stack overflow for you? - sally_Carro commented on June 26th 19 at 14:41
13 thousand records, objects of roughly 128 bytes each.

How did you estimate the memory an object takes? - Ralph_Cremin commented on June 26th 19 at 14:44
: when appending to the array it crashes with a stack overflow. Yes, most likely it's the amount of data rather than the length of the array. Which is really bad, because the amount of data is impossible to calculate in advance. - royce commented on June 26th 19 at 14:47
: I just estimated it. The records are basically text strings of roughly the same length. - royce commented on June 26th 19 at 14:50
: I ran some tests. See the UPD in the question. The stack overflow is caused not by the total amount of data but by the number of elements. - royce commented on June 26th 19 at 14:53
