I had to work with JSON files several gigabytes in size (6–7 GB, if memory serves). A file that big takes a long time to parse and, on top of that, doesn't fit into memory as an object model. Producing files like that is a bad idea in general, but the data provider delivers them this way, so they have to be dealt with.
So I built a simple utility that works like a SAX parser, but for JSON.
The utility reads the JSON as a text stream, reacts to the opening and closing brackets, and emits each item as a standalone JSON text on its own line (in accordance with RFC 7464).
A stream of small, uniform pieces like that is much easier to process.
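To illustrate the idea (this is a minimal sketch, not the actual utility), here is how such a splitter can work in Python: it tracks bracket depth and string literals character by character, and flushes each element of the top-level array as one JSON text per line. It assumes the document is a single top-level array whose elements are objects or arrays; the function and file names are made up for the example.

```python
import sys

def split_top_level_array(stream, out):
    """Stream a huge JSON document of the form [ {...}, {...}, ... ] and
    write each element as a standalone JSON text on its own line."""
    depth = 0          # current nesting depth of {} / []
    in_string = False  # inside a JSON string literal?
    escaped = False    # previous char was a backslash inside a string?
    buf = []           # accumulated text of the current top-level element

    while True:
        chunk = stream.read(65536)
        if not chunk:
            break
        for ch in chunk:
            if in_string:
                if depth >= 2:
                    buf.append(ch)
                if escaped:
                    escaped = False
                elif ch == "\\":
                    escaped = True
                elif ch == '"':
                    in_string = False
            elif ch == '"':
                in_string = True
                if depth >= 2:
                    buf.append(ch)
            elif ch in "{[":
                depth += 1
                if depth >= 2:          # depth 1 is the outer array itself
                    buf.append(ch)
            elif ch in "}]":
                if depth >= 2:
                    buf.append(ch)
                depth -= 1
                if depth == 1:          # an element of the outer array just closed
                    out.write("".join(buf).strip() + "\n")
                    buf = []
            elif depth >= 2:
                buf.append(ch)          # ordinary character inside an element
            # commas and whitespace between elements (depth 1) are skipped

if __name__ == "__main__":
    split_top_level_array(sys.stdin, sys.stdout)
```

Run as `python split_json.py < huge.json > items.jsonl`; every output line is then a complete JSON document that downstream tools can parse independently.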
Later, admittedly, I was told that the well-known and wonderful JQ can also operate in this streaming (SAX-like) mode. But by then the problem was already solved, so I didn't dig into the intricacies of JQ's command-line options for that mode.
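For reference, the mode in question is jq's `--stream` option. An often-quoted incantation such as `jq -cn --stream 'fromstream(1 | truncate_stream(inputs))' huge.json` (the file name is a placeholder, and this was not part of my workflow) emits each element of a huge top-level array as a compact JSON line without loading the whole document into memory.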