How to efficiently write a huge amount of data to a file?
Hi all. I am an aspiring Java developer and I need advice from specialists. The task: a huge number of files is being processed (searching, indexing, etc.). During processing, I need to write 10^6 - 20^6 rows (from the old files) to a file (that is the part I am solving). Please advise how to do this as fast as possible.
I would be very grateful for any help.
Do a test run and see which part of the system is the bottleneck:
if you are limited by the storage medium, compress the data;
if you are limited by processing the results, get a more powerful CPU;
if you are limited by gathering metadata about the files, speed up the file system.
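A minimal sketch of such a test run, just to get a rough throughput number for the write path (the class name, the row count, and the synthetic row content are all my own assumptions, not from the thread):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class WriteBenchmark {

    // Writes `lines` synthetic rows to `target` and returns elapsed milliseconds.
    static long timeWrite(Path target, int lines) throws IOException {
        long start = System.nanoTime();
        try (BufferedWriter out = Files.newBufferedWriter(target)) {
            for (int i = 0; i < lines; i++) {
                out.write("row " + i);
                out.newLine();
            }
        }
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("bench", ".txt");
        long ms = timeWrite(tmp, 1_000_000);
        System.out.println("Wrote 1,000,000 lines in " + ms + " ms");
        Files.delete(tmp);
    }
}
```

Comparing this number against your end-to-end run tells you whether the disk write is actually the bottleneck or whether the time goes into producing the rows.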
Zoey_Oberbrunner6 answered on June 3rd 19 at 21:00
Use a buffer. Writing byte by byte is wildly slow.
Kasey.Cruickshank answered on June 3rd 19 at 21:02
Also, you can use distributed processing: look at things like Hadoop or Spark, Hazelcast, Ignite.
And, as was written in the comments to the answer, apply LZ4 or Snappy compression.
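LZ4 and Snappy need third-party libraries (commonly lz4-java and snappy-java), but the wiring is the same as with the JDK's built-in GZIP stream, sketched below: a compressing `OutputStream` slots in between the writer and the file (class and file names are my own illustrative choices):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.GZIPOutputStream;

public class CompressedWriteExample {

    // Writes rows through a compressing stream; an LZ4 or Snappy
    // OutputStream from a third-party library would slot in the same way.
    public static void writeCompressed(Path target, int rows) throws IOException {
        try (BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                new GZIPOutputStream(Files.newOutputStream(target)),
                StandardCharsets.UTF_8))) {
            for (int i = 0; i < rows; i++) {
                out.write("row " + i);
                out.newLine();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        writeCompressed(Path.of("rows.txt.gz"), 100_000);
    }
}
```

Note that GZIP trades more CPU for a better ratio; LZ4 and Snappy are usually the better fit when the disk, not the CPU, is the bottleneck.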