Does it make sense to use CSV for a large amount of data?

For example, there are tens of thousands of rows, each with columns such as 'title', 'description', 'name', 'year', and so on.
Does it make sense to keep data like this in CSV for later use? Once processed, the data will not change; only some of the columns will be read, looked up by key.
Or is it better to use a database such as MySQL after all?
March 19th 20 at 09:13
2 answers
March 19th 20 at 09:15
Solution
It makes sense, especially if you gzip the file well.
It makes sense to keep it in Postgres if there is active selection, filtering, and so on over the fields.
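For the gzip route, here is a minimal sketch using only Python's standard library; the file name data.csv.gz, the sample row, and the column set are assumptions taken from the question's example, not anything this answer prescribes:

    import csv
    import gzip

    FIELDS = ["title", "description", "name", "year"]  # hypothetical schema from the question

    # Write a gzip-compressed CSV; compresslevel=9 mirrors `gzip -9`.
    with gzip.open("data.csv.gz", "wt", compresslevel=9, encoding="utf-8", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerow({"title": "Dune", "description": "Sci-fi novel",
                         "name": "Herbert", "year": "1965"})

    # Stream it back row by row, without ever writing a plain-text file to disk.
    with gzip.open("data.csv.gz", "rt", encoding="utf-8", newline="") as f:
        for row in csv.DictReader(f):
            print(row["title"], row["year"])

Since the data is written once and never modified, compression costs you only at write time; reads stream through gzip transparently.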

Tens of thousands of rows is not a big volume.
No selection, no filtering. The table is filled once (by the parser) and read once by key, and the information is then published in another place (see the read-by-key sketch after this thread). - shayne.Gorczany commented on March 19th 20 at 09:18
@kshnk, accumulate a batch for a week at a time and run gzip -9 on it. - Bart_Pfannersti commented on March 19th 20 at 09:21
And what about millions of rows? - Baby.Hilll31 commented on March 19th 20 at 09:24
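For the fill-once, read-by-key pattern this thread describes, the CSV can be loaded into a dict a single time; a minimal sketch, where data.csv and the 'name' key column are assumptions for illustration:

    import csv

    # Build an in-memory index once; every later lookup is then O(1).
    with open("data.csv", encoding="utf-8", newline="") as f:
        index = {row["name"]: row for row in csv.DictReader(f)}

    record = index["Herbert"]  # look up one row by its key
    print(record["title"], record["year"])

Tens of thousands of rows fit comfortably in memory, which is why the commenter calls it "not a big volume"; for millions of rows, the in-memory dict stops being practical and a DBMS becomes the safer choice.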
March 19th 20 at 09:17
CSV is good as a data transport mechanism: for example, for dumping a database somewhere, exporting a report, and so on. It is more compact than JSON for tabular data.
But for storing data, a DBMS is better. This is a good use case for SQLite.
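A minimal sketch of the SQLite route with the standard library's sqlite3 module; the file names and the books table schema are assumptions based on the question's example columns:

    import csv
    import sqlite3

    conn = sqlite3.connect("books.db")  # hypothetical database file
    conn.execute("""CREATE TABLE IF NOT EXISTS books
                    (title TEXT, description TEXT, name TEXT, year INTEGER)""")

    # Bulk-load the CSV; DictReader rows bind directly to the named placeholders.
    with open("data.csv", encoding="utf-8", newline="") as f:
        conn.executemany(
            "INSERT INTO books VALUES (:title, :description, :name, :year)",
            csv.DictReader(f),
        )
    conn.commit()

    # Read only the columns you need, selected by key, as the question describes.
    for title, year in conn.execute(
            "SELECT title, year FROM books WHERE name = ?", ("Herbert",)):
        print(title, year)
    conn.close()

SQLite needs no server, lives in a single file, and gives you indexes and SQL selection if the requirements ever grow beyond reading by key.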

Find more questions by tags: Data storage, CSV, Python