After many years of being a developer I’ve come to the conclusion that the single strongest indicator of a person’s competence is how they handle CSV when asked to produce or consume it.
Jason2357@lemmy.ca 2 days ago
God I hate csv with the fire of a thousand suns.
Contractors never seem to know how to write them correctly. Last year, one even provided “csv”s that were just Oracle error messages. lol. Another told me their system could neither quote string columns nor escape commas, nor use anything but commas as the separator, so rows had unpredictable numbers of fields whenever the actual data contained commas. Total nightmare. And so much of my data has special character issues because somewhere in the pipeline a text encoding was wrong and there is exactly one mangled character in 5 million lines for me to find.
Give me the data as close to the source as you can. If it is a database, then a database dump or access to a clone of your database is the best option by far. I don’t care how obscure your shit is, I’ll do the conversion myself.
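A dump really does sidestep the whole quoting mess, because the database serializes its own escaping. A minimal sketch with the stdlib sqlite3 module and a made-up table (purely illustrative):

```python
import sqlite3

# Hypothetical example: dump an in-memory SQLite database as SQL text.
# The dump preserves commas, quotes, and types exactly as the source defines them.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, note TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'contains, a comma')")
dump = "\n".join(conn.iterdump())
print(dump)  # BEGIN TRANSACTION; CREATE TABLE ...; INSERT ...; COMMIT;
```

No delimiter guessing needed on the receiving end: the consumer just replays the SQL.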
For intermediate data, something like parquet or language specific formats like Rdata or pickle files. Maaaaybe very carefully created csv files for archival purposes, but even then, I think parquet is safe for the long haul nowadays.
vithigar@lemmy.ca 2 days ago
Jason2357@lemmy.ca 23 hours ago
I usually treat them by using an extremely well-established library where someone else has spent the requisite years crying over every stupid edge case of csv reading. Rolling your own csv reader is a bit like rolling your own encryption: just don’t. Until someone hands you a file that rejects all sanity and you start fking with regex. Lol.
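For example, Python’s stdlib csv module already copes with the classic traps (doubled quotes, embedded newlines) that homegrown parsers miss; a quick illustration with toy data:

```python
import csv
import io

# A field with an embedded comma, a doubled quote, and a newline --
# all legal CSV per RFC 4180, and all handled by the stdlib reader.
raw = '"a ""quoted"" word","line one\nline two",plain\n'
rows = list(csv.reader(io.StringIO(raw)))
print(rows)  # [['a "quoted" word', 'line one\nline two', 'plain']]
```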
GTG3000@programming.dev 2 days ago
Reminds me of writing my own csv parser that implemented escapes properly. The one everyone else went with of course was written in regex, so it was faster… But broke if there were escaped newlines.
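A rough sketch of why the naive split/regex approach loses on escaped newlines (toy data, with the stdlib csv module for comparison):

```python
import csv
import io

raw = 'name,notes\nalice,"first line\nsecond line"\n'

# Naive approach: split on newlines, then on commas.
# The quoted newline is wrongly treated as a record break.
naive = [line.split(",") for line in raw.strip().split("\n")]
print(len(naive))  # 3 "rows"

# The csv module tracks quoting state across newlines.
proper = list(csv.reader(io.StringIO(raw)))
print(len(proper))  # 2 rows
```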
espurr@sopuli.xyz 2 days ago
What delimiter should I be using instead of commas?
rustydrd@sh.itjust.works 2 days ago
🤪 as a delimiter
🥦 for end of line
Jason2357@lemmy.ca 23 hours ago
The delimiter isn’t really the issue. It’s that there are lots and lots of weird edge cases that break reading csvs. If you use commas, at minimum, you need to escape commas in the data, or quote strings that might contain commas… But now you have to deal with the possibility of a quote character or your escape character appearing in the data.
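The standard way out of that regress is doubling the quote character, which is what Python’s csv writer does in its default (excel) dialect; a small illustration with a toy row:

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)  # default dialect: quotes are escaped by doubling
writer.writerow(['he said "hi"', 'a,b', 'plain'])
print(buf.getvalue())  # "he said ""hi""","a,b",plain
```

Only the fields containing a quote or the delimiter get quoted; the rest are left bare.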
Then you have the fact that csvs can be written with so many different character encodings, mangling special characters where they occur.
Aaand then you have all the issues that come with the lack of metadata - good formats will at least tell you the type of data in each column so you don’t have to guess.
Let’s see, it’s also really annoying to include any binary data in a csv, there’s no redundancy or parity checks to catch corrupted data, and they aren’t compressed, so you need to tack on compression if you want efficient storage, but that means you always have to read the whole csv file for any task.
Oh, that brings me to the joys of modern columnar formats where you can read selected columns super fast without reading the whole file.
Oh god, I really kept going there. Sorry. It’s been a year.
emergencyfood@sh.itjust.works 2 days ago
Use comma for delimiter, and escape any comma in the data by enclosing that entry in quotes.
Data: 225 | 2,500 | 450
CSV: 225,"2,500",450
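That is exactly what Python’s csv writer produces with its default minimal quoting; a quick check on the same toy data:

```python
import csv
import io

buf = io.StringIO()
# QUOTE_MINIMAL (the default) quotes only the field containing a comma.
csv.writer(buf).writerow(["225", "2,500", "450"])
print(buf.getvalue())  # 225,"2,500",450
```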
death_to_carrots@feddit.org 2 days ago
Semi-colons. Tabs. Something not in the actual strings. Or however the Python csv module formats it.
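With the Python csv module, swapping the delimiter is just a keyword argument; e.g. tab-separated output for a toy row whose data contains both commas and semicolons:

```python
import csv
import io

buf = io.StringIO()
# With a tab delimiter, commas and semicolons in the data need no quoting at all.
csv.writer(buf, delimiter="\t").writerow(["alpha", "1,5", "b;c"])
print(buf.getvalue())
```

Of course, this only moves the problem: now a literal tab in the data is the edge case.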
I_am_10_squirrels@beehaw.org 2 days ago
Alt-008
Jason2357@lemmy.ca 2 days ago
P.S. In the above quagmire, the only solution was to keep only the most important un-clean column per csv, and make it the last column in the file so you have a predictable number of columns. If you need more, then write separate csvs. Computers are stupid.
deegeese@sopuli.xyz 2 days ago
If you could choose the column order, you could choose a better format, or at least escape correctly.
Jason2357@lemmy.ca 2 days ago
It was some sort of weird database frontend the contractor used. It was very limited.
kernelle@lemmy.dbzer0.com 2 days ago
I can’t tell you how many scripts I’ve written to format poorly made CSV files
harmbugler@piefed.social 2 days ago
The essence of data science
kernelle@lemmy.dbzer0.com 2 days ago
Image