Pandas
Submitted 4 months ago by fossilesque@mander.xyz to science_memes@mander.xyz
https://mander.xyz/pictrs/image/1d2cf749-0714-446b-a819-90dd725d148a.jpeg
Comments
QuizzaciousOtter@lemm.ee 4 months ago
Is 600 MB a lot for pandas? Of course, CSV isn’t really optimal, but I would’ve sworn pandas happily works with gigabytes of data.
MoonHawk@lemmy.world 4 months ago
What do you mean, not optimal? This is quite literally the most popular format for any serious data handling and exchange. One byte per separator and newline is all you need. It is not compressed, so it allows you to stream as well. If you don’t need nested structure, it is massively better than the alternatives.
QuizzaciousOtter@lemm.ee 4 months ago
I think portability and easy parsing are the only advantages of CSV. It’s definitely good enough (maybe even the best) for small datasets, but if you have a lot of data you need a compressed binary format, something like Parquet.
elmicha@feddit.org 4 months ago
But which separator is it, and which line ending? ASCII, UTF-8, UTF-16 or something else? What about quoting separators and line endings? Yes, there is an RFC, but a million programs were made before the RFC and won’t change their ways now.
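For the which-separator ambiguity, the stdlib at least offers a guesser; a small sketch with a made-up semicolon-delimited sample:

```python
import csv
import io

# csv.Sniffer inspects a sample and guesses the dialect
# (delimiter, quoting) instead of assuming commas.
sample = "a;b;c\n1;2;3\n"
dialect = csv.Sniffer().sniff(sample)
rows = list(csv.reader(io.StringIO(sample), dialect))
```

It is still a heuristic, which rather proves the point: the format itself does not tell you.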
Also you can gzip CSV and still stream them.
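A stdlib sketch of that gzip-and-still-stream point (contents made up, written to an in-memory buffer instead of a real file):

```python
import csv
import gzip
import io

# Write a small gzipped CSV to an in-memory buffer.
buf = io.BytesIO()
with gzip.open(buf, "wt", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "value"])
    for i in range(1000):
        writer.writerow([i, i * 2])

# Read it back row by row: decompression happens incrementally,
# so the whole file never has to sit in memory at once.
buf.seek(0)
with gzip.open(buf, "rt", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)
    total = sum(int(row[1]) for row in reader)
```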
merari42@lemmy.world 4 months ago
Have you heard that there are great serialised file formats like .parquet from Apache Arrow that can easily be used in typical data science packages like duckdb or polars? Perhaps it even works with pandas (although I don’t know it that well; I avoid pandas as much as possible. As someone who comes from the R tidyverse, I try to use polars more when I work in Python, because it often feels more intuitive to me.)
HappyFrog@lemmy.blahaj.zone 4 months ago
candyman337@sh.itjust.works 4 months ago
If you have a CSV bigger than like 500 MB, you need more than 8 GB of RAM to open it
tequinhu@lemmy.world 4 months ago
It really depends on the machine that is running the code. Pandas will have the entire thing loaded in memory, and while 600 MB is not a concern for our modern laptops running a single analysis at a time, it can get really messy if the person is not thinking about hardware limitations
naught@sh.itjust.works 4 months ago
Pandas supports lazy loading and can read files in chunks. Hell, even regular ole Python doesn’t need to read the whole file at once with the csv module.
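A sketch of both routes on a made-up in-memory CSV (chunked pandas reads first, then the plain csv module):

```python
import csv
import io

import pandas as pd

# A made-up single-column CSV held in memory.
csv_text = "x\n" + "\n".join(str(i) for i in range(10_000))

# chunksize makes read_csv yield DataFrames of at most 1000 rows,
# so only one chunk is resident at a time.
total = 0
for chunk in pd.read_csv(io.StringIO(csv_text), chunksize=1_000):
    total += chunk["x"].sum()

# Plain Python streams the same data line by line with the csv module.
reader = csv.reader(io.StringIO(csv_text))
next(reader)  # skip header row
plain_total = sum(int(row[0]) for row in reader)
```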
marcos@lemmy.world 4 months ago
Is 600 MB a lot for pandas?
No, but it’s easy to make a program in Python that doesn’t like it.
QuizzaciousOtter@lemm.ee 4 months ago
Oh, I know, believe me. I have some painful first-hand experience with such code.
gigachad@sh.itjust.works 4 months ago
I guess it’s more of a critique of how bad CSV is for storing large data than pandas being inefficient
zaphod@sopuli.xyz 4 months ago
CSV is not optimal, but then someone shows up and gives you 60GB of JSON instead of 600MB of CSV.
ikilledlaurapalmer@lemmy.world 4 months ago
Fine! .csv.gz ftw!
mvirts@lemmy.world 4 months ago
It’s more likely you’ll eat up storage when you read a 600 MB Parquet and try to write it as CSV.
QuizzaciousOtter@lemm.ee 4 months ago
I mean, yeah, that’s the point of compression. I don’t quite get what you mean by that comment.
mvirts@lemmy.world 4 months ago
600MB? What is this, 2004?
anzo@programming.dev 4 months ago
pola.rs enters the chat…
troyunrau@lemmy.ca 4 months ago
Hell, depending on what you’re doing, reading it into a NumPy array instead of a pandas dataframe will yield huge performance gains too.
Been meaning to try polars… Haven’t had a good excuse yet :)
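A small sketch of that NumPy route on made-up data: for purely numeric files you can skip the DataFrame layer entirely.

```python
import io

import numpy as np

# loadtxt parses a numeric delimited file straight into an ndarray,
# with no per-column dtype inference or index machinery.
data = io.StringIO("1.0,2.0\n3.0,4.0\n5.0,6.0\n")
arr = np.loadtxt(data, delimiter=",")
```

This only pays off when every column is the same numeric type; mixed-type tables are what pandas is for.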
fadhl3y@lemmy.world 4 months ago
No, just buy some more RAM. 64 GB is the minimum for a professional data analyst. 128 GB is the sweet spot.
wallmenis@lemmy.one 4 months ago
Just read a few at a time…
Barx@hexbear.net 4 months ago
And there are like 8 software projects dedicated to making pandas wrappers that work with large datasets because this is somehow better than engineers and statisticians learning SQL or some kind of distributed calculations strategy.
psud@aussie.zone 4 months ago
Compared to other technical skills, SQL to a level needed by a data analyst has to be the easiest. It’s easier than learning Excel
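That analyst-level SQL really does fit in a few lines; a sketch using Python’s stdlib sqlite3 with made-up data:

```python
import sqlite3

# The level of SQL most analysts need day to day:
# create, insert, filter, group, aggregate.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 10.0), ("north", 5.0), ("south", 7.5)],
)
rows = con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
```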
blindbunny@lemmy.ml 4 months ago
But how else are the suits going to lick Microsoft’s boots?
Buddahriffic@lemmy.world 4 months ago
Did taking that picture damage that gun? It doesn’t look like the barrel is parallel to the rest of the frame (or whatever it’s called).
Or is it deliberately angled upwards to add some automatic bullet drop compensation to the sights?
randombullet@programming.dev 4 months ago
Barrels are angled upwards to unlock the chamber and allow the bullet to ride into the chamber easier.
FiniteBanjo@lemmy.today 4 months ago
CSVs are a cool concept. Not so much any standard, but rather a text doc where values are separated by commas. Sometimes banks use them and it’s hell to format them for Excel. Sometimes it’s just a list of readable words and values.
I had to build a Twitch bot a while back to add banned words from a CSV to a blacklist; wish they would just let you copy-paste like YT does.
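A hypothetical sketch of that blacklist loading (file contents and words made up; a real bot would read from disk and call the Twitch API):

```python
import csv
import io

# One banned word per row in a made-up CSV, loaded into a set
# for O(1) membership checks when filtering chat messages.
csv_text = "badword1\nbadword2\nbadword3\n"
banned = {row[0] for row in csv.reader(io.StringIO(csv_text)) if row}

def is_allowed(message: str) -> bool:
    # Reject any message containing a banned word.
    return not any(word in banned for word in message.lower().split())
```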
ColeSloth@discuss.tchncs.de 4 months ago
“Constipated”
propter_hog@hexbear.net 4 months ago
I do this daily haha
Kausta@lemm.ee 4 months ago
You haven’t seen anything until you need to put a 4.2 GB gzipped CSV into a pandas dataframe, which works without any issues, I should note.
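A small sketch of that (data made up): pandas reads gzipped CSV directly, inferring the compression from the filename or taking it explicitly.

```python
import gzip
import io

import pandas as pd

# A made-up CSV, gzipped in memory.
raw = "a,b\n1,2\n3,4\n"
gz = gzip.compress(raw.encode())

# With a real file named data.csv.gz, compression would be inferred;
# for a bare buffer we pass it explicitly.
df = pd.read_csv(io.BytesIO(gz), compression="gzip")
```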
thisfro@slrpnk.net 4 months ago
I raise you thousands of gzipped files (total > 20 GB) combined into one dataframe. Frankly, my work laptop did not like it all that much, but most basic operations still worked fine.
QuizzaciousOtter@lemm.ee 4 months ago
I really don’t think that’s a lot either. Nowadays we routinely process terabytes of data.
Kausta@lemm.ee 4 months ago
Yeah, it was just a simple example. Although using just pandas (without something like Dask) for loading terabytes of data at once into a single dataframe may not be the best idea, even with enough memory.
whotookkarl@lemmy.world 4 months ago
It’s good to see the occult is still alive and well