Pandas

356 likes

Submitted 9 months ago by fossilesque@mander.xyz to science_memes@mander.xyz

https://mander.xyz/pictrs/image/1d2cf749-0714-446b-a819-90dd725d148a.jpeg

Comments

  • Kausta@lemm.ee 9 months ago

    You haven't seen anything until you need to put a 4.2 GB gzipped CSV into a pandas dataframe, which works without any issues, I should note.

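    A minimal sketch of that kind of load, assuming a hypothetical data.csv.gz; pandas picks up gzip from the file extension, or it can be stated explicitly:

    ```python
    import pandas as pd

    # Hypothetical file; pandas infers gzip from the .gz suffix,
    # but the compression can also be passed explicitly.
    df = pd.read_csv("data.csv.gz", compression="gzip")
    print(df.shape)
    ```
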
    • thisfro@slrpnk.net 9 months ago

      I raise you thousands of gzipped files (total > 20 GB) combined into one dataframe. Frankly, my work laptop did not like that all that much, but most basic operations still worked fine.

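      Roughly what that looks like, assuming a hypothetical dumps/ directory of gzipped CSVs; every row still ends up in memory at once, which is what the laptop objected to:

      ```python
      from pathlib import Path
      import pandas as pd

      # Hypothetical directory; each file is read separately and the
      # pieces are concatenated into a single in-memory dataframe.
      frames = [pd.read_csv(p) for p in sorted(Path("dumps").glob("*.csv.gz"))]
      combined = pd.concat(frames, ignore_index=True)
      ```
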
    • QuizzaciousOtter@lemm.ee 9 months ago

      I really don’t think that’s a lot either. Nowadays we routinely process terabytes of data.

      • Kausta@lemm.ee 9 months ago

        Yeah, it was just a simple example. Although using plain pandas (without something like Dask) to load terabytes of data into a single dataframe at once may not be the best idea, even with enough memory.

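        For scale like that, a sketch of the Dask route mentioned above (the file pattern is an assumption; gzipped inputs cannot be split, hence blocksize=None):

        ```python
        import dask.dataframe as dd

        # One partition per file; nothing is read until a result is
        # actually needed.
        ddf = dd.read_csv("events-*.csv.gz", compression="gzip",
                          blocksize=None)
        n_rows = len(ddf)  # forces the actual read
        ```
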
      • whotookkarl@lemmy.world 9 months ago

        It’s good to see the occult is still alive and well.

  • QuizzaciousOtter@lemm.ee 9 months ago

    Is 600 MB a lot for pandas? Of course, CSV isn’t really optimal, but I would’ve sworn pandas happily works with gigabytes of data.

    • MoonHawk@lemmy.world 9 months ago

      What do you mean, not optimal? This is quite literally the most popular format for any serious data handling and exchange. One byte per separator and newline is all you need. It is not compressed, so it allows you to stream as well. If you don’t need structure, it is massively better than the alternatives.

      • QuizzaciousOtter@lemm.ee 9 months ago

        I think portability and easy parsing are the only advantages of CSV. It’s definitely good enough (maybe even the best) for small datasets, but if you have a lot of data you need a compressed binary format, something like Parquet.

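        A small sketch of the difference, with hypothetical file names (writing Parquet this way needs pyarrow or fastparquet installed):

        ```python
        import pandas as pd

        # Same data, two serializations: plain-text CSV versus
        # columnar, compressed Parquet.
        df = pd.read_csv("measurements.csv")
        df.to_parquet("measurements.parquet", compression="zstd")
        back = pd.read_parquet("measurements.parquet")
        ```
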
      • elmicha@feddit.org 9 months ago

        But which separator is it, and which line ending? ASCII, UTF-8, UTF-16 or something else? What about quoting separators and line endings? Yes, there is an RFC, but a million programs were made before the RFC and won’t change their ways now.

        Also, you can gzip a CSV and still stream it.

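        Both points in one sketch, assuming a hypothetical export.csv.gz: sniff the dialect from a sample, then stream the gzipped file row by row without ever holding it all in memory.

        ```python
        import csv
        import gzip

        with gzip.open("export.csv.gz", "rt",
                       encoding="utf-8", newline="") as fh:
            # Guess separator/quoting from the first few KB, then rewind.
            dialect = csv.Sniffer().sniff(fh.read(4096))
            fh.seek(0)
            for row in csv.reader(fh, dialect):
                pass  # process one row at a time
        ```
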
      • merari42@lemmy.world 9 months ago

        Have you heard that there are great serialised file formats like .parquet from Apache Arrow that can easily be used in typical data science packages like duckdb or polars? It may even work with pandas (although I don’t know it that well. I avoid pandas as much as possible as someone who comes from the R tidyverse, and I try to use polars more when I work in Python, because it often feels more intuitive to me.)

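        A rough polars sketch of that workflow (file name and columns are made up); scan_parquet builds a lazy query plan and only reads what the query needs:

        ```python
        import polars as pl

        result = (
            pl.scan_parquet("events.parquet")
              .filter(pl.col("value") > 0)
              .group_by("category")
              .agg(pl.col("value").sum())
              .collect()
        )
        ```
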
      • HappyFrog@lemmy.blahaj.zone 9 months ago

        Wait till you hear about WSV.

      • candyman337@sh.itjust.works 9 months ago

        If you have a CSV bigger than about 500 MB, you can need more than 8 GB of RAM to open it.

    • tequinhu@lemmy.world 9 months ago

      It really depends on the machine that is running the code. Pandas will always have the entire thing loaded in memory, and while 600 MB is not a concern for modern laptops running a single analysis at a time, it can get really messy if the person is not thinking about hardware limitations.

      • naught@sh.itjust.works 9 months ago

        Pandas supports lazy loading and can read files in chunks. Hell, even regular ole Python doesn’t need to read the whole file at once with the csv module.

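        The pandas side of that, as a sketch (file name, chunk size, and column are assumptions):

        ```python
        import pandas as pd

        # Each chunk holds ~100k rows; memory use stays roughly flat
        # no matter how large the file is.
        total = 0
        for chunk in pd.read_csv("big.csv", chunksize=100_000):
            total += chunk["amount"].sum()
        ```
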
    • marcos@lemmy.world 9 months ago

      “Is 600 MB a lot for pandas?”

      No, but it’s easy to make a program in Python that doesn’t like it.

      • QuizzaciousOtter@lemm.ee 9 months ago

        Oh, I know, believe me. I have some painful first-hand experience with such code.

    • gigachad@sh.itjust.works 9 months ago

      I guess it’s more of a critique of how bad CSV is for storing large data than of pandas being inefficient.

      • zaphod@sopuli.xyz 9 months ago

        CSV is not optimal, but then someone shows up and gives you 60 GB of JSON instead of 600 MB of CSV.

      • ikilledlaurapalmer@lemmy.world 9 months ago

        Fine! .csv.gz ftw!

    • mvirts@lemmy.world 9 months ago

      It’s more likely you’ll eat up storage when you read a 600 MB Parquet file and try to write it out as CSV.

      • QuizzaciousOtter@lemm.ee 9 months ago

        I mean, yeah, that’s the point of compression. I don’t quite get what you mean by that comment.

  • mvirts@lemmy.world 9 months ago

    600 MB? What is this, 2004?

  • anzo@programming.dev 9 months ago

    pola.rs enters the chat…

    • troyunrau@lemmy.ca 9 months ago

      Hell, depending on what you’re doing, reading it into a NumPy array instead of a pandas dataframe will yield huge performance gains too.

      Been meaning to try polars… Haven’t had a good excuse yet :)

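      For purely numeric data that can look something like this (file name and layout are assumptions):

      ```python
      import numpy as np

      # Skip the header row and parse straight into a float array;
      # no DataFrame machinery involved.
      data = np.loadtxt("readings.csv", delimiter=",", skiprows=1)
      column_means = data.mean(axis=0)
      ```
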
  • fadhl3y@lemmy.world 9 months ago

    No, just buy some more RAM. 64 GB is the minimum for a professional data analyst. 128 GB is the sweet spot.

  • wallmenis@lemmy.one 9 months ago

    Just read a few at a time…

  • Barx@hexbear.net 9 months ago

    And there are like 8 software projects dedicated to making pandas wrappers that work with large datasets, because this is somehow better than engineers and statisticians learning SQL or some kind of distributed-computation strategy.

    • psud@aussie.zone 9 months ago

      Compared to other technical skills, SQL to the level needed by a data analyst has to be the easiest. It’s easier than learning Excel.

      • blindbunny@lemmy.ml 9 months ago

        But how else are the suits going to lick Microsoft’s boots?

  • Buddahriffic@lemmy.world 9 months ago

    Did taking that picture damage that gun? It doesn’t look like the barrel is parallel to the rest of the frame (or whatever it’s called).

    Or is it deliberately angled upwards to add some automatic bullet-drop compensation to the sights?

    • randombullet@programming.dev 9 months ago

      Barrels are angled upwards to unlock the chamber and let the bullet ride into the chamber more easily.

  • FiniteBanjo@lemmy.today 9 months ago

    CSVs are a cool concept. Not so much a standard as a text doc where values are separated by commas. Sometimes banks use them, and it’s hell to format them for Excel. Sometimes it’s just a list of readable words and values.

    I had to build a Twitch bot a while back to add banned words from a CSV to a blacklist; wish they would just let you copy-paste like YT does.

  • ColeSloth@discuss.tchncs.de 9 months ago

    “Constipated”

  • propter_hog@hexbear.net 9 months ago

    I do this daily haha
