
Is it worth it??

613 likes

Submitted 5 months ago by fossilesque@mander.xyz to science_memes@mander.xyz

https://mander.xyz/pictrs/image/8ed07435-8225-477d-b8ac-9ab6fd1fca6e.png


Comments

  • slackassassin@sh.itjust.works 5 months ago

    Working with pretrained models implemented in FPGAs for particle identification and tracking. It’s much faster and exactly as accurate. ¯\_(ツ)_/¯

    • daniskarma@lemmy.dbzer0.com 5 months ago

      Run, the Butlerian Jihad is already going your way.

  • Clent@lemmy.world 5 months ago

    The actual model required for general-purpose use likely lies beyond the petabyte range of memory.

    These models are using gigabytes, and the trend indicates it’s exponential. A couple more gigabytes isn’t going to cut it. Layers cannot expand the predictive capabilities without increasing the error. I’m sure a proof of that will be along within the next few years.

    • Krauerking@lemy.lol 5 months ago

      “Come on man, I just need a couple more pets of your data and I will totally be able to predict you something useful!” Its capacitors flip polarity in anticipation.

      “I swear man! It’s only a couple of orders of magnitude more, man! And all your dreams will come true. I’m sure I’ll service you right!”

      Well, if it needs it, right?

  • fckreddit@lemmy.ml 5 months ago

    “There is no free lunch” is a saying in ML research.

    • SturgiesYrFase@lemmy.ml 5 months ago

      That’s just a saying.

  • Dirac@lemmy.today 5 months ago

    Source?

    • fossilesque@mander.xyz 5 months ago

      reddit.com/…/machine_learning_in_physics_research…

      • Dirac@lemmy.today 5 months ago

        Hahahahaha I meant for the statistics, but I appreciate ya!

  • propter_hog@hexbear.net 5 months ago

    GET YOUR SHIT TOGETHER, CORAL

  • azi@mander.xyz 5 months ago

    There’s plenty of stuff where ML algorithms are the state of the art. For example, the raw data from nanopore DNA sequencing machines is extremely noisy, and ML algorithms clean it up with much less error than the Markov chains used in previous years.

  • Collatz_problem@hexbear.net 5 months ago

    It usually isn’t even faster.

    • propter_hog@hexbear.net 5 months ago

      And if it is faster, it just converges to the wrong answer faster.

  • Alexstarfire@lemmy.world 5 months ago

    For the meme? The Walking Dead. For the content? No idea.

  • belated_frog_pants@beehaw.org 5 months ago

    AI sucks ass, stop using it.

    • UnrepententProcrastinator@lemmy.ca 5 months ago

      It doesn’t. It’s just overhyped.

  • bigbrowncommie69@hexbear.net 5 months ago

    Pretty much the only thing it’s even remotely good for is as a toy.

  • BugleFingers@lemmy.world 5 months ago

    A lot of new tech isn’t as efficient, or is only equally so, at the get-go. Learning how to properly implement and utilize it is part of the process.

    Right now we are just throwing raw computing power at it in ML form. As soon as it catches on and shows a little promise in an area, we can focus and refine. Sometimes you need to use the shotgun to see the rabbits, ya know?

    • rando895@lemmygrad.ml 5 months ago

      Physicists abhor a black box. So long as it is an option, most will choose not to use AI to any great extent, and will chastise those who do.

  • Nasan@sopuli.xyz 5 months ago

    Coral*

  • Reddfugee42@lemmy.world 5 months ago

    So what you’re saying, Dad, is it’s nascent and already faster? Gotcha.
