
I was right, it is

83 likes

Submitted 4 days ago by Lukstru@fedia.io to science_memes@mander.xyz

https://fedia.io/media/1e/ea/1eeab4d3b120f66f5919bef11694b0f7e26e5ca37287f8d372f86fc583916f9b.jpg

Comments

  • abbadon420@sh.itjust.works 4 days ago

    Took me 17 minutes to get the joke. Good video though.

    • UnRelatedBurner@sh.itjust.works 3 days ago

      Can I get my instant gratification, please?

      • abbadon420@sh.itjust.works 3 days ago

        Sure, imagine you’re training an AI model. You feed it data and test whether it comes up with a good answer. Of course it doesn’t do that right away, that’s why you’re training it. You have to correct it.

        If you correct the model by correcting the errors directly, you get overcompensation problems. If you correct it on the differences between the errors, you get a much better correction.

        The term for that is LOSS. You correct on LOSS instead of on pure ERROR (a rough sketch of the idea follows below).

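        A minimal sketch of that idea, assuming a one-parameter linear model and a mean-squared-error loss; the data, learning rate, and variable names below are purely illustrative, not anything from the linked video or post. It corrects the weight by stepping against the gradient of the loss rather than patching it with the raw error.

        ```python
        # Toy data: y = 3 * x, so the "right" weight is 3.0.
        xs = [1.0, 2.0, 3.0, 4.0]
        ys = [3.0, 6.0, 9.0, 12.0]

        w = 0.0              # model parameter, deliberately starts wrong
        learning_rate = 0.01

        for step in range(200):
            # Forward pass: predictions with the current weight.
            preds = [w * x for x in xs]

            # Loss: mean squared error over the batch, one number that
            # summarizes how wrong the model currently is.
            loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
            if step % 50 == 0:
                print(step, round(loss, 6))

            # Gradient of the loss with respect to w:
            # d(loss)/dw = mean(2 * (prediction - target) * x)
            grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)

            # Gradient-descent update: step against the gradient of the loss
            # instead of overwriting the weight with the error itself.
            w -= learning_rate * grad

        print(w)  # converges toward 3.0
        ```
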
  • AtariDump@lemmy.world 3 days ago

    Image

  • sniggleboots@europe.pub 4 days ago

    I watched that video mere hours ago!

  • Septimaeus@infosec.pub 4 days ago

    Gradient descent?
