Ryzen is produced by amd, why are we not allowed to append it to the ‘amd’ file?
Submitted 3 weeks ago by nave@lemmy.ca to [deleted]
https://lemmy.ca/pictrs/image/33be2735-d226-4294-8e6c-4a770fdce16d.jpeg
Comments
Longpork3@lemmy.nz 3 weeks ago
Lyra_Lycan@lemmy.blahaj.zone 2 weeks ago
I shouldn’t be surprised that most here don’t seem to realise that’s not a ‘greater than’ sign
ftbd@feddit.org 2 weeks ago
I read that as \gg, meaning “significantly greater than”
FilthyShrooms@lemmy.world 3 weeks ago
Nah that’s rage bait, AMD is way better than Ryzen
blackstampede@sh.itjust.works 2 weeks ago
Wrong. Ryzen >> amd = 0. The string “amd”, read as ASCII bytes (0x616D64), is 6384996 in decimal. Right-shifting Ryzen (0101001001111001011110100110010101101110) by that many bits turns it into 0.
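The constants in the joke do check out; a quick sketch verifying them (bash can’t actually shift by millions of bits, so this only confirms the numbers, not the final shift):

```shell
# "amd" as raw ASCII bytes is 61 6d 64; read as one big-endian
# integer, that's hex 0x616D64 = 6384996 in decimal.
printf 'amd' | od -An -tx1    # shows: 61 6d 64
echo $(( 0x616D64 ))          # prints: 6384996

# "Ryzen" is 5 bytes = 40 bits, so shifting it right by
# 6384996 bits would (mathematically) leave 0.
printf 'Ryzen' | od -An -tx1  # shows: 52 79 7a 65 6e
```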
one_knight_scripting@lemmy.world 2 weeks ago
This is not a general-purpose programming language, this is bash.
It does not right-shift bits; it appends output to a file.
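A minimal sketch of the distinction (the file name `amd` is just borrowed from the joke):

```shell
# In bash, >> between a command and a name is redirection:
# it appends the command's standard output to a file.
echo "Ryzen" >> amd      # creates/appends to a file named "amd"
cat amd                  # the file now contains the line "Ryzen"

# A bitwise right shift only happens inside arithmetic expansion:
echo $(( 8 >> 2 ))       # prints: 2
```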
railwhale@lemmy.nz 3 weeks ago
Just like how the RTX cards are better than Nvidia
WhiskyTangoFoxtrot@lemmy.world 2 weeks ago
And Mystique is better than Matrox.
anzo@programming.dev 2 weeks ago
And your mommy is sexier than my lover ;p
stevedice@sh.itjust.works 2 weeks ago
He’s appending the output of the command Ryzen to the AMD file. I don’t see the confusion.
Ascend910@lemmy.ml 2 weeks ago
Ah yes I love my Megabytes - Intel Ryzen 3060 XT3D
WereCat@lemmy.world 2 weeks ago
Laughs in RTX Voodoo 6000X3D which only needs two external power bricks
Siegfried@lemmy.world 2 weeks ago
Man of culture… and age
littlebigendian@lemmy.zip 2 weeks ago
Bit shifting Ryzen by a factor of amd. Makes perfect sense
MotoAsh@lemmy.world 3 weeks ago
Maybe they mean ATI and are misremembering Radeon? Granted, it’s been well over a decade, but I could see someone saying it anyway. For a long time, ATI cards were better than what was coming out of the merger (not literally, given generational gains, but by relative comparison).
carrylex@lemmy.world 2 weeks ago
frezik@lemmy.blahaj.zone 2 weeks ago
I think you can still find it mentioned on the product detail stickers or the PCB silkscreen or such. They still have the name in a few places, probably to maintain the trademark.
prole@lemmy.blahaj.zone 2 weeks ago
Probably more like they don’t realize Ryzen is AMD’s new line
Ryzen hasn’t been “new” in a long time
MotoAsh@lemmy.world 2 weeks ago
I’m mostly working off the fact that there’s clearly some kind of disconnect here. Of course, anyone remotely in the know knows Ryzen is an AMD line spanning multiple generations by now.
insomniac_lemon@lemmy.cafe 3 weeks ago
Yeah, though I would say there is no need for the extra step of ATI.
Specifically due to value: Ryzen was a better value when it started (pricing got more Intel-like as soon as Ryzen became successful). Well… a Ryzen APU might still be a better value at the ultra-low end than a new GPU, though you’re probably better off with a used Polaris GPU.
To me it just seems like GPUs are still stagnating due to cryptomining, though gaming and raytracing hype probably don’t help either.
MotoAsh@lemmy.world 2 weeks ago
It was crypto mining. Now it’s “AI”.
ZkhqrD5o@lemmy.world 2 weeks ago
If I’m going to build a new computer, I’d like to use an AMD GPU. Problem is, they treat their GPU software like the red-headed stepchild of the family.
accideath@feddit.org 2 weeks ago
Obligatory “switch to Linux” notice. The AMD drivers for Linux put the Nvidia drivers for Windows to shame.
ZkhqrD5o@lemmy.world 2 weeks ago
I’ve been running PopOS with an RTX 3080 for years now, I’m absolutely happy, and there’s zero chance of me switching back to that Microsoft trash.
I have a problem with how AMD handles their software called “ROCm”. It’s basically AMD’s version of CUDA, and it’s a complete mess. It’s ambiguous which cards are supported and which aren’t. They have gotten better about this over the past few years, but, for example, their latest consumer graphics card is only supported on Ubuntu. Other products are only supported on other distros. Some cards that aren’t even listed as supported are actually well supported on all distros. That’s what I mean: it’s a mess. Essentially, the only way to find out is to bite the bullet, plug it in, and see what happens.

I mean, Nvidia is a trash company that makes Apple and Microsoft look like saints. But at least, if you buy one of their products, you know it runs CUDA. No support matrix needed; everything, no matter how old it is, supports CUDA. The company and their graphics cards are still trash, though. Unless you buy the most expensive model, of course. Then all of your problems miraculously go away. cries in 10GB VRAM
empireOfLove2@lemmy.dbzer0.com 3 weeks ago
the type shi you see in WCCF comment sections
GrindingGears@lemmy.ca 3 weeks ago
Bruh…
xia@lemmy.sdf.org 2 weeks ago
When you get your C++ extraction operator backwards?
expatriado@lemmy.world 3 weeks ago
average online commentator