Comment on YouTube will let users booted for 'repeated violations' of COVID, elections policies 'rejoin'
snooggums@piefed.world 2 days ago
Yeah, it wasn't a perfect solution, so let's just give misinformation a microphone and a spot on the 6 o'clock news!
Plebcouncilman@sh.itjust.works 2 days ago
People will believe whatever they want to believe; you cannot suppress whatever you believe is misinformation. People in the streets say crazier shit every day. Are you also gonna stop them from saying it? Fuck that man, the audience is the one that needs to learn to be discerning.
Fuck outta here with this fuckhead logic.
pillowtags@lemmy.dbzer0.com 2 days ago
Explain how it made things worse. That’s nonsense.
Plebcouncilman@sh.itjust.works 1 day ago
It gave them the excuse to build their own platforms where their ideas could spread uncontested, and at the same time it made those ideas more alluring, because "forbidden" knowledge is so attractive to humans that perhaps the most famous myth in history is about how our species fell from a perfect existence because of it.
You cannot make anything forbidden and expect that it won't spread as long as there is a demand for it. This applies to ideas, drugs, guns, and pretty much everything else. If people want it, they will get it. Alcohol is the perfect example: we tried to make it illegal, and all it did was increase crime and violence, while people kept drinking as much as, if not more than, before. Fast forward to today, and people drink less than ever. Give people the tools to tell right from wrong and correct from incorrect, instead of trying to bubble wrap their world and then acting surprised when they feel betrayed because someone told them there is another point of view (false as it may be). Let them see both points of view and let the very absurdity of the opposing one discredit itself.
If we cannot trust that people can make the correct decisions why then would we insist on democracy?
ordnance_qf_17_pounder@reddthat.com 2 days ago
Pre-Musk Twitter versus current X is a good example of what happens when a platform completely drops its policies on misinformation. It is an order of magnitude worse now than it ever was before, and much more harmful to society.
If we’re worried about far-right idiots crying censorship, then we might as well fold to all their other demands, and the lunatics will truly be running the asylum.
Plebcouncilman@sh.itjust.works 1 day ago
If I’m to believe that I need to protect people from “bad” ideas and that they are not capable of discerning right from wrong, false from true, then I will also have to believe that democracy itself is wrong, because clearly we cannot allow these monkeys to make any decisions. Now, while my heart of hearts might believe this to be true, I do not have apodictic certainty in it; instead I truly believe that education can help people make better decisions and discern right from wrong. As such, I can never believe in labeling speech as allowed or not allowed; rather, I would like to invest my energies into fostering curiosity, truth seeking, and knowledge as perhaps the highest human virtues. So instead of burying speech, we should be educating kids.
yucandu@lemmy.world 2 days ago
We need to be the ones willing to teach them.
Plebcouncilman@sh.itjust.works 2 days ago
I don’t know that I believe in that sort of paternalistic attitude. What I do know is that Google et al. have no business dictating what is or isn’t misinformation. It’s a double-edged blade.
Dionysus@leminal.space 2 days ago
Breathing bleach, very bright light in veins, horse worm paste to fight covid.
Oh fuck right off.
This isn’t grey-area “Oh, Mike’s car sounds louder so it’s faster” information. There are objectively factual truths at play, and these flat-earth, worm-paste-eating fuckwits want to play Don’t Look Up in real life.