Don't Engage with Trolls
Submitted 4 weeks ago by FundMECFSResearch@lemmy.blahaj.zone to memes@sopuli.xyz
https://lemmy.blahaj.zone/pictrs/image/910bcf68-bfdd-4c0f-81e6-459c6ec98497.webp
Comments
wizardbeard@lemmy.dbzer0.com 4 weeks ago
Wasn’t this literally the shady research that Facebook got caught doing with Cambridge Analytica? Specifically, tweaking a user’s feed to be more negative resulted in that user posting more negative things themselves, and more engagement overall.
TheReturnOfPEB@reddthat.com 4 weeks ago
I wonder exactly how much of Hawaii Zuckerberg has to own before people start to question what they are getting from Facebook.
sp3tr4l@lemmy.zip 4 weeks ago
Yep!
Facebook figured out how to monetize trolling.
Over 10 years later, it’s destroyed society, but made them a lot of money.
nehal3m@sh.itjust.works 4 weeks ago
The old thread I posted this in was deleted, but I wrote this:
Okay so hear me out. I have this pet theory that might explain some of the divide between genders, but also political parties, causing paralysis which ultimately might lead to humanity’s extinction. Forgive me if I’m stating the obvious.
I’m going to set up two axioms to arrive at an extrapolated conclusion.
One: Human psychology tends to ascribe more weight to negative things than positive things in the short term. In the long term this generally balances out, but in the short term it’s more prudent in a biological sense to pay attention to the rustling in the bushes than the berries you might pick from them. This is known as the negativity bias.
Two: The modern gatekeepers of social interaction, Big Tech, employ blind algorithms that attempt to steer your attention towards spending more time on their platforms. These companies are the arbiters of the content we experience daily and what you do and don’t see is mostly at their discretion. The techniques they employ, in simple terms, are designed to provoke what they call ‘engagement’. They do this because at the end of the day FAANG have not only a financial interest, but a fiduciary duty to sell advertisements at the behest of their shareholders. The more they can engage you, the more ads they can sell. They employ live A-B testing, divide people into cohorts and poke and prod them with psychological techniques to try and glue your eyeballs to their ads.
Extrapolated conclusion: These companies have a financial and legally binding interest to divide the population against itself, obstructing politics and social interaction to the point where we might not be able to achieve any of the goals that we need to reach to prevent oblivion.
Thank you for coming to my TED Talk.
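The live A/B testing and cohort-splitting described in axiom two can be sketched in a few lines. This is a toy illustration under stated assumptions, not Facebook's actual system; the function names and experiment labels are hypothetical:

```python
import hashlib

def assign_cohort(user_id: str, experiment: str, n_cohorts: int = 2) -> int:
    """Deterministically bucket a user into a cohort for one experiment.

    Hashing (experiment + user_id) means the same user always lands in
    the same cohort for a given test, but cohorts differ across tests.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % n_cohorts

# Cohort 0 might see the control feed; cohort 1 a variant tuned for
# more "engagement". Metrics are then compared between the two groups.
cohort = assign_cohort("user42", "feed_ranking_v2")
```

Deterministic hashing rather than random assignment is the usual design choice here: it needs no stored state, and a user's experience stays consistent across sessions.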
sp3tr4l@lemmy.zip 4 weeks ago
I don’t even think this is controversial in any way, in fact I used to assume this was just common knowledge after Cambridge Analytica…
I deleted, as in permanently, totally deleted my FB presence when that came out… but everyone else I explained … basically what you’ve just explained … to, thought I was insane or overreacting and paranoid.
…
It’s simple.
Engagement, usage, time on platform: that is what’s being optimized for.
What drives these things most effectively?
Hatred, outrage, extremely offensive things.
…
… And they know that they can, through exposing people to such things, make said people more extreme and hateful and anxious and depressed.
Zuckerberg stated at one point that his goal with Facebook was to be able to profile (and manipulate, but he didn’t say that part) users so well that he’d be able to predict what they’d post next.
He really did/does just view all social interaction as a very complex problem that can be ‘solved’, like a physics question can be solved, to make a predictive model.
They know that their business model ruins social discourse, ruins people’s mental health and lives, and polarizes society.
It should not be surprising in any way that society is now extremely polarized and mentally unwell.
MouseKeyboard@ttrpg.network 4 weeks ago
For a long time Facebook counted an angry react as equal to five likes for measuring engagement. It’s very much intentional.
the_post_of_tom_joad@sh.itjust.works 4 weeks ago
The term “algorithm” in this context is simply a convenient term hiding the intentional right-wing radicalization of users to push them towards pro-business policies.
I’m quite tired of “algorithm” standing in for the intentions behind the owners who write and maintain it.
It was also an “algorithm” that inflated rent around the country, right?
An algorithm, yes. Written with the intention of inflating rent.
It’s not an accident.
iorale@lemmy.dbzer0.com 4 weeks ago
[deleted]
reksas@sopuli.xyz 3 weeks ago
report to who? the ones who profit from them?
ristoril_zip@lemmy.zip 4 weeks ago
What I don’t get about this: why, in this day and age, with all the analytics tools we have, do companies continue to just happily pay for simple eyeball exposure?
The only time they seem to have any pause at all on this model is if people post screenshots of ads for their products next to posts literally praising Nazis.
These so-called AIs (LLMs) can learn to tell the difference between positive/happy/uplifting posts, neutral posts, and angry/sad/disturbing posts. The advertisers should be asking for their products to be featured next to the first and second groups of posts.
People engage based on anger, sure. They click posts and reply and whatnot. But do they click the ad next to a post that pisses them off and then buy the product?
Or is this purely a subconscious intrusion effort? Do the advertisers just want their products in front of eyeballs regardless of what’s around the ad? It seems like the answer is “no” when they’re called out. But maybe it’s “yes” if they can get away with it?
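The brand-safety gate ristoril_zip describes could be sketched like this. In practice an LLM or trained sentiment classifier would do the labeling; here a crude keyword check stands in for it, and every name and cue word is hypothetical:

```python
# Toy brand-safety gate: a keyword check stands in for a real
# sentiment classifier or LLM, purely to show the control flow.
NEGATIVE_CUES = {"hate", "outrage", "nazi", "disgusting", "war"}

def classify_tone(post_text: str) -> str:
    """Crude stand-in for a sentiment model."""
    words = set(post_text.lower().split())
    return "negative" if words & NEGATIVE_CUES else "neutral_or_positive"

def ad_slot_allowed(post_text: str) -> bool:
    """Advertiser policy: only place ads next to non-negative posts."""
    return classify_tone(post_text) == "neutral_or_positive"

ad_slot_allowed("what a lovely sunset today")   # True
ad_slot_allowed("this outrage will not stand")  # False
```

Nothing technical prevents this kind of filtering; whether advertisers demand it is the open question the comment raises.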
DiogenesOfMiami@lemmy.ml 4 weeks ago
I disagree with OP’s editorialized title.
As an avid video gamer, I find myself constantly encountering subtle and overt bigotry in most online games I play. I will always call them out for it, no matter how much whooping it incites from kids just eating their popcorn and enjoying the fight.
Ignoring them is how you let the Andrew Tates of the world win, because they’re certainly not taking the high road by remaining silent about their beliefs.
TriflingToad@lemmy.world 4 weeks ago
pelespirit@sh.itjust.works 4 weeks ago
They did a study around the 2020 elections and found the following works with trolls:
Respond once with the facts (if you must), and then walk away. I’ve found Lemmy doesn’t need that most of the time; just downvoting seems to work. But if you’re on the place that shall not be named, this works.
FundMECFSResearch@lemmy.blahaj.zone 4 weeks ago
I wish lemmy had the feature where you can mute all replies to a comment.
pelespirit@sh.itjust.works 4 weeks ago
Yes, that and the tagging of users.
10_0@lemmy.world 4 weeks ago
Based and reposted
RememberTheApollo_@lemmy.world 4 weeks ago
I’ve been participating in Threads (yeah, I know, should be ashamed) and I’m unfortunately a sucker for some of the ragebait, especially political.
Guess what Threads pushes at me. A lot of the dumbest ragebait. Not people that actually want to have a conversation. My fault for being a sucker, but the algorithms work.
Doesn’t really matter, I’m shadowbanned. Pissed off too many republican propagandists by refuting them, so as usual, the “report” button is their remedy.
DaddleDew@lemmy.world 4 weeks ago
I think the word is ragebait
AVincentInSpace@pawb.social 4 weeks ago
…as opposed to platforms like Lemmy, where the only political ideologies you’ll find are “leftists” who, when asked what they even believe, respond with “what are you, a cop?”
the_post_of_tom_joad@sh.itjust.works 4 weeks ago
are you? 👮‍♂️ 👀
Lost_My_Mind@lemmy.world 4 weeks ago
If you’re a cop, you have to tell me, man! Like legally, you can’t arrest me without telling me you’re a cop!
WalkableProgrammer@lemmy.world 4 weeks ago
Every social media company’s content algorithm should be open source or at least a government agency should have code enforcement
whotookkarl@lemmy.world 4 weeks ago
Nobody likes Muhammad ibn Musa al-Khwarizmi anymore, I guess he had a good run since around 820 CE.
MonkderVierte@lemmy.ml 4 weeks ago
Is that cat from poland?
IMNOTCRAZYINSTITUTION@lemmy.world 4 weeks ago
no he’s from bangladesh
RGB@lemmy.today 4 weeks ago
Algorithms simply determine which posts will get the most interaction and feed them to people. Does it benefit corps? Of course! But it’s driven by people who choose to engage with this content.
Empricorn@feddit.nl 4 weeks ago
But… I love this cat and want my fingers to engage with the top of their fuzzy head!
umbrella@lemmy.ml 4 weeks ago
thats a cute cat i like it
tacosanonymous@lemm.ee 4 weeks ago
I feel like that says a lot about us.
Are we addicted to misery and then fighting bc of it?
moakley@lemmy.world 4 weeks ago
whose cat is this
traches@sh.itjust.works 4 weeks ago
They don’t have to, algorithms do whatever they are designed to do. Long division is an algorithm.
Profit motives are the issue here.
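The point above is literal: long division is an algorithm in the textbook sense, a fixed procedure that processes the dividend one digit at a time. A minimal sketch of the grade-school version:

```python
def long_division(dividend: int, divisor: int) -> tuple[str, int]:
    """Grade-school long division: bring down one digit at a time."""
    remainder = 0
    quotient_digits = []
    for digit in str(dividend):
        remainder = remainder * 10 + int(digit)  # bring down the next digit
        quotient_digits.append(str(remainder // divisor))
        remainder %= divisor
    quotient = "".join(quotient_digits).lstrip("0") or "0"
    return quotient, remainder

long_division(1234, 7)  # ("176", 2), since 7 * 176 + 2 == 1234
```

No profit motive, no ranking, no feed: just a deterministic procedure, which is all "algorithm" has ever meant.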
FundMECFSResearch@lemmy.blahaj.zone 4 weeks ago
Algorithms [based on engagement]
irmoz@lemmy.world 4 weeks ago
Yeah, the narrowing of the word “algorithm” to only mean “social media recommendation algorithms” is getting on my nerves.
traches@sh.itjust.works 4 weeks ago
It’s the only time normies encounter the word.
How do you feel about crypto?
snowsuit2654@lemmy.blahaj.zone 4 weeks ago
I’ve always thought it was so funny when people say tHe aLgOrItHM like it’s a bad word or something. I know they mean social media & marketing, but it’s funny to think that they’re very concerned about something like bubble sort.
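For the record, the bubble sort nobody is worried about, one of the first algorithms most programmers ever meet:

```python
def bubble_sort(items: list) -> list:
    """Classic bubble sort: repeatedly swap adjacent out-of-order pairs."""
    result = list(items)
    n = len(result)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if result[j] > result[j + 1]:
                result[j], result[j + 1] = result[j + 1], result[j]
                swapped = True
        if not swapped:  # no swaps means already sorted; stop early
            break
    return result

bubble_sort([3, 1, 2])  # [1, 2, 3]
```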