theluddite
@theluddite@lemmy.ml
I write about technology at theluddite.org
- Comment on After is a new dating app that tries to tackle ghosting 2 months ago:
Glad to hear it!
- Comment on After is a new dating app that tries to tackle ghosting 2 months ago:
Totally agreed. I didn’t mean to say that it’s a failure if it doesn’t properly encapsulate all complexity, but that the inability to do so has implications for design. In this specific case (as in many cases), the error they’re making is that they don’t realize that the root of the problem they’re trying to solve lies in that tension.
The platform and environment are something you can shape even without an established or physical community.
Again, couldn’t agree more! The problem here is that the platform is actually extremely powerful and can easily change behavior in undesirable ways for users. That’s, I think, a big part of where ghosting comes from in the first place, and thinking that you can just bolt a new thing onto the Tinder model repeats the original error.
- Comment on After is a new dating app that tries to tackle ghosting 2 months ago:
This app fundamentally misunderstands the problem. Your friend sets you up on a date. Are you going to treat that person horribly? Of course not. Why? First and foremost, because you’re not a dick. Your date is a human being who, like you, is worthy and deserving of basic respect and decency. Second, because your mutual friendship holds you accountable. Relationships in communities have an overlapping structure in which they mutually impact each other. Accountability is an emergent property of that structure, not something that can be implemented by an app. When you meet people via an app, you strip away both the humanity and the community, and with them goes the individual and community accountability.
I’ve written about this tension before: As we use computers more and more to mediate human relationships, we’ll increasingly find that being human and doing human things is actually too complicated to be legible to computers, which need everything spelled out in mathematically precise detail. Human relationships, like dating, are particularly complicated, so to make them legible to computers, you necessarily lose some of the humanity.
Companies that try to whack-a-mole patch the problems with that will find that their patches suffer from the same problem: Their accountability structure is a flat, shallow version of genuine human accountability, and will itself result in pathological behavior. The problem is recursive.
- Comment on Vance promised a new dawn for workers — one Trump didn’t bring to pass 5 months ago:
This is a garbage piece. Anyone with even a passing knowledge of history knows that you can’t just report on what fascist movements say, with a few fact checks thrown in. JD Vance doesn’t give a single shit about workers. The American fascist movement, like all such movements, is interested in appropriating the very real grievances of workers into a spectacle that serves power rather than challenging it. Walter Benjamin calls this the aestheticization of politics.
Fascism attempts to organize the newly proletarianized masses without affecting the property structure which the masses strive to eliminate. Fascism sees its salvation in giving these masses not their right, but instead a chance to express themselves. The masses have a right to change property relations; Fascism seeks to give them an expression while preserving property. The logical result of Fascism is the introduction of aesthetics into political life.
- Comment on Goldman Sachs: AI Is Overhyped, Wildly Expensive, and Unreliable 5 months ago:
Investment giant Goldman Sachs published a research paper
It’s not a research paper. It’s a report. This may seem like a nit-pick, but journalists need to (re-)learn to carefully distinguish between the thing that scientists do and corporate R&D, even though we sometimes use the word “research” for both. The AI hype in particular has been absolutely terrible for this. Companies have learned that putting out AI “research” that’s just them poking at their own product, but dressed up in a science-lookin’ paper, leads to an avalanche of free press from lazy, credulous morons gorging themselves on the hype. I’ve written about this problem a lot. For example, in this post, which is about how Google wrote a so-called paper about how their LLM compares to doctors, only for the press to uncritically repeat (and embellish) the results as a headline. Had anyone in the press actually fucking bothered to read the paper critically, they would’ve noticed that it’s actually junk science.
- Comment on Biden, Trump Die 2 Minutes Apart Holding Hands 6 months ago:
I have been predicting for well over a year now that they will both die before the election, but after the primaries, such that we can’t change the ballots, and when Americans go to vote, we will vote between two dead guys. Everyone always asks, “I wonder what happens then,” and while I’m sure that there’s a technical legal answer to that question, the real answer is that no one knows.
- Comment on How the coming flood of AI-generated content might actually free the soul of Internet StayGrounded.online 8 months ago:
Haha, I was actually paraphrasing myself from last year, but I’ve seen that article because lots of readers sent it to me when it came out a few months later, for obvious reasons!
- Comment on How the coming flood of AI-generated content might actually free the soul of Internet StayGrounded.online 8 months ago:
I completely and totally agree with the article that its current manifestation is in crisis, but I’m much less sanguine about the outcomes. The problem with the theory presented here, to me, is that it’s missing a theory of power. The attention economy isn’t an accident, but the result of the inherently political nature of society. Humans, being social animals, gain power by convincing other people of things. From David Graeber (who I’m always quoting lol):
Politics, after all, is the art of persuasion; the political is that dimension of social life in which things really do become true if enough people believe them. The problem is that in order to play the game effectively, one can never acknowledge this: it may be true that, if I could convince everyone in the world that I was the King of France, I would in fact become the King of France; but it would never work if I were to admit that this was the only basis of my claim.
In other words, just because algorithmic social media becomes uninteresting doesn’t mean the death of the attention economy as such, because the attention economy, in some form, is innate to humanity. Today it’s algorithmic feeds, but 500 years ago it was royal ownership of printing presses.
I think we already see the beginnings of the next round. As an example, the YouTuber Veritasium has been doing educational videos about science for over a decade, and he’s by and large good and reliable. Recently, he did a video about self-driving cars, sponsored by Waymo, which was full of (what I’ll charitably call) problematic claims that were clearly written by Waymo, as fellow YouTuber Tom Nicholas pointed out. Veritasium is a human who makes good videos. People follow him directly, bypassing algorithmic shenanigans, but Waymo was able to leverage its resources to get into that trusted, no-algorithm space. We live in a society that commodifies everything, and as human-made content becomes rarer, more people like Veritasium will be presented with more and increasingly lucrative opportunities to sell bits and pieces of their authenticity for manufactured content (be it by AI or a marketing team), while new people who could be like Veritasium will be drowned out by the heaps of bullshit clogging up the web.
This has an analogy in our physical world. As more and more of our physical world looks the same, as a result of the homogenizing forces of capital (office parks, suburbia, generic blocky buildings, etc.), the few remaining parts that are special, like, say, Venice, become too valuable for their own survival. They become “touristy,” which is itself a sort of ironically homogenized, commodified authenticity.
- Comment on [deleted] 9 months ago:
An alternative approach that may interest you: theluddite.org/#!post/reddit-extension
- Comment on ‘The tide has turned’: why parents are suing US social media firms after their children’s death 11 months ago:
Whenever one of these stories comes up, there’s always a lot of discussion about whether these suits are reasonable or fair, or whether it’s really legally the companies’ fault, and so on. If that’s your inclination, I propose that you consider it from the other side: Big companies use every tool in their arsenal to get what they want, regardless of whether it’s right or fair or good. If we want to take them on, we have to do the same. We call it a justice system, but in reality it’s just a fight over who gets to wield the state’s monopoly on violence to coerce other people into doing what they want, and any notions of justice or fairness are window dressing. That’s how power actually works. It doesn’t care about good faith vs. bad faith arguments, and we can’t limit ourselves to only using our institutions within their veneer of rule of law when taking on powerful, exclusively self-interested, and completely antisocial institutions with no such scruples.