Human-like object concept representations emerge naturally in multimodal large language models
Submitted 9 hours ago by cm0002@lemmy.world to technology@lemmy.zip
https://www.nature.com/articles/s42256-025-01049-z
Could these be the first signs of an emergent phenomenon in LLMs? If so, will companies and governments try to prevent it from happening, or will they let it unfold freely?
MotoAsh@lemmy.world 9 hours ago
Yeah, because they’re trained on human data, which uses human-like concepts.
FFS, humanity is cooked if we’re already this stupid.
Loduz_247@lemmy.world 5 hours ago
There is nothing we can do if an AI manages to be very charismatic and gets us into a catch-22.