I agree that humans are just flesh computers, but I don’t know whether we can say LLMs have overcome human creativity because I think the definition is open to interpretation.
Is intentionality, which is possible only with metacognition, a requirement for something to be art? If not, then we, AI, and spiders spinning webs are all doing the same “creativity,” regardless of our ability to reflect on ourselves and our actions.
If yes, then is the AI (or the spider) capable of metacognition? I know of no way to answer that, except that ChatGPT can be observed engaging in what appears to be metacognition. And that leaves me with a further question: what is the difference between pretending to think something and actually thinking it?
As for specifically “overcoming” creativity, I don’t think that kind of value judgement has any real meaning. How do you determine whether artist A or artist B is more creative? Is it making more errors in reproduction, leading to more original compositions?
oce@jlai.lu 1 week ago
As I suggested above, I would define it as creating a coherent link between ideas that was not itself learned. I suppose it could be possible to build an algorithm to estimate whether a given link was already present in the training corpus of an ML model.
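A very crude version of that algorithm could just check whether two concepts ever co-occur in the same sentence of the corpus. This is a hypothetical sketch (all names are made up, and a real check would need embeddings or paraphrase matching, since the same link can be phrased many ways), but it shows the shape of the idea:

```python
# Sketch: estimate whether a "link" between two concepts already appears
# in a training corpus, using sentence-level co-occurrence as a rough proxy.
import re

def cooccurs_in_corpus(corpus: list[str], concept_a: str, concept_b: str) -> bool:
    """Return True if any sentence in the corpus mentions both concepts."""
    for document in corpus:
        for sentence in re.split(r"[.!?]+", document):
            s = sentence.lower()
            if concept_a.lower() in s and concept_b.lower() in s:
                return True
    return False

# Toy corpus for illustration.
corpus = [
    "Rocks crumble into sand over time. Grain is ground into flour.",
    "Everything might be made of tiny particles.",
]

# "sand" and "flour" never share a sentence here, so this crude test
# would call that link "novel" relative to the corpus.
print(cooccurs_in_corpus(corpus, "sand", "flour"))  # False
print(cooccurs_in_corpus(corpus, "rocks", "sand"))  # True
```

Of course, this only catches links stated literally in one sentence; a model can learn a link spread across documents, which is part of why the question is hard.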
stray@pawb.social 1 week ago
I’m not sure how humans go about creating ideas, and therefore can’t be sure that the resulting ideas aren’t just combinations of learned things. There have been people in history who guessed that everything is made of tiny particles long before the idea could ever be tested, but they probably got it from observing various forms of matter, right? Like seeing how rocks crumble into sand and grain can be ground into flour. I don’t think they could have come up with the idea in a vacuum. I think anything we’re capable of creating must be based on things we’ve already learned, but I don’t know that I can prove that.