Yes there is.
The biggest barrier is data/context, and the biggest piece of that is the primary index.
Google has a huge fleet of web scrapers/indexers and also offers hosting platforms. It also partners with big hosting companies on index trees so it can surface websites reliably, AND it has had years of crawling to build that index up.
This is actually one of the primary ways AI is currently damaging the internet: not only is it decreasing traffic for web hosts because of AI summaries and AI-powered search, it’s also forcing hosts to block or restrict indexers. These same AI agents abuse the user-agent system to pretend they’re normal indexers, so hosts face a choice: either let their platform get hammered by so much bot traffic that it takes the site down, or block indexers and stop appearing in web searches. It’s a lose-lose.
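To see why the user-agent system is so easy to abuse, here’s a minimal sketch (function and token names are illustrative, not any real host’s code) of the naive allowlisting a web host might do. A scraper that copies a crawler’s UA string passes the exact same check as the real crawler:

```python
# Sketch: why user-agent strings alone can't distinguish a genuine
# search indexer from an AI scraper spoofing one. Names are illustrative.

KNOWN_INDEXERS = {"Googlebot", "Bingbot"}  # tokens a host might allowlist

def is_claimed_indexer(user_agent: str) -> bool:
    """Naive check: trusts whatever the client claims in its UA header."""
    return any(token in user_agent for token in KNOWN_INDEXERS)

# A genuine crawler and a spoofing scraper look identical to this check:
real = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
spoof = "Mozilla/5.0 (compatible; Googlebot/2.1)"  # AI scraper pretending

assert is_claimed_indexer(real)
assert is_claimed_indexer(spoof)  # spoof passes too
assert not is_claimed_indexer("curl/8.4.0")
```

Robust verification needs something like the reverse-DNS plus forward-DNS round trip Google documents for confirming real Googlebot traffic, which most hosts don’t implement per-request, so in practice they’re stuck with the over-block-or-get-spammed trade-off described above.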
sanguinepar@lemmy.world 2 weeks ago
Not to mention the damage it’s doing to content quality, as websites are increasingly written to optimise for AI placement, and increasingly populated with content written by AI in the first place. Information written by machines, for machines. It’s depressing as hell (and hard to see how it ultimately helps engage real people, who seem to be an afterthought at this point).