Comment on New "symbolic image compressor" posted in r/computerscience turns out to be AI hallucinated nonsense
altkey@lemmy.dbzer0.com 1 day ago
Reading OP and thinking about their misinformed understanding of what they are doing, I came upon an idea I propose to all of you: the almighty Babylonian Compression Algorythm.
As long as we have all combinations of (say, 256x256px) images in the database, we can cut down image size to just a reference to a file in said database.
It produces a bit-by-bit copy of any image without any loss, so it puts OOP’s project to shame. A little, almost non-existent problem is having access to said database, bloated with every existing, but also every not-yet-existing, image. But since OOP’s solution depends on proprietary ChatGPT running on someone else’s server, we are on par there.
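For the curious, here is a toy sketch of the scheme (mine, not from the thread), with images shrunk to 1-bit 2x2 so the almighty database actually fits in memory; all names are made up:

```python
# Babylonian Compression Algorithm, toy edition: 1-bit 2x2 images
# instead of 256x256, so the database of ALL possible images fits in RAM.
from itertools import product

SIDE = 2            # pixels per side (256 in the original proposal)
BITS = SIDE * SIDE  # bits per image

# The almighty database: every possible image, in lexicographic order.
DATABASE = list(product([0, 1], repeat=BITS))

def compress(image):
    """'Compress' an image to a mere reference into the database."""
    return DATABASE.index(tuple(image))

def decompress(ref):
    """Recover a bit-by-bit copy from the reference. Fully lossless!"""
    return list(DATABASE[ref])

img = [1, 0, 1, 1]
ref = compress(img)
assert decompress(ref) == img  # perfect reconstruction
# Small catch: addressing 2**BITS images takes a BITS-bit reference,
# i.e. the "compressed" file is exactly as large as the image itself.
```

The last comment is the whole joke: with 2**(256*256) possible images, the reference into the database needs 256x256 bits, so the scheme compresses nothing, much like OOP's.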