etrotta
@etrotta@beehaw.org
- Comment on [deleted] 7 months ago:
To be fair, I wouldn’t include “loading the whole model into VRAM” as part of the cost, given they can keep it resident between requests, and the model might be down to hundreds of billions, or even tens of billions, of parameters instead of trillions. But even after all those improvements, it should still be orders of magnitude more expensive than a normal search, which just makes their decision even crazier
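  As a rough illustration of that gap, here’s a minimal back-of-envelope sketch in Python. Every number in it (parameter count, answer length, GPU price, effective throughput, search cost) is an assumption picked for illustration, not a measurement:

  ```python
  # Back-of-envelope: marginal cost of one LLM answer vs. one keyword search.
  # All constants below are assumptions, not measured values.

  params = 175e9                 # assumed: "hundreds of billions" of parameters
  tokens_per_answer = 500        # assumed length of a generated answer
  flops_per_token = 2 * params   # ~2 FLOPs per parameter per generated token

  effective_flops = 1e14         # assumed effective throughput (FLOP/s); decode is
                                 # memory-bound, so well below GPU peak compute
  gpu_cost_per_hour = 2.0        # assumed cloud GPU price, USD/hour

  seconds = flops_per_token * tokens_per_answer / effective_flops
  llm_cost = seconds / 3600 * gpu_cost_per_hour

  search_cost = 1e-5             # assumed marginal cost of a classic keyword search, USD

  print(f"LLM answer: ~${llm_cost:.5f} per query")
  print(f"Search:     ~${search_cost:.5f} per query")
  print(f"Ratio:      ~{llm_cost / search_cost:.0f}x")
  ```

  Change any assumed constant and the ratio moves by the same factor; the point is only that per-query compute, not the one-time load of weights into VRAM, dominates the marginal cost.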
- Comment on What’s next for Mozilla? 11 months ago:
The vast majority of consumer devices, mobile as well as laptops and desktops, are not yet powerful enough to run local AI with a good user experience. And even if they were, a lot of users would still prefer it to run in the cloud rather than drain their phone’s battery