No, it’s not. A clumsy, poorly worded pitch isn’t “disrespectful”.
You’re just on Lemmy, aka the place with people even further up their own assholes, licking the inside of their colons, than Reddit.
The average IQ here may be higher than Reddit’s, but the number of stupid motherfuckers full of themselves is even higher.
Your pitch is fine, as long as you’re clear about your LLM usage, the scope of your project, and your goals, and have correct, well-communicated disclaimers. Then the only people who are gonna be pissy about it are the shitfucks who aren’t worth listening to.
Your code is your own, and that’s what matters. You put in the work and are sharing and/or selling the fruits of your own labor.
LLMs are literally word generators. The two things they’re actually good at are producing large blocks of easily understandable text and translation. You used the tool in a way that plays to its actual strengths.
It would be preferable if you at least used a local LLM instead of something like ChatGPT, just to help offset the environmental impact, even if only a little.
seedlord_com@lemmy.zip 1 day ago
Genuinely appreciate this, thank you.
You’re right that Lemmy is new territory for me. I’m still learning the culture, and I clearly stepped on some landmines along the way.
On LLMs I’m pretty firm in my position: useful only when you already know what you’re doing, only to move faster, and absolutely not a replacement for understanding your own work. And even then they get things wrong constantly, which is exactly why you need to be able to read, write, build, and debug without them first.
Local models I’m fully on board with in principle, and the environmental point is well taken. The problem I keep running into is that for actual coding tasks, the local options that are genuinely good enough still want a GPU setup that costs more than a full datacenter, especially now with the RAM shortage. If you have any recommendations on that front (models, setups, anything that punches above its weight), I’m all ears. Seriously.