Damn, that sucks. It should be open source. Let people fork and optimize it so it uses as little electricity as possible.
Comment on a bill proposed to outlaw downloading Chinese AI models.
thingsiplay@beehaw.org 1 week ago
None of the code and training data is available. It's just the usual Hugging Face thing, where some weights and parameters are available, nothing else. People repeat that DeepSeek (and many other) AI LLM models are open source, but they aren't.
They even have a GitHub source code repository at github.com/deepseek-ai/DeepSeek-R1, but it's only an image, a PDF file, and links to download the model on Hugging Face (plus optional weights and parameter files, to fine-tune it). There is no source code and no training data available. Here is an interesting article discussing this issue: Liesenfeld, Andreas, and Mark Dingemanse. "Rethinking open source generative AI: open washing and the EU AI Act." The 2024 ACM Conference on Fairness, Accountability, and Transparency. 2024.
Gamers_mate@beehaw.org 1 week ago
p03locke@lemmy.dbzer0.com 1 week ago
This literally took one click: github.com/deepseek-ai
jarfil@beehaw.org 1 week ago
Where’s the training data?
Crotaro@beehaw.org 1 week ago
Does open sourcing require you to give out the training data? I thought it only means allowing access to the source code so that you could build it yourself and feed it your own training data.
jarfil@beehaw.org 1 week ago
Open source requires giving whatever digital information is necessary to build a binary.
In this case, the "binary" is the network weights, and "whatever is necessary" includes both the training data and the training code.
DeepSeek is sharing:
- the model weights and parameter files (on Hugging Face)
- but not the training code or the training data
In other words: a good amount of open source… with a huge binary blob in the middle.
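To make the "build the binary" analogy concrete, here is a toy sketch in Python (hypothetical code, nothing to do with DeepSeek's actual pipeline): with the training code *and* the training data in hand, anyone can rebuild the weights exactly; without the data, the released weights are just an opaque blob you can't reproduce.

```python
# Toy illustration, NOT DeepSeek's code: given the training code and the
# training data, the "binary" (the weights) can be rebuilt bit-for-bit.
import numpy as np

def train(data, labels, seed=0, lr=0.1, steps=200):
    """Plain gradient descent for least-squares: the 'build' step."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=data.shape[1])  # reproducible init from a fixed seed
    for _ in range(steps):
        grad = data.T @ (data @ w - labels) / len(labels)
        w -= lr * grad
    return w  # the released "weights"

# Hypothetical tiny training set, standing in for the real (unreleased) data.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])

w1 = train(X, y)
w2 = train(X, y)  # same code + same data => identical weights
assert np.array_equal(w1, w2)
```

The point of the sketch: reproducibility holds only if the data ships along with the code, which is exactly the piece missing from the model releases discussed here.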
p03locke@lemmy.dbzer0.com 1 week ago
Nobody releases training data. It's too large and varied. The best I've seen was the LAION-2B set that Stable Diffusion used, and that's still just a big collection of links. Even that isn't going to fit in a GitHub repo.
Besides, improving the model means using the model as a base and implementing new training data. Specialize, specialize, specialize.
thingsiplay@beehaw.org 1 week ago
That's why it's not open source. They do not release the source, and it's impossible to build the model from source.
jarfil@beehaw.org 1 week ago
What about these? Dozens of TB here:
huggingface.co/HuggingFaceFW
There is also a LAION-5B now, and several other datasets.
thingsiplay@beehaw.org 1 week ago
Can you actually explain what in my reply is "fear, uncertainty, and doubt"? Did you actually read it? I even linked to the specific GitHub repository, which is basically empty. You just linked to an overview, which does not point to any source code.
Please explain what's FUD and link to the source code; otherwise, don't accuse people of FUD if you don't know what you are talking about.
p03locke@lemmy.dbzer0.com 1 week ago
You're purposely being obtuse, and not arguing in good faith. The source code is right there, in the other repos owned by the deepseek-ai user.
thingsiplay@beehaw.org 1 week ago
What are you talking about? What bad faith are you accusing me of? I asked you to show me the repository that contains the source code. There is none. Please give me a link to the repo you have in mind. Where are the source code and training data of DeepSeek-R1? Can we build the model from source?