Comment on Wikipedia is under attack — and how it can survive
thingsiplay@beehaw.org 4 days ago
Does The Verge have this big font, or is something broken on my end?
You can download the entirety of Wikipedia for offline usage, BTW. I do this with an application called Kiwix (kiwix.org/en/).
- Click “All Files” in the left menu of the program.
- In the bottom search bar (there is one at the top and one at the bottom), type “wikipedia” to show only the entries matching the search.
- Then click the “Size” header to sort the entries by size. Usually the biggest one is the most complete.
- Now “Download” it (I already have it, so it says “Open” for me).
Note that the big one at 111 GB contains all English-language Wikipedia articles, including images. The one at 43 GB should be the same, I think, but without images. There are many other variants too, varying in content, theme, and even build date. For example, the one with “1m Top” contains only the top 1 million articles.
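If you'd rather read the downloaded ZIM file from a script instead of the Kiwix app, the python-libzim bindings can open it directly. A minimal sketch, assuming you grabbed the full English archive (the filename is an assumption; it varies by variant and build date):

```python
# Minimal sketch: reading a Kiwix ZIM archive with the python-libzim
# bindings (pip install libzim). The filename is an assumption; use
# whichever variant you actually downloaded.
from libzim.reader import Archive

zim = Archive("wikipedia_en_all_maxi.zim")
print(f"Entries in archive: {zim.entry_count}")

# The main entry is the archive's landing page; follow it to an item
# and decode the HTML.
item = zim.main_entry.get_item()
print(f"Main entry is at {item.path}")
html = bytes(item.content).decode("utf-8")
print(html[:200])  # first 200 characters of the landing page
```

Kiwix also ships a kiwix-serve tool that serves the same file over HTTP on your local network, which is handy for phones and other devices.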
FaceDeer@fedia.io 4 days ago
The problem with this solution is that it leaves out the most important part of Wikipedia of all: the editors. Wikipedia is a living document, constantly being updated and improved. Sure, you can preserve a fossil version of it. But if the site itself goes down, that fossil will lose value rapidly, and it won't even be useful for creating a new live site, because it doesn't include the full history of articles (legally required under Wikipedia's license) and won't be the latest database dump from the moment Wikipedia shut down.
other_cat@lemmy.zip 4 days ago
Some solution is better than no solution. I don’t mind having a ‘fossil’ version for a pinch. We got along okay with hardcover encyclopedias pre-internet, and this is not that different, except that it still relies on electricity. (I have different, more valuable books on hand if we ever wind up THAT fucked.)
FaceDeer@fedia.io 4 days ago
My point is that the alternative isn't "no solution"; it's the much better database dump from the Internet Archive or the Wikimedia Foundation or wherever, the one a new Wikipedia instance would actually be spun up from, not the one you downloaded months ago and stashed in your closet.
The fact that random people on the Internet have old, incomplete, static copies of Wikipedia doesn't really help anything. The real work of bringing Wikipedia back would be building new hosting infrastructure capable of handling it, not scrounging up a database to put on it.
bufalo1973@piefed.social 3 days ago
Isn't there a way to sync the copy to the current version?
interdimensionalmeme@lemmy.ml 4 days ago
Wikipedia is not at risk of being shut down; the danger is malevolent editors bringing the culture war inside it and destroying “truth”. While it would be great to keep Wikipedia as it is, “they” are coming for it; Wikipedia doesn’t get to be excluded from the war. For now, the best we can hope for is that it survives, but the best we can do is save local Wikipedia copies in case the worst happens. Which isn’t shutdown, but corruption.
laranis@lemmy.zip 4 days ago
Thanks for sharing this. Started hosting a local copy of several wiki sources last weekend when this news broke.
Another commenter said downloading misses out on the best part of Wikipedia: the ongoing editing. Which, while true, is also going to be a weak point.
How many of those amazing editors are going to stick around when their full-time job becomes combating obvious right-wing bullshit, when they have to submit government ID to have an account on the site, and when common sense and fairness become a crime?
Wikipedia was a high point for humanity. Whatever comes next, I’d like to preserve a little piece of it.
HubertManne@piefed.social 3 days ago
Honestly, someone recently posted about the historicity of Jesus, and the article seemed way different from a few years ago, and I would say less accurate. Sorta wish I had downloaded it in like 2015.
balder1993@programming.dev 4 days ago
The best thing is that it works flawlessly in the mobile apps as well, and Wikipedia also has a version with roughly the 1 million most relevant articles, which is just a few gigabytes.
ordnance_qf_17_pounder@reddthat.com 4 days ago
The fact that you can download the entirety of the site in 111 GB sounds pretty damn impressive to me.
e0qdk@reddthat.com 4 days ago
It doesn’t actually include all the media or, I think, the edit history. It does give you a decent offline copy of the articles, with at least thumbnails of the images, though.
thingsiplay@beehaw.org 4 days ago
Nice stats. I always wondered. I get the feeling that ~678 TB is a little bit more than ~111 GB.
SteevyT@beehaw.org 4 days ago
Like, at least 7 GB bigger.
Powderhorn@beehaw.org 4 days ago
Dear god, are we still using base 2 for file sizes?
thingsiplay@beehaw.org 4 days ago
It doesn’t matter in this case, as long as it is documented (and it is, by the unit).
Summzashi@lemmy.one 1 day ago
Nobody does that, nerd
phoenixz@lemmy.ca 3 days ago
Yes, we all do
interdimensionalmeme@lemmy.ml 3 days ago
I don’t remember which is the stupid “1024 bytes in a kilobyte” one, but 745,450,666,761,889 bytes is 745 terabytes; that should be written 745 TB, and the 678 figure is what TiB is for.
And also, that entire 677.98 is a useless value; there’s nothing that is “677” about this.
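For anyone who wants to check the arithmetic, here's the conversion both ways in Python, using the byte count quoted above:

```python
# Convert the quoted byte count to decimal terabytes (TB, SI) and
# binary tebibytes (TiB, IEC) to show where both figures come from.
size_bytes = 745_450_666_761_889

tb = size_bytes / 10**12  # SI: 1 TB = 10^12 bytes
tib = size_bytes / 2**40  # IEC: 1 TiB = 2^40 bytes

print(f"{tb:.2f} TB")   # 745.45 TB
print(f"{tib:.2f} TiB") # 677.98 TiB
```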
CanadaPlus@lemmy.sdf.org 4 days ago
Text is light. Images are a bit heavier, but there aren’t too too many.