

The 102GB includes pictures then? That’s insane.
As of 2025, the English Wikipedia has 63,040,591 pages. The current text content across all of its pages is about 156 GB. Counting all the revisions in page histories, the size is 26,455 GB (about 26 TB).
I’m sure many people have already archived this, or at least keep snapshots. Not sure if that figure is uncompressed, but if so, compression would save a shitload of that.
I’m doing it with a jellyfin client to my friend’s jellyfin server.
That would be pretty nice. Our plates are expensive over here (US) so we just put a new tiny year sticker on each time and keep the plates for a long time.
I’ve only used that for export so far and have yet to try import on it, but I’m assuming it works well; it has good reviews as far as I remember.
https://ollama.ai/ is what I’ve been using for over a year now. New models come out regularly, and you just run “ollama pull <model ID>” and then it’s available locally. Then you can use Docker to run https://www.openwebui.com/ locally, which gives it a ChatGPT-style interface (but more configurable, and you can run prompts against any number of models you select at once).
All free and available to everyone.
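For anyone who wants to try it, here’s a rough sketch of the setup; the model name, port, and volume name are just examples, so check the Ollama and Open WebUI docs for current details:

```
# pull a model and chat with it locally via Ollama
ollama pull mistral
ollama run mistral "Explain what a VPN does in one paragraph."

# run Open WebUI in Docker, connected to the local Ollama instance
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# then browse to http://localhost:3000 and pick your models
```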
In my experience it depends on the math. Every model seems to have different strengths across a wide range of prompts and information.
+1 for Mistral; they were the first (or one of the first) of the Apache-licensed open source models. I run Mistral-7B and various fine-tunes of it locally, and they’ve always been really high quality overall. Mistral-Medium packed a punch (mid-size, obviously), but it definitely competes with the big ones at least.
GrapheneOS offers such an auto-reboot feature (18 hours by default, but the users can set it between 10 minutes and 72 hours), while the iPhone picked up something similar with iOS 18.1 (Inactivity Reboot) last year.
I was referring primarily to things that are already known to be good security practices and are widely used. Keeping data more secure at rest goes with the “don’t trust anything or anyone” goal, and if stock Android doesn’t do it because of said trust (or lack thereof), then GrapheneOS at least offers it.
Thanks for the voice of sanity. There are so many people freaked out by basic security measures that it boggles the mind.
Heat guns are what I use to loosen things up.
I didn’t even know that term existed; it’s the one that told me about it. I only copied and pasted it.
It only summarized the behaviors of the mods/admins, not the reddit userbase at large. There’s probably a Venn diagram overlap between the two, but they’re not exactly 100% the same. And things like shadowbanning (listed in the response) are not actions of the users, or political ideologies of the users either; they’re something only a mod can do.
I gave it a list of actions and behaviors by admins/mods on reddit and asked it what political ideologies those would be considered. That’s what ChatGPT spat out. I was curious how its inference would work given a list of behaviors. Hilarious to be downvoted for pasting ChatGPT’s inferences, though; I’m just the messenger.
Rule 1 ToS violation – permaban.
The sad part is I find a lot of helpful random material (information about cellphone providers, accounts, all sorts of random edge cases in different domains) and still want to look at the threads, and I find a lot of help that way. So I just switch to a VM on a VPN to check those. Sucks, but some information doesn’t exist anywhere else.
You know a site is good when its mascot makes you feel intense uncontrollable rage deep in your soul.
The sad part is I still have a lot of random subs I enjoy, fringe stuff like r/OnionLovers, r/slowcooking, r/frugal… and subs like r/frugal immediately delete any comment that brings up politics in any way. So there are a few decent subs left with non-batshit-insane mods, but they are becoming exceedingly rare.
I’m surprised I didn’t even notice that; I used to use it for when sites changed or went down. I guess there are always internet archive sites if I were so inclined, but it’s still sad.
I can’t hear over 13.5 kHz anymore… supposedly that happens with aging too. To be fair, 13 kHz is a really annoying noise.