Not the user you asked, but I’m using Silverbullet and have been loving it; it ticks every box of what I was looking for.
I’ve tried several, but I’ve had a major incident and lost all of the recipes I had because of a database corruption.
So I decided against keeping recipes in databases. I migrated to Notion, but I kept looking for a replacement since that’s not self-hosted. Eventually I ran across Silverbullet, and I’ve been using it for everything; so far it’s been great. Not exactly what you asked, but it can be used for this and works great.
Portage has supported binary packages forever. Back in 2012 I had some binary packages on my system; I remember clearly because certain things were a pain in the ass to compile, so for those I installed the binary version. It’s like Debian supporting source packages: it’s been there forever, but people don’t know about it.
I feel like facepalming myself to death for having asked such a stupid question before running an ls -a on the folder.
One last question. I’ve been reading up on Plugs because there’s one thing I use regularly that I don’t think exists, and I want to know if it would be possible for me to implement: PlantUML. Essentially it would be a plug that acts on a specific code block, like the LaTeX one, POSTs the code to a configurable URL, gets an image in return, and displays that instead.
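I haven’t written a SilverBullet plug myself (they’re TypeScript), but the flow you describe is pretty simple. Here’s a language-agnostic sketch in Python of the find-block → build-POST → swap-in-image pipeline; all names are hypothetical and nothing is actually sent over the network:

```python
import re

# Match fenced blocks tagged "plantuml" and capture their source.
PLANTUML_BLOCK = re.compile(r"```plantuml\n(.*?)\n```", re.DOTALL)

def find_plantuml_blocks(markdown: str) -> list[str]:
    """Pull the diagram source out of every plantuml fenced block."""
    return PLANTUML_BLOCK.findall(markdown)

def render_request(source: str, server_url: str) -> dict:
    """Describe the HTTP call the plug would make (not sent here).

    A real plug would POST `source` to the configured server and
    replace the block with the returned image."""
    return {"method": "POST", "url": server_url, "body": source}

page = "Some text\n```plantuml\n@startuml\nA -> B: hi\n@enduml\n```\nmore text"
blocks = find_plantuml_blocks(page)
req = render_request(blocks[0], "https://plantuml.example/png")
```

The actual plug would do the fetch and swap the rendered image into the page, but the extract/POST/replace structure stays the same.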
I said hundreds or thousands; I don’t expect to be creating hundreds of thousands of pages, but from your reply in the other thread SQLite should be more than capable of handling this scale.
Good to know that you have close to a thousand and it’s still fine. It will take me a long time to get to that number of pages, but if I can get started with this it seems like an awesome way of storing knowledge bases, so I expect it will grow quite rapidly as I migrate all of my different things into it.
You have a 5GB file:
RAID 0: Each of your 5 disks stores 1GB of that data in alternating chunks (e.g. the first disk has chunks 1, 6, 11; the second has 2, 7, 12; etc.), occupying a total of 5GB. When you want to access it, all disks read in parallel, so you get up to 5x the speed of a single disk. However, if any one of the disks dies you lose the entire file.
RAID 1: The file is stored in full on two disks, occupying 10GB, giving up to 2x read speed, and if either disk fails you still have all of your data.
RAID 5: The file is striped across 4 of the disks like in RAID 0, and for each stripe the 5th chunk is a parity block (the XOR of the other 4), with the parity rotating between disks. The 5GB file occupies 6.25GB in total, it reads at up to 4x the speed of a single disk, and if any single one of the 5 disks fails the missing chunks can be rebuilt from parity. However, if 2 disks fail you lose data.
That’s a rough idea and not entirely accurate, but it’s a good representation to understand how they work on a high level.
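The parity trick is easy to demo: XOR the data chunks together, and any single missing chunk can be rebuilt by XOR-ing everything that survived. A toy single-stripe sketch (hypothetical helper names; real arrays work per stripe and rotate which disk holds parity):

```python
from functools import reduce

def stripe_with_parity(data: bytes, n_disks: int = 5) -> list[bytes]:
    """Split data into (n_disks - 1) chunks plus one XOR parity chunk."""
    k = n_disks - 1
    size = -(-len(data) // k)  # ceiling division
    chunks = [data[i * size:(i + 1) * size].ljust(size, b"\x00") for i in range(k)]
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*chunks))
    return chunks + [parity]

def recover(disks: list[bytes], lost: int) -> bytes:
    """Rebuild one missing chunk by XOR-ing all surviving chunks."""
    survivors = [c for i, c in enumerate(disks) if i != lost]
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))

disks = stripe_with_parity(b"hello raid world", 5)
# Pretend disk 2 died: its chunk is recomputed from the other four.
assert recover(disks, 2) == disks[2]
```

Losing a second disk breaks this, which is exactly why RAID 5 only tolerates one failure.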
I use it, but it has issues: you need to remember exactly how you wrote the website name; if you ever change your master password you have to change the password on every site; and if you need to change the password for a single site, you have to remember that site’s counter.
It’s a cool idea, and worth using to generate passwords, but I would still advise having other methods, and once you have those other methods it becomes kind of pointless. Still a very cool idea, and very manageable for a low number of sites.
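For context on why those issues exist: stateless password managers derive each password purely from master password + site name + counter, so any change to an input changes the output. A toy sketch of the idea (hypothetical scheme; real tools use a slow KDF like PBKDF2 or scrypt, not a single hash):

```python
import base64
import hashlib

def site_password(master: str, site: str, counter: int = 1, length: int = 16) -> str:
    """Derive a site password deterministically from master + site + counter.

    Illustration only: a real stateless manager would use a slow,
    salted KDF rather than one SHA-256 round."""
    digest = hashlib.sha256(f"{master}:{site}:{counter}".encode()).digest()
    return base64.b64encode(digest).decode()[:length]

# Same inputs always reproduce the same password (the whole point)...
assert site_password("hunter2", "example.com") == site_password("hunter2", "example.com")
# ...but a new master password silently changes EVERY site's password,
assert site_password("hunter2", "example.com") != site_password("new-master", "example.com")
# and rotating one site means bumping (and remembering) its counter.
assert site_password("hunter2", "example.com", counter=2) != site_password("hunter2", "example.com", counter=1)
```

Even "Example.com" vs "example.com" derives a different password, which is the remember-how-you-wrote-it problem.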
Correlation does not imply causation. It could be that the same thing that is lowering your internet speed is affecting LAN, perhaps the router not being enough to handle the traffic, or something in the network occupying a lot of bandwidth which is only active when there is internet (e.g. a download client, or worse a download client accessing a NAS).
In any case you need to give more info about what your setup looks like, e.g.
I wouldn’t do Windows; Linux will give you the freedom to use Docker for most things you might want to host. As for which distro, use whatever you find nice; there’s not going to be much difference. Some of the things people are suggesting are great for extremely advanced use cases, but for just spinning up some services, whatever you feel most comfortable with is best.
Extra question for people who have been using it. It says the bandwidth is unlimited; how unlimited are we talking? I was considering getting one to use as a reverse proxy into my home lab so it’s accessible from the outside, which would mean lots of bandwidth usage, media-streaming amounts.