cross-posted from: https://lemmy.ml/post/5768010
You know BOINC, the thing where you can donate your processing power to specific computational projects?
Is there anything like that, but for hosting platforms / services?
Something where you could say “I am willing to dedicate this much of my CPU, RAM and storage space to this project or this group of people”.
Say that I have a server that is more or less collecting dust, and I want to make it do something productive.
I am aware of YUNOHost and alternatives, but that still requires me to choose which things to deploy and also somehow then offer that to the community.
As a certified lazy dude, I would much rather say “here’s the computer, use it for whatever you need the most”.
The issue I see with this is that my goodwill could be abused for hosting something inappropriate or even illegal, and then I would be held responsible. So there should be some transparency requirement or some other mechanism that helps prevent this.

And yes, self-hosting would not be the accurate term to describe this kind of distributed resource sharing. “crowd-sourced self-hosting”? “crowd-hosting” sounds like a good description for this phenomenon.
Some implementation of this probably already exists. Please provide any relevant names or links that would help me find more about this.
Sounds like IPFS or torrents.
I am aware of those two, but they are just for file storage. I was thinking of something more general, like a Virtual Private Server that people can extend with their donated resources. A VPS can be used for more than just file storage; it can do processing as well.
It will 100% be abused by assholes mining some millicents in Monero or running DDoS attacks.
Processing for a website needs to be as fast as possible. Nobody would like a loading screen saying “please wait - contacting p2p node 32292 for rendering your page” that lasts several seconds, with pages on one peer and the database on another, plus all the overhead of keeping the nodes in sync.
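Some rough arithmetic makes the point. The numbers below are made up for illustration, but they are in the right ballpark for a dynamic page that needs a series of sequential database lookups:

```python
# Hypothetical numbers: a dynamic page that needs 20 sequential
# database lookups before it can be rendered.
lookups = 20
lan_rtt_ms = 0.5    # database in the same rack as the web server
p2p_rtt_ms = 150.0  # database shards living on residential peers

lan_total = lookups * lan_rtt_ms  # 10 ms total: imperceptible
p2p_total = lookups * p2p_rtt_ms  # 3000 ms total: the dreaded loading screen
```

The per-lookup cost barely matters in a data center, but multiplied across peers on home connections it turns into seconds of waiting per page.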
file storage doesn’t require ultrafast processing. Make a static website with hugo, host pages on gh pages, and downloads on ipfs
The problem here, to my understanding (context: I work in IT, but I’m not claiming to have a PhD in comp sci or whatever), is that something like BOINC works because the computation is highly self-contained. Basically you’re just working through a list of math problems.
But something like, say, Lemmy or Mastodon isn’t really all that heavy on raw math. Instead it’s all about referencing items in databases, and dynamically assembling them into pages that are presented to a user. So it’s mostly about a) storing information, and b) accessing the stored information.
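To make that contrast concrete, here is a minimal sketch in plain Python (all names made up) of why BOINC-style work splits cleanly: each work unit is a pure, self-contained computation, so any volunteer can take one in any order with no shared state.

```python
# Stand-in for a BOINC-style work unit: a heavy, pure computation
# that depends only on its input, not on any shared database.
def crunch(work_unit: int) -> int:
    return sum(i * i for i in range(work_unit))

work_units = [10, 20, 30]

# Any volunteer can grab any unit independently; results can be
# collected and combined later in any order.
results = {unit: crunch(unit) for unit in work_units}

# By contrast, serving a Lemmy/Mastodon request is not a pure function
# of its input: it must read and write a shared, trusted database, so
# you cannot just hand it to a random untrusted node.
```

The dict comprehension could just as easily be a process pool or a fleet of volunteer machines; nothing about the computation cares where or when each unit runs, which is exactly the property a database-backed web app lacks.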
You can’t really offload that, because you have to be able to trust that wherever you’re putting the data, it’ll still be there when you need it. Not very easy when you might just turn your PC off at night… Or have a power outage. You also have to deal with the security and privacy issues involved in placing that data on random people’s computers.
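The availability problem described above can be worked around, but only at real cost, by replicating every item across several volunteer nodes and requiring a majority to acknowledge each write. A toy sketch (in-memory “nodes”, all names hypothetical, not any real system’s API):

```python
class Node:
    """Toy volunteer node that may be offline (PC off, power outage...)."""
    def __init__(self, online: bool = True):
        self.online = online
        self.store = {}

    def put(self, key, value) -> bool:
        if self.online:
            self.store[key] = value
            return True
        return False

    def get(self, key):
        return self.store.get(key) if self.online else None

def replicated_write(nodes, key, value, quorum: int) -> bool:
    """Write succeeds only if at least `quorum` replicas acknowledge it."""
    acks = sum(node.put(key, value) for node in nodes)
    return acks >= quorum

def replicated_read(nodes, key):
    """Return the value from any reachable replica, or None if all are down."""
    for node in nodes:
        value = node.get(key)
        if value is not None:
            return value
    return None

# Three replicas; one volunteer has switched their PC off overnight.
replicas = [Node(), Node(), Node(online=False)]
ok = replicated_write(replicas, "post:42", "hello fediverse", quorum=2)
# The data survives one offline node, but only because it is stored
# three times and every write pays extra network round-trips.
```

Real replicated stores layer versioning and repair on top of this, but the basic trade-off is the same: tolerating flaky volunteers multiplies both the storage and the traffic, which is exactly the overhead the comments here are worried about.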
Then you have the problem of connection speeds. Consumer internet connections typically have pathetic upload speeds, and the biggest issue with any kind of distributed database is that you need fast, reliable links between every component. That’s why distributed databases normally live inside a single data center rather than being spread across random home connections.
Once you actually present the data to the user’s PC, most of the “processing” happens on their end, so you’re already donating as much power as you reasonably can.
Like I said, I’m not a hardcore computer scientist, so there might be something I’m missing here, but to my understanding there’s really no way to usefully leverage any kind of “borrowed” processing power for a platform or service outside the very narrow field of “crunching big numbers.”
This is a genuinely fresh and intriguing idea, but you’ve sort of answered your own question (as have most of the commenters) by noting it would immediately be abused. So I think you are going to have to be the one deciding how your compute cycles and bandwidth are being used.
BOINC/World Community Grid is the obvious choice, since they are set up for exactly this use case. There’s also SheepIt, a distributed render farm for Blender. Maybe you could run a Tor node.
https://garagehq.deuxfleurs.fr/
It’s kind of this idea, but for S3-compatible object storage.