girls <3
You can run an LLM on a phone (I tried it myself once with llama.cpp), but even with the smallest model I could find it generated maybe one word every few seconds while pegging the CPU at 100%. The quality was terrible, and your battery wouldn’t last an hour.
A study from 1989 doesn’t apply to modern plants built 35 years later; it really doesn’t make sense to extrapolate it like this.
I would rather do that instead of indirectly killing a bunch of unwilling people, yeah.
Lemmy, but Twitter instead of Reddit.
I’m pretty sure most Lemmy instances run on a VPS, where usually the only thing you have to worry about is securing SSH, i.e. allowing key-based logins only and setting up fail2ban. After that it’s a matter of securing Lemmy the software itself, which is a whole other discussion.
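The SSH hardening described above usually comes down to a couple of config changes. A minimal sketch, assuming a Debian-style layout (file paths and defaults may differ on your distro):

```ini
# /etc/ssh/sshd_config — allow key-based logins only
PasswordAuthentication no
PubkeyAuthentication yes
PermitRootLogin prohibit-password

# /etc/fail2ban/jail.local — ban IPs after repeated failed SSH logins
[sshd]
enabled  = true
maxretry = 5
bantime  = 1h
```

After editing, restart `sshd` and `fail2ban` for the changes to take effect, and make sure your own key works before closing your existing session.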
Yeah you’re right, I just felt the need to point out that API calls are not really comparable to serving a full website.
The thing is, when you interact with the remote server directly it’s not 10 API calls; it’s 10 full-blown HTML pages that have to be served to you, and those are far bigger than REST API responses.
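To make the size difference concrete, here’s a toy sketch with a hypothetical Lemmy-style comment: the same data serialized as a JSON API payload versus wrapped in a (heavily simplified) HTML page. The field names and page markup are illustrative, not the real Lemmy API or frontend:

```python
import json

# Hypothetical comment data, roughly what a REST API might return.
comment = {"id": 1, "creator": "alice", "content": "hello", "score": 42}

# The API response is just the serialized data.
api_payload = json.dumps(comment)

# Serving the same comment as a page also ships markup plus references
# to stylesheets and scripts (a tiny stand-in for a real page here).
html_page = f"""<!DOCTYPE html>
<html><head><title>Comment {comment['id']}</title>
<link rel="stylesheet" href="/app.css"><script src="/app.js"></script></head>
<body><article><header>{comment['creator']} ({comment['score']} points)</header>
<p>{comment['content']}</p></article></body></html>"""

print(len(api_payload), len(html_page))
```

Even in this stripped-down example the HTML wrapper is several times the size of the JSON, and a real page additionally pulls in the CSS and JavaScript it references.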
Yeah, in my experience Lemmy has far fewer but more active users than Reddit.
A Very Polish Christmas by Sabadu.