You realize the GPU sits idle when it's not actively being used, right?
It'd be cheaper if you hosted it locally; essentially it's just your normal electricity bill, which is the entire point of what OP is saying lol.
Seems nifty. Bake in stuff like selecting your AI provider (support local Llama, a local OpenAI-compatible API, and third-party providers if you have to, I guess lol), and make sure it's dockerized (or at least relatively easy to dockerize; bonus points for including a compose file).
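The provider-selection idea above could be wired up in compose roughly like this. This is just a sketch; the `app` service, image names, and `LLM_BASE_URL` env var are placeholders for whatever the project actually uses, with Ollama standing in as one example of a local OpenAI-compatible backend.

```yaml
# Hypothetical compose sketch; the app image and env var names are
# placeholders. The point is that the backend is swappable via one URL.
services:
  app:
    image: example/app:latest            # placeholder for OP's app
    environment:
      # Point at any OpenAI-compatible endpoint: Ollama, llama.cpp's
      # server, or a hosted provider if you really must.
      - LLM_BASE_URL=http://ollama:11434/v1
    ports:
      - "8080:8080"
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
volumes:
  ollama:
```

Swapping providers then means changing one environment variable instead of rebuilding the image.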
OH, being able to hook into a self-hosted search engine like SearXNG would be nice too; you can do that with Oobabooga's web search plugin currently, as an example.
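Hooking into SearXNG could look something like the sketch below. It assumes the instance has `format=json` enabled in its settings (it's off by default on some setups), and the base URL is a placeholder for your own deployment; the actual fetch is left to the caller.

```python
# Sketch of querying a self-hosted SearXNG instance's JSON API.
# Assumes format=json is enabled; base URL is a placeholder.
from urllib.parse import urlencode

def build_search_url(base_url: str, query: str) -> str:
    """Build a SearXNG search URL requesting JSON results."""
    params = urlencode({"q": query, "format": "json"})
    return f"{base_url.rstrip('/')}/search?{params}"

def extract_snippets(payload: dict, limit: int = 3) -> list[str]:
    """Pull title/URL pairs out of a SearXNG JSON response."""
    return [f"{r['title']} - {r['url']}"
            for r in payload.get("results", [])[:limit]]

url = build_search_url("http://localhost:8888", "self hosted llm")
# Fetching is left to the caller, e.g. urllib.request.urlopen(url),
# since it needs a running instance.
sample = {"results": [{"title": "Ollama", "url": "https://ollama.com"}]}
print(extract_snippets(sample))
```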
wg-easy worked fine for me? Basically it just VPNs me right into my LAN.
OH, I'm an idiot, I forgot I connect to my domain for the WireGuard connection lmao
Though I did mean just tunnel into the LAN; the VPN is then applied to outbound connections on the LAN using something like Gluetun or whatever.
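The Gluetun setup described above is usually done with compose's `network_mode`. A rough sketch, with the provider settings and the second image as placeholders (Gluetun's own docs cover the env vars for each VPN provider):

```yaml
# Sketch: route a container's outbound traffic through Gluetun.
# Provider values are placeholders; fill in per Gluetun's docs.
services:
  gluetun:
    image: qmcgaw/gluetun
    cap_add:
      - NET_ADMIN
    environment:
      - VPN_SERVICE_PROVIDER=custom   # placeholder
      - VPN_TYPE=wireguard
  scraper:
    image: example/scraper:latest     # placeholder image
    network_mode: "service:gluetun"   # all outbound traffic exits via the VPN
```

Anything sharing Gluetun's network namespace this way has no route to the internet except through the tunnel, which is the "VPN applied on outbound connections" part.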
Why not just skip that and use a WireGuard tunnel?
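For reference, the client side of a tunnel like that is a short config. Keys, addresses, and the endpoint below are all placeholders; the `AllowedIPs` line is what makes it a split tunnel that only routes LAN traffic through the VPN instead of everything.

```ini
# Minimal client-side wg0.conf sketch; all values are placeholders.
[Interface]
PrivateKey = <client-private-key>
Address = 10.0.0.2/32

[Peer]
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820            # your domain or public IP
AllowedIPs = 192.168.1.0/24, 10.0.0.0/24    # only LAN subnets go through the tunnel
PersistentKeepalive = 25
```

Setting `AllowedIPs = 0.0.0.0/0` instead would route all traffic through the tunnel.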
Just go on F-Droid or something. I don't understand why Android app devs just give up along with the Play Store. I get that the reach is minimized, but let's be real, the reach for this kind of app is already limited.
We are still allowed to sideload apps on Android, and the more projects on F-Droid or other alternative repos the better.
What is the problem? I started my homelab a month ago by installing Proxmox lol
The GPU is already running because it's in the device; by this logic I shouldn't have a GPU in my homelab until I want to use it for something. RIP Jellyfin and Immich, I guess.
I get the impression you don't really understand how local LLMs work. You likely wouldn't need a very large model to run basic scraping; it would just depend on what OP has in mind, really, or what kind of schedule it runs on. You should consider the difference between a megacorp's server farm and some rando running this locally on consumer hardware (which seems to be the intent from OP).
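The "small local model" point above boils down to calling a local OpenAI-compatible endpoint, e.g. Ollama or llama.cpp's server, with whatever small model you run. A sketch, with the port and model name as placeholder assumptions; the request is built but not sent, since sending needs a running server.

```python
# Sketch: build a request for a local OpenAI-compatible endpoint
# (e.g. Ollama on port 11434). URL and model name are placeholders.
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completions request against a local endpoint."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("http://localhost:11434", "llama3.2:3b",
                         "Summarize this scraped page: ...")
# urllib.request.urlopen(req) would actually send it.
print(req.full_url)
```

A few-billion-parameter model is plenty for summarizing or extracting fields from scraped pages, and on a schedule the GPU sits idle between runs anyway.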