I ran a setup like this for a couple of years. Super handy being able to literally press the power button remotely, especially when the system hangs and becomes unresponsive.
If you use an RPi as the third device, you can use one of the GPIO pins to trigger a transistor connected in parallel with the server’s power button. The Pi can then (re)start the server on command.
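The “press” itself can look something like this rough sketch; the pin number and active level are just examples, and it assumes the older sysfs GPIO interface (newer setups may want the gpiod tools or a GPIO library instead):

```sh
# Hypothetical example: pulse GPIO 17 for half a second to simulate a button press.
# Assumes the transistor is wired so driving the pin high "presses" the button.
# Needs root (or gpio group) access to /sys/class/gpio.
echo 17 > /sys/class/gpio/export
echo out > /sys/class/gpio/gpio17/direction
echo 1 > /sys/class/gpio/gpio17/value   # press
sleep 0.5
echo 0 > /sys/class/gpio/gpio17/value   # release
echo 17 > /sys/class/gpio/unexport
```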
It’s clunky, but I can open files in Firefox by using a file browser app (I use X-plore), selecting ‘open with’, then selecting Firefox. Sometimes it’s not in the list, but there’s a selector for what type of file (text, video, audio, ‘*’). ‘*’ lists all the apps.
Sometimes stuff still refuses to open, but things like PDFs and HTML files usually work.
You have to explicitly enable directory indexing, but then nginx will automatically generate simple HTML pages listing directory contents.
https://nginx.org/en/docs/http/ngx_http_autoindex_module.html
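If it helps, a minimal sketch of enabling it (the location path and root here are just placeholders):

```nginx
# Hypothetical example; adjust the path and root to your setup.
location /files/ {
    root /srv;       # listing is generated for /srv/files/...
    autoindex on;    # turn on the automatic directory listing
}
```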
FolderSync selectively syncs files/folders from my phone back to my server via SSH. Some folders are on a schedule; some monitor for changes and sync immediately. Most are one-way, but some are two-way (files added on the server sync back to the phone, in addition to the phone uploading to the server). There’s even one that automatically drops files into paperless-ngx’s consume folder for automatic document importing.
From there, BorgBackup makes a daily backup of the data, keeping historical backups for years with absolutely incredible efficiency. I currently have 21 backups of about 550 GB each; Borg stores this in 447 GB of total disk space.
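The daily job can be as simple as a create followed by a prune. A rough sketch; the repo path, source path, compression, and retention below are placeholders, not my exact settings:

```sh
# Hypothetical daily Borg run (paths and retention are examples).
REPO=/mnt/backup/borg-repo

# New archive named by host and date; deduplication is what keeps 21 copies this small.
borg create --stats --compression lz4 "$REPO::{hostname}-{now:%Y-%m-%d}" /data

# Thin out old archives while still keeping years of history.
borg prune "$REPO" --keep-daily 7 --keep-weekly 4 --keep-monthly 24
```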
If you’ve got an Nvidia GPU and drivers installed, you’ve probably got ‘nvidia-smi’ already, which will show you utilization and which processes are using it.
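For example (the query fields are just a common subset; plain nvidia-smi with no arguments also lists per-process usage):

```sh
# One-shot overview, then a narrower utilization/memory query.
nvidia-smi
nvidia-smi --query-gpu=utilization.gpu,memory.used,memory.total --format=csv
```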
You could set up a user account like the share you’re describing. There’s a setting to prevent the user from changing their password.
Just pass out those credentials to anyone you want to collaborate with; they don’t need their own individual accounts.
I use https://filebrowser.org/ for this.
Nice lightweight file browsing/sharing with user management. Users can have their own dedicated directories, or collaborate.
You can also create share links that allow anyone with the link to view/download files. Optionally password protected.
Here’s a demo you can mess with: https://demo.filebrowser.org/ User: demo Pass: demo
Most of my web services are behind my VPN, but there are a couple I expose publicly for friends/family to use. Things like Emby, Ombi, and some generic file sharing with File Browser.
One of these has a long custom path set up in nginx which, instead of proxying to the named service, will ask for HTTP basic auth credentials. Use the correct host+path, then provide the correct user+pass, and you’ll be served an OpenVPN configuration file which includes an encrypted private key. Decrypt that and you’ve got backdoor VPN access.
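The nginx side is roughly this kind of sketch; the path, realm, and file locations are made up for illustration (the real path should be long and unguessable):

```nginx
# Hypothetical "backdoor" location: serves a file instead of proxying.
location /some-long-secret-path/ {
    auth_basic           "restricted";
    auth_basic_user_file /etc/nginx/backdoor.htpasswd;

    # The OpenVPN profile (with its encrypted private key) lives here.
    alias /srv/private/backdoor/;   # e.g. contains client.ovpn
}
```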
I keep vaultwarden behind a vpn so it’s not exposed directly to the net. You don’t need a constant connection to the server; that’s only needed to add/change vault items.
This does require some planning though; it’s easy to lock yourself out of your accounts when you’re away if you don’t incorporate a backdoor of some kind to let yourself back in in an emergency (losing your device while away from home, for example).
My normal VPN connection requires a private key and a password that’s stored in my vault to decrypt it. I’ve set up a method for retrieving a backup set of keys using a series of usernames, emails, passwords, and undocumented paths (these are the only passwords I actually memorize), allowing me to reach vaultwarden, where I can retrieve my vault with the data needed to log in to everything else properly.
Thank you! You gave me the hint I needed.
I didn’t know there was a quick settings button (the buttons in the notification tray) and have been struggling to find the accessibility options people have mentioned.
That button in the tray seems so much more reliable. Thanks again!
I tried. I couldn’t get it to work again, so I wanted to look at other options alongside looking for help/solutions.
But just as it decided to stop working despite my efforts, it’s suddenly started working again.
Sigh…
That’s an interesting option. It’s the Bitwarden app I’ve been having issues with, though I’m not sure how much of that is Bitwarden’s fault vs Android itself.
I’ll give that a look, thanks :)
Actually, it looks like Caddy is supposed to set those automatically (I’m used to Nginx, which doesn’t).
You’ll have to look at why the upstream isn’t accepting them then. I’m not familiar with that particular app.
X-Forwarded-For and X-Real-IP
The application you’re proxying also has to listen to these headers. Some don’t; some need to be told it’s OK to use them (if you enable them but don’t have a proxy in front, users can spoof their IP using them).
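In nginx that’s the usual pair of proxy_set_header lines; a generic sketch, with the upstream address just as an example:

```nginx
location / {
    proxy_pass http://127.0.0.1:8080;   # example upstream app
    proxy_set_header Host            $host;
    proxy_set_header X-Real-IP       $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```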
Yeah, “dd if=/dev/mmcblk0 of=$HOSTNAME.$(date +%Y.%m.%d).img”, and while it’s running. (!!! Make sure the output is NOT going to the SD card you are backing up…)
I deliberately chose a time when it’s not very active to perform the backup. Never had an issue, going on 6 years now.
I’ve always used dd + sshfs to back up the entire SD card daily at midnight to an SSH server, retaining 2 weeks of backups.
Should the card die, I’ve just gotta write the last backup to a new card and pop it in. If that one’s not good, I’ve got 13 others I can try.
I’ve only had to use it once and that went smoothly. I’ve tested half a dozen backups though and no issues there either.
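The job itself can be a short script run from cron. A rough sketch, with the host, paths, and block size as placeholders:

```bash
#!/bin/bash
# Hypothetical nightly job: image the SD card to an SSH server over sshfs,
# keeping two weeks of images. Host, paths, and device are examples.
MNT=/mnt/backups
sshfs backupuser@backuphost:/srv/pi-backups "$MNT"

dd if=/dev/mmcblk0 of="$MNT/$HOSTNAME.$(date +%Y.%m.%d).img" bs=4M

# Drop images older than 14 days.
find "$MNT" -name "$HOSTNAME.*.img" -mtime +13 -delete

fusermount -u "$MNT"
```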
Major version changes for any software, from the OS right down to a simple notepad app, should be applied as sequentially as possible (11 > 12 > 13 > etc.). Skipping over versions is just asking for trouble, as it’s rarely tested thoroughly.
It might work, but why risk it?
An example: if 12 makes a big database change but you skip over that version, 13 may not recognize the databases left by 11, because 12 had the code to recognize and migrate the old database format, and that code was considered unnecessary and removed in 13.
Stuff like this is also why you can’t always revert to an older version while keeping the data/databases from the newer software.