this post was submitted on 03 Mar 2025
29 points (100.0% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


I'm having trouble staying on top of updates for my self-hosted applications and infrastructure. Not everything has auto-updates baked in, and some things you may not want to auto-update. How do y'all handle this? How do you keep track of vulnerabilities? Are there, e.g., feeds for specific applications I can subscribe to via RSS or email?

top 32 comments
[–] lambalicious@lemmy.sdf.org 2 points 2 hours ago (1 children)

I don't.

Yeah, hot take, but basically there's no point in me keeping track of all that stuff, excessively worrying about the dangers of modernity, and sacrificing what spare time I have to watching an update counter go brrrr, when there are entire teams and agencies in charge of it.

I just run unattended-upgrades (on Debian), pin container image tags to only the major version number where available, rebuild containers twice a week, and go enjoy the data and media I built the containers and installed the software for.
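As a sketch of the container side of this approach (assuming docker compose; the service and image are just examples), pinning the tag to the major version means routine `docker compose pull && docker compose up -d` rebuilds pick up minor and patch releases without any manual tracking:

```yaml
services:
  db:
    # pinned to the major only: twice-weekly rebuilds pull 16.x point
    # releases automatically; "postgres:16.2" would freeze the version
    image: postgres:16
    restart: unless-stopped
```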

[–] 9488fcea02a9@sh.itjust.works 1 points 2 hours ago* (last edited 2 hours ago)

I think the problem is that a lot of people are just running flatpaks, Docker containers, and third-party repos which might not be getting timely updates.

I try to stick to debian packages for everything as much as possible for this reason.

[–] LiveLM@lemmy.zip 2 points 3 hours ago* (last edited 3 hours ago)
  • VPN only, nothing exposed
  • Host runs openSUSE MicroOS which updates itself daily
  • Watchtower updates the containers daily and if something blows up so be it, except for Nextcloud as everyone says it's brittle as hell.
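For reference, Watchtower supports opting individual containers out via a label, which is one way to get the "update everything daily except Nextcloud" behavior described above; a minimal compose sketch (image tags illustrative):

```yaml
services:
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    command: --interval 86400   # check for updates once a day

  nextcloud:
    image: nextcloud:29
    labels:
      # tell Watchtower to skip this container; update it by hand
      - com.centurylinklabs.watchtower.enable=false
```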
[–] slazer2au@lemmy.world 20 points 6 hours ago* (last edited 6 hours ago)

Does badly count as a way?

I kinda keep an eye on that https://selfh.st/ post that does a weekly roundup of stuff to know when I need to do patching.

No doubt there is a container I could run that would do it for me. I just can't remember the name of it.

[–] mhzawadi@lemmy.horwood.cloud 1 points 2 hours ago

I have stuff in newreleases.io and also GitHub release RSS feeds in Nextcloud. I then sit down once a week, see what needs an update, and reboot when required.
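For anyone wiring this up themselves: GitHub publishes a per-repository releases Atom feed at `https://github.com/OWNER/REPO/releases.atom`, which is what RSS readers like the ones above consume. A minimal sketch of pulling the newest release title out of such a feed, using a hard-coded sample payload instead of a network call (the feed contents here are made up for illustration):

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

def latest_release(feed_xml: str) -> str:
    """Return the title of the first (newest) entry in a releases Atom feed."""
    root = ET.fromstring(feed_xml)
    entry = root.find(f"{ATOM_NS}entry")
    return entry.find(f"{ATOM_NS}title").text

# Sample payload shaped like https://github.com/OWNER/REPO/releases.atom
sample = """<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Release notes from some-app</title>
  <entry><title>v2.4.1</title></entry>
  <entry><title>v2.4.0</title></entry>
</feed>"""

print(latest_release(sample))  # → v2.4.1
```

The same feed URL also works with rss2email or any feed reader; entries appear newest-first.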

[–] vegetaaaaaaa@lemmy.world 1 points 3 hours ago* (last edited 2 hours ago)

upgrades:

  • distribution packages: unattended-upgrades
  • third party software: subscribe to the releases RSS feed (in tt-rss or rss2email), read release notes, bump version number in my ansible playbook, run playbook, done.

vulnerabilities:

  • debsecan for distribution packages
  • trivy for third-party applications/libraries/OCI images
  • wazuh for larger (work) setups
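For illustration, trivy can scan a pulled OCI image or a local project directory directly (the image name here is just an example):

```
# scan an image for HIGH/CRITICAL vulnerabilities
trivy image --severity HIGH,CRITICAL nginx:1.27

# scan a local filesystem / project directory
trivy fs .
```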
[–] Darkassassin07@lemmy.ca 7 points 5 hours ago (1 children)

95% of things I just don't expose to the net; so I don't worry about them.

Most of what I do expose doesn't really have access to any sensitive info; at most an attacker could delete some replaceable media. Big whoop.

The only thing I expose that has the potential for massive damage is OpenVPN, and there's enough of a community and money invested in that protocol/project that I trust issues will be found and fixed promptly.

Overall I have very little available to attack, and a pretty low public presence. I don't really host any services for public use, so there's very little reason to even find my domain/ip, let alone attack it.

[–] kewko@sh.itjust.works 1 points 1 hour ago

You should try WireGuard if you haven't before; it's like a breath of fresh air.

[–] sk@hub.utsukta.org 9 points 6 hours ago

I subscribe to the release page of the repo in my RSS reader. Simple and effective.

[–] ShortN0te@lemmy.ml 1 points 3 hours ago

For most critical infrastructure, like my mail, I subscribe to the release and blog RSS feeds. My OSes send me update notifications via mail (apticron); those I handle manually. Everything else auto-updates daily.

You still need to check whether the software you use is still maintained and receives security updates. That's mostly handled by choosing popular, community-driven options, since those are less likely to get abandoned.

[–] enumerator4829@sh.itjust.works 4 points 5 hours ago

Unless you have actual tooling (i.e. RedHat errata + some service on top of that), just don’t even try.

Stop downloading random shit from dockerhub and github. Pick a distro that has whatever you need packaged, install from the repositories and turn on automatic updates. If you need stuff outside of repos, use first party packages and turn on auto updates. If there aren’t any decent packages, just don’t do it. There is a reason people pay RedHat a shitton of money, and that’s because they deal with much of this bullshit for you.

At home, I simply won’t install anything unless I can enable automatic updates. NixOS solves much of it. Twice a year I need to bump the distro version, bump the Nextcloud release, and deal with deprecations, and that’s it.

I also highly recommend turning on automatic periodic reboots, so you actually get new kernels running…
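On Debian-family systems, unattended-upgrades can do those periodic reboots itself; a sketch of the relevant snippet (these are real option names, the time is just an example) in `/etc/apt/apt.conf.d/50unattended-upgrades`:

```
// reboot automatically when an update (e.g. a new kernel) requires it
Unattended-Upgrade::Automatic-Reboot "true";
Unattended-Upgrade::Automatic-Reboot-Time "04:00";
```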

[–] 30p87@feddit.org 6 points 6 hours ago (1 children)

How do I do it? Everything's installed and updated via pacman/the AUR, including Python packages and Nextcloud apps. The only thing I don't install that way is Firefox addons.

[–] N0x0n@lemmy.ml 1 points 5 hours ago* (last edited 5 hours ago) (1 children)

The only thing I don't install that way is Firefox addons.

Any specific reason why? Yesterday I installed LibreWolf and at the same time saw a few addons in the AUR.

Do you know the difference between an AUR addon and one from the official Firefox addon repo?

I guess it would be for security reasons, because you never know if someone has tampered with the addon.

[–] 30p87@feddit.org 2 points 5 hours ago (1 children)

Simply because I haven't bothered searching for the extensions I have in the AUR. And some extensions aren't in there (namely 7tv, augmented steam, blacklist autoclose, defund wikipedia, kagi, peertube companion, tampermonkey and unload tabs).

[–] N0x0n@lemmy.ml 1 points 4 hours ago

XD okay! Maybe I put too much thought into it 😅

[–] catloaf@lemm.ee 3 points 5 hours ago

That's the neat part. I don't!

I have automatic updates on everything, but if I actually spent time managing updates and vulnerabilities I'd have no time to do anything else in my life.

[–] N0x0n@lemmy.ml 3 points 5 hours ago

For my Docker containers I use What's up Docker, which not only alerts me when there is an update but also gives a link to the changes, so I can have a look at what's happening!

For my system itself... just sudo pacman -Syu. That's not great, though, because some updates can potentially break my EndeavourOS system... I sometimes keep an eye on the forum when I see critical changes like the kernel itself or Nvidia updates.

[–] sugar_in_your_tea@sh.itjust.works 3 points 5 hours ago* (last edited 5 hours ago)

I just update every month or two, or whenever I remember. I use Docker/podman, and I set the version to whatever minor release I'm using, and manually bump after checking the release notes to look for manual upgrade steps.

It usually takes 5 min and that's with doing one at a time.

[–] just_another_person@lemmy.world 1 points 6 hours ago* (last edited 5 hours ago)

Trivy and Grype will give you a pretty decent idea of what you have for exposure, but you're at the mercy of each project to fix its own issues, though you can contribute updates if they're accepted.

Really the first line of defense is just securing your comms to the public internet. If you're running everything internally, you have a lot less to worry about. Nothing will ever be bulletproof though.

[–] eager_eagle@lemmy.world -1 points 6 hours ago (2 children)
[–] NaibofTabr@infosec.pub 0 points 5 hours ago (1 children)

This is also a great way to just break everything you've set up.

[–] eager_eagle@lemmy.world 1 points 5 hours ago* (last edited 5 hours ago)

That's a lot of FUD. topgrade just runs upgrades through all the package managers you have; it doesn't do the upgrades itself, bypassing the manager that installed the software or the package authors.

[–] just_another_person@lemmy.world -1 points 6 hours ago (1 children)

This is a bad idea for a number of reasons. Most obvious issue is that it doesn't guarantee anything in the way of actually fixing vulnerabilities, because some project you use may not even be scanning their own work.

[–] eager_eagle@lemmy.world 1 points 6 hours ago (1 children)

what's the alternative? Write a PR yourself?

[–] just_another_person@lemmy.world -1 points 5 hours ago (1 children)

Yup. Really easy in most cases if you're just upgrading a dependency version of something to the next minor release up, but then it has to pass all the project CI tests, and get an actual maintainer to tag it for release. That's how open source works though.

[–] eager_eagle@lemmy.world 1 points 5 hours ago (1 children)

That may work for a handful of projects; it'd be my full-time job if I did it for everything I run. Also, I might simply suggest that maintainers adopt dependabot or an alternative before I spend time on manual changes. These things should be automated.

[–] just_another_person@lemmy.world -1 points 5 hours ago (1 children)

Well a PR means an upstream fix for the project. If you want to scan all your local running things, by all means change whatever you want, but it will just be potentially wiped out by the tool you mentioned if running.

[–] eager_eagle@lemmy.world 0 points 5 hours ago (1 children)

dependabot is a tool for repos, not to apply local changes

[–] just_another_person@lemmy.world -1 points 5 hours ago (1 children)

I'm aware, but then you mentioned "manual changes", which connotes "local changes". Putting up a PR with changes isn't considered a manual anything.

[–] eager_eagle@lemmy.world 1 points 5 hours ago (1 children)

“manual changes”, which connotes “local changes”

It doesn't. Manual as in a PR with upgrades that you're suggesting yourself, as opposed to running dependabot.

Putting up a PR with changes isn’t considered a manual anything.

If I have to open a PR myself, that's very much a manual change.

[–] just_another_person@lemmy.world 0 points 5 hours ago (1 children)

I don't even know what you're talking about now, so I'm going to stop responding. If Dependabot was already enabled for a project, you probably wouldn't need to worry, so that negates this entire thread. 🙄

[–] eager_eagle@lemmy.world 1 points 5 hours ago

exactly my point, I'd suggest automating that before I bothered with PRs that upgrade versions, as it's a waste of time.