Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues on the community? Report it using the report flag.
Questions? DM the mods!
every time you copy-paste a terminal command, try to see if you can understand what it's doing with:
$ tldr mycommand (you need tealdeer installed)
and
$ mycommand --help
imo this is way more concise and beginner-friendly than reading man pages
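A quick sketch of what that looks like in practice, assuming you install tealdeer through cargo (tar is just a stand-in for whatever command you copied):
$ cargo install tealdeer   # one way to get the tldr client
$ tldr --update            # fetch the local page cache first
$ tldr tar                 # short, example-driven summary of the command
$ tar --help               # the command's own built-in help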
For all their flaws, low-level tech support, rubber-duck debugging, and explaining commands are things LLMs do really well. Kept my early mistakes off the web and got me where I needed to be most of the time.
I haven't had that experience. More often than not I've found properly made software breaks in ways that tell you why. I seem to get stuck going in a circle of doom with LLMs.
I must have been having more basic problems than you. I found LLMs present the most common solution, and generally the most common way of setting something up is the "right way", at least for a beginner. Then I'd quiz it on what docker compose environments do, what "ports: ####:####" means, how I could route one container through another. All very basic stuff. Challenge: ask GPT about any of that, then tell me it doesn't spit out something a hobbyist could understand, immediately start applying, and that is generally correct. Beginners: still verify what GPT spits out.
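For anyone new to this, those questions are about snippets roughly like the following; a minimal, hypothetical compose sketch (the service names, the VPN image, and every value are placeholders, not from any real setup) showing the host:container port mapping, environment variables, and one common way to route one container's traffic through another:
services:
  vpn:
    image: qmcgaw/gluetun             # example VPN container; swap in whatever you actually use
    cap_add:
      - NET_ADMIN
    environment:
      - VPN_SERVICE_PROVIDER=custom   # placeholder; the image's docs list real values
    ports:
      - "8080:80"                     # "host:container" - host port 8080 reaches container port 80
  app:
    image: nginx:alpine               # stand-in for the actual self-hosted service
    network_mode: "service:vpn"       # app shares vpn's network, so its traffic goes through it
Note the ports mapping sits on the vpn service here, because app shares its network namespace.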
By the time I wanted to do non-standard stuff I was better equipped with the fundamentals of hobbyist deployment and how to coax an LLM into doing what I needed. It won't write an Nginx config for you, or an ACL file, but with the documentation and an LLM you could teach yourself to write one (there's a rough sketch of one below).
Goes without saying I'd take the output of the LLM to Google for verification, then back to the LLM for a hobbyist's explanation, back to Google for verification... Also, all details are placeholders: don't give it your email, API keys, domains, nothing. Learn to scrub your input there and it'll become a habit here too, which is a bonus.
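For reference, a self-hoster's Nginx config often starts out about this small; a minimal, hypothetical reverse-proxy block (the hostname and upstream port are placeholders), not a hardened example:
server {
    listen 80;
    server_name example.internal;          # placeholder hostname

    location / {
        proxy_pass http://127.0.0.1:8080;  # forward to the service published on port 8080
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}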
Properly made software has great documentation and logs, if you know how to access those logs and read documentation (both skills in themselves)... Not to mention not all software is "properly made"; some of it is bare-bones and just works(tm). And work it does; that's absolutely not a criticism of FOSS projects. I love your stuff, keep making it, and I'll keep finding ways to teach myself to use it.
Fair enough. LLMs and even Google have nuanced drawbacks. I personally try to give the creator of the software some say in how it's used, simply because the intended usage is better tested than any changes I may need in the future.
At the end of the day, learning is key.