PumpkinDrama

joined 2 years ago
[–] PumpkinDrama@reddthat.com -1 points 3 months ago (9 children)

I'm not referring to the amount of content but to how it is curated. If it showed content sorted by votes from the local instance instead of an aggregate of all instances, the content would differ from instance to instance.

 

I've noticed that the "All" feed on Lemmy is pretty much the same across all instances, showing posts from every instance regardless of the specific focus or community vibe of the instance you're on. This seems like a missed opportunity to make the experience more tailored and engaging for each instance's unique audience.

For example, if there were an instance dedicated to literature lovers, wouldn't it make sense for the "All" feed on that instance to prioritize content that's more relevant to people who enjoy books, poetry, and writing? Instead of being a global feed that shows everything from memes to tech news, it could reflect the interests and values of the instance's community.

I feel like making the "All" feed more tailored to each instance would not only improve user experience but also strengthen the sense of community within each instance. What do you think? Would love to hear everyone's thoughts!

 

Available online as in: you just log in to a website and use it, not on Hugging Face or GitHub, where you need to download, install, and configure it yourself.

LLMs are already made so "safe" that they won't even describe an erotic or crime story, content you would easily find visually represented in all its detail on Netflix, Amazon, HBO, YouTube, etc. That is, writing "Game of Thrones" with an AI is not possible in most chatbots anymore.

 
 

I would love to be more active in posting links to articles and websites I find interesting to the fediverse, but I find that searching for the appropriate community can be a hassle. With so many different instances hosting the same communities, it can be difficult to know where to post. Is there a Firefox extension that would allow me to quickly and easily post links to a single Lemmy community (for example https://reddthat.com/c/random)? I'm envisioning something like a bookmarking tool that lets me post the website I'm viewing with a single click. If there isn't an existing extension that does this, I'd be interested in finding a similar program that I could use for inspiration to create one myself.
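I don't know of an existing extension, but the underlying API call is straightforward, so a small script could serve as a starting point. A rough sketch using Lemmy's v3 HTTP API with curl and jq; the instance, community, and credentials below are placeholders, and older Lemmy versions expect the JWT in the request body rather than a Bearer header:

```shell
#!/bin/sh
# Rough sketch of a one-click "post this page" helper for a single
# Lemmy community. INSTANCE, COMMUNITY, and credentials are placeholders.
INSTANCE="https://reddthat.com"
COMMUNITY="random"

post_link() {
  url="$1"; title="$2"; user="$3"; pass="$4"

  # Log in and grab a JWT token.
  jwt=$(curl -s "$INSTANCE/api/v3/user/login" \
    -H 'Content-Type: application/json' \
    -d "$(jq -n --arg u "$user" --arg p "$pass" \
          '{username_or_email: $u, password: $p}')" | jq -r .jwt)

  # Resolve the community name to its numeric id.
  cid=$(curl -s "$INSTANCE/api/v3/community?name=$COMMUNITY" \
    | jq -r .community_view.community.id)

  # Create the post (newer Lemmy versions take the JWT as a Bearer token).
  curl -s "$INSTANCE/api/v3/post" \
    -H "Authorization: Bearer $jwt" \
    -H 'Content-Type: application/json' \
    -d "$(jq -n --arg name "$title" --arg url "$url" --argjson cid "$cid" \
          '{name: $name, url: $url, community_id: $cid}')"
}

# Usage: post_link "https://example.org/article" "Article title" "me" "secret"
```

A browser extension would wrap the same two calls, with the login step replaced by a stored token.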

[–] PumpkinDrama@reddthat.com 3 points 7 months ago (2 children)

As long as I'm not looking at it, I'd feel more comfortable with it than being surrounded by mosquitoes. Would you rather be surrounded by mosquitoes than be in the same room as that thing?

[–] PumpkinDrama@reddthat.com 3 points 7 months ago

You are overcomplicating the issue by suggesting a "favorite" option when there is already a "subscribe" option. At the very least, consider proposing something distinct that helps users discover more of the small communities they are subscribed to, rather than suggesting something that has already been implemented.

[–] PumpkinDrama@reddthat.com 6 points 7 months ago* (last edited 7 months ago) (12 children)

Although there were some proposed solutions for this issue, when scaled sort was implemented, @nutomic@lemmy.ml closed all related issues, even when they weren't being solved by scaled sort. So, it's clear that since there are no longer any open issues about this, no one is going to care about solving it. Therefore, it seems like the only option is to accept this fact and learn to cope with it. At this point, I've come to terms with the fact that Lemmy is mainly a platform for shitposts, while Reddit is for everything else. When I look at the feed, I mostly see memes, US politics, and some tech.

Custom feeds may not be the most efficient solution due to scalability concerns. However, an alternative approach could be to make the metadata about the posts (votes, comments, etc) available through an API call. This would enable users to develop their own algorithms for content discovery and potentially create a more personalized experience. Users could then implement, share and install these algorithms using tools like Tampermonkey or other userscript managers.
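The counts are in fact already exposed by Lemmy's public listing endpoint, so a userscript could fetch them and re-rank entirely client-side. A minimal sketch of such a user-defined algorithm; the comments-per-vote formula is just an example, not anything Lemmy itself uses:

```shell
# Re-rank posts by comments-per-vote instead of raw score, client-side.
# /api/v3/post/list is Lemmy's public listing endpoint; the scoring
# formula here is an arbitrary example of a user-defined algorithm.
rank() {
  jq -r '.posts
    | map({title: .post.name,
           score: (.counts.comments / ([.counts.score, 1] | max))})
    | sort_by(-.score)
    | .[] | "\(.score)\t\(.title)"'
}

# Usage (any instance works, since the listing endpoint is public):
# curl -s 'https://reddthat.com/api/v3/post/list?sort=New&limit=50' | rank
```

A Tampermonkey script would do the same fetch and reorder the DOM instead of printing.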

[–] PumpkinDrama@reddthat.com 0 points 7 months ago

Over the past few days, I’ve witnessed a remarkable surge in the number of communities on browse.feddit.de. What started with 2k communities quickly grew to 4k, and now it has reached an astonishing 8k. While this exponential growth signifies a thriving platform, it also brings forth challenges such as increased fragmentation and the emergence of echo chambers. To tackle these issues, I propose the implementation of a Cross-Instance Automatic Multireddit feature within Lemmy. This feature aims to consolidate posts from communities with similar topics across all federated instances into a centralized location. By doing so, we can mitigate community fragmentation, counter the formation of echo chambers, and ultimately foster stronger community engagement. I welcome any insights or recommendations regarding the optimal implementation of this feature to ensure its effectiveness and success.

source

[–] PumpkinDrama@reddthat.com 0 points 7 months ago (1 children)

I think it’s because it’s just memes and also quite hard moderation and downvotes. It feels like a reddit clone that has the exact same mindset as reddit. I get annoyed when I see people being moderated for having an opinion that is not popular.

I saw a post being locked yesterday for asking about moderation. Doesn’t anyone else see the problem with that? Your channel’s rules are not more important than making people feel they can talk and express what’s on their mind.

I hate that so much. Stop treating people like they are just resources to moderate.

I don’t see many discussions. But I’m sure there are a few here and there.

source

[–] PumpkinDrama@reddthat.com 0 points 7 months ago

Yeah because first of all, content had to be spread out across 562826 different communities for no reason other than that reddit had lots of communities, after growing for many many years. It started with just a few.

Then 99% of those were created on Lemmy.world, and every new user was directed to sign up at Lemmy.world.

I guess a lot of people here are younger than me and didn’t experience forums, but we had like 30 forum channels. That was enough to talk about anything at all. And I believe it’s the same here, it would have been enough. And then all channels would have easy to find content.

source

[–] PumpkinDrama@reddthat.com 1 points 7 months ago* (last edited 7 months ago)

It certainly doesn't help that Lemmy had and still has absolutely no sensible way to actually surface niche communities to its subscribers. Unlike Reddit, it doesn't weigh posts by their relative popularity within the community but only by total popularity/popularity within the instance. There's also zero form of community grouping (like Reddit's multireddits) - all of which effectively eliminates all niche communities from any sensible main view mode and floods those with shitty memes and even shittier politics only. This pretty much suffocated the initially enthusiastic niche tech communities I had subscribed to. They stood no chance to thrive and their untimely death was inevitable.

There are some very tepid attempts to remedy this in upcoming Lemmy builds, but I fear it's too little too late.

I fear that Lemmy was simply nowhere near mature enough when it mattered and it has been slowly bleeding users and content ever since. I sincerely hope I'm wrong, though.

source

[–] PumpkinDrama@reddthat.com 0 points 7 months ago

Visibility-Based Ranking: Factor in how often a post is shown to users by tracking the number of times a post appears in users' feeds and calculating an "engagement rate" by dividing votes by views. Rank "Top of All Time" posts using this engagement rate. This option cannot be implemented as the software does not keep track of post views or the number of times a post appears in users' feeds.
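View counts aren't tracked today, but the ranking itself would be trivial once they were. A sketch over made-up (post, votes, views) rows, just to show the arithmetic:

```shell
# Hypothetical data: post name, votes, views.
# Engagement rate = votes / views; sort descending by that rate.
printf '%s\n' 'meme 900 100000' 'nichepost 45 500' \
  | awk '{ printf "%.4f\t%s\n", $2 / $3, $1 }' \
  | sort -rn
```

Here the niche post (45/500 = 0.09) outranks the meme (900/100000 = 0.009) despite having far fewer raw votes.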

[–] PumpkinDrama@reddthat.com 0 points 7 months ago

Community-Specific Normalized Scoring: Adjust post scores based on each community's monthly active user count at the time of posting. Unfortunately, this option cannot be implemented as the software does not keep track of the monthly active user count for each community over time.

[–] PumpkinDrama@reddthat.com 0 points 7 months ago

Normalized Scoring: Adjust post scores based on the instance's monthly active user count at the time of posting. However, this option cannot be implemented as the software does not keep track of the monthly active user count over time.

 

Hello, I'm looking for a new distro that aligns with my privacy preferences and offers a wide range of packages without requiring me to search for PPAs, similar to Manjaro. I've grown uneasy about Manjaro's decision to collect unique data like MAC addresses and disk serial numbers by default, even if it's for diagnostic purposes.

In light of this, I'd like to ask for your recommendations on a Linux distro that meets the following criteria:

  1. No opt-out telemetry: I'm looking for a distro that doesn't collect any unique data by default.
  2. Access to a wide range of packages: I prefer a distro that offers a vast repository of packages, so I don't have to search for PPAs or third-party repositories.
  3. User-friendly: I'm not a fan of complicated configurations or steep learning curves, so a distro with a user-friendly approach would be ideal.

I'm curious to hear any recommendations you might have. Thanks!

 

To automatically and recursively download subtitles for all videos in a directory on Arch Linux, you have several options. Here's a comprehensive approach using several of the available tools:

Using Subliminal

Subliminal is a powerful command-line tool that can recursively search for video files and download subtitles for them[1].

  1. Install Subliminal:
sudo pacman -S subliminal
  2. Use the following command to download subtitles recursively:
subliminal download -l en /path/to/your/video/directory

Replace "en" with your preferred language code and adjust the directory path as needed.

Using QNapi

QNapi is another excellent option for downloading subtitles[5].

  1. Install QNapi:
sudo pacman -S qnapi
  2. Use QNapi in console mode to download subtitles recursively:
find /path/to/your/video/directory -type f \( -name "*.mp4" -o -name "*.mkv" -o -name "*.avi" \) -exec qnapi -c {} +

This command finds all video files with .mp4, .mkv, or .avi extensions and passes them to QNapi for subtitle download.

Using yt-dlp

Note that yt-dlp fetches subtitles from the site hosting a video, so it operates on URLs of supported sites rather than on existing local files[2]. It is only an option if you still have the source URLs of your videos.

  1. Install yt-dlp:
sudo pacman -S yt-dlp
  2. Download just the subtitles for a given URL, skipping the video itself:
yt-dlp --write-sub --sub-lang en --skip-download "https://example.com/watch?v=..."

Replace "en" with your preferred language code.

Using OpenSubtitlesDownload

OpenSubtitlesDownload is a Python script that can be used to download subtitles[3][4].

  1. Install OpenSubtitlesDownload:
yay -S opensubtitlesdownload
  2. Use the following command to download subtitles recursively:
find /path/to/your/video/directory -type f \( -name "*.mp4" -o -name "*.mkv" -o -name "*.avi" \) -exec OpenSubtitlesDownload.py {} +

Additional Tips

  • For all these methods, you may need to adjust the file extensions in the find command to match your video file types.
  • Some of these tools may require you to create an account on the subtitle service they use (e.g., OpenSubtitles.org).
  • If you encounter rate limiting issues, you may need to add delays between downloads or use a tool that handles this automatically.
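Since the same find invocation is repeated for several tools, it can be wrapped in a small shell function so the extension list lives in one place. The function name and extension set below are my own choice:

```shell
# for_each_video DIR CMD [ARGS...]: run CMD on every video file under DIR.
# Extend the -name patterns to match your collection.
for_each_video() {
  dir="$1"; shift
  find "$dir" -type f \
    \( -name '*.mp4' -o -name '*.mkv' -o -name '*.avi' \) \
    -exec "$@" {} +
}

# Examples: preview the file list, then hand it to qnapi.
# for_each_video ~/Videos echo
# for_each_video ~/Videos qnapi -c
```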

Remember to respect copyright laws and the terms of service of the subtitle providers when downloading subtitles.

Citations: [1] https://www.tecmint.com/best-linux-movie-subtitles-player-software/ [2] https://wiki.archlinux.org/title/Yt-dlp [3] https://aur.archlinux.org/packages/opensubtitlesdownload [4] https://bbs.archlinux.org/viewtopic.php?id=162416 [5] https://man.archlinux.org/man/qnapi.1.en

 

To synchronize your home directory between two Manjaro systems using rsync, you can follow these steps:

Preparation

  1. Ensure both systems are connected to the same network.
  2. Install rsync on both systems if it's not already installed:
sudo pacman -S rsync
  3. Determine the IP address of the destination system:
ip addr show

Syncing the Home Directory

To sync your home directory from the source system to the destination system, use the following command on the source system:

rsync -av --update ~/ username@destination_ip:/home/username/

Replace username with your actual username on the destination system, and destination_ip with the IP address of the destination system[1][2].

Explanation of the Command

  • -a: Archive mode, which preserves permissions, ownership, timestamps, etc.
  • -v: Verbose mode, which provides detailed output of the sync process.
  • --update: This option skips files that are newer on the receiver side.
  • ~/: This is the source directory (your home directory on the current system).
  • username@destination_ip:/home/username/: This is the destination, specifying the user, IP address, and path on the remote system[1][3].

Additional Considerations

  1. SSH Key Authentication: For a smoother experience, set up SSH key authentication between the two systems. This eliminates the need to enter a password each time you run rsync[4].
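The key setup takes two commands; this sketch skips generation if a key already exists, and the copy step is shown as a comment since it prompts interactively:

```shell
# One-time setup for passwordless rsync over SSH.
KEY="$HOME/.ssh/id_ed25519"
mkdir -p "$HOME/.ssh"

# Generate a key pair if one does not already exist. -N "" means no
# passphrase, which suits unattended syncs; set one if you prefer.
[ -f "$KEY" ] || ssh-keygen -q -t ed25519 -N "" -f "$KEY"

# Copy the public key over (prompts for the password one last time;
# afterwards rsync over SSH authenticates with the key automatically):
#   ssh-copy-id -i "$KEY.pub" username@destination_ip
```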

  2. Exclude Files: You might want to exclude certain directories or files. Use the --exclude option:

    rsync -av --update --exclude '.cache' --exclude '.local/share/Trash' ~/ username@destination_ip:/home/username/
    
  3. Dry Run: Before performing the actual sync, you can do a dry run to see what would be transferred:

    rsync -av --update --dry-run ~/ username@destination_ip:/home/username/
    
  4. Bandwidth Limit: If you're concerned about network usage, you can limit the bandwidth:

    rsync -av --update --bwlimit=1000 ~/ username@destination_ip:/home/username/
    

    This limits the transfer to 1000 KB/s[3].

  5. Incremental Backups: rsync only transfers files that have changed; the --update flag additionally skips files that are newer on the destination, so subsequent syncs stay fast without overwriting recent work.

Remember to run this command from the source system, and ensure you have the necessary permissions on both systems. Always double-check your command before running it to avoid unintended data loss or overwriting[2][5].

Citations: [1] https://www.bleepingcomputer.com/forums/t/748252/a-guide-to-backing-up-your-home-directory-using-rsync/ [2] https://www.reddit.com/r/linux4noobs/comments/qtu0ww/backup_and_restore_home_directory_with_rsync/ [3] https://www.cherryservers.com/blog/how-to-use-rsync-on-linux-to-synchronize-local-and-remote-directories [4] https://www.tecmint.com/rsync-local-remote-file-synchronization-commands/ [5] https://www.digitalocean.com/community/tutorials/how-to-use-rsync-to-sync-local-and-remote-directories [6] https://stackoverflow.com/questions/9090817/copying-files-using-rsync-from-remote-server-to-local-machine/9090859 [7] https://www.heficed.com/tutorials/vps/how-to-use-rsync/
