Do they really hit that much? I might not have a popular opinion there, but if they don't have a performance impact then I probably wouldn't care
Do we all want the fucking Blackwall from Cyberpunk 2077?
Fucking NetWatch?
Because this is how we end up with them.
...excuse me, I need to go buy a digital pack of cigarettes for the angry voice in my head.
Consider Nicotine+
What was that?
I was sucking on my nicotine nipple, err, I mean my vape.
(Hey, it's a more affordable stimulant addiction than coffee now!)
No, not the drug; the app.
Oh, well shit, I had not heard of this lol.
I am partial to I2P as... potentially, an entirely new, full internet paradigm, not just filesharing. But I will look into this too!
It's a soulseek client, basically. You can share files, chat, put your interests in your profile, etc. It's basically like social media, minus the posts. The only algorithm that exists is the one that shows people with similar interests. You can also view the most common interests. You can also add disinterests, which are the exact opposite.
That does sound very interesting!
When you realize that you live in a cyberpunk novel. The AI is cracking the ICE. https://cyberpunk.fandom.com/wiki/Black_ICE
I love seeing how much influence William Gibson had on cyberpunk.
It's not intentional, but the chap ended up writing works that defined both the Cyberpunk (Neuromancer) and Steampunk (The Difference Engine) genres.
Can't deny that influence.
Business idea: AWS, but hosted entirely within the computing power of AI web crawlers.
Reminds me of the "store data inside slow network requests for the in-transit duration" idea. It was a fun article to read.
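For anyone who missed it, the trick is that the only copy of the data is in packets that are perpetually in flight: you read it back when it returns and immediately re-send it to keep it "stored". A toy sketch of the idea; the localhost echo server stands in for a real remote host, and all the names are made up:

```python
# Toy "in-flight storage": the only copy of the data is a datagram
# bouncing between us and an echo service.
import socket
import threading
import time

def echo_server(port: int) -> None:
    # Stand-in for a remote echo endpoint; in the real stunt, network
    # latency is what holds the data.
    srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    srv.bind(("127.0.0.1", port))
    while True:
        data, addr = srv.recvfrom(4096)
        srv.sendto(data, addr)

threading.Thread(target=echo_server, args=(9999,), daemon=True).start()
time.sleep(0.1)  # give the echo thread a moment to bind

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"my precious bytes", ("127.0.0.1", 9999))  # put the data "in storage"
for _ in range(5):
    data, _ = sock.recvfrom(4096)            # the data arrives back...
    sock.sendto(data, ("127.0.0.1", 9999))   # ...and is immediately re-sent
print("read back:", data)
```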
I like the idea but couldn't you just go the more direct route and mine crypto?
If someone just wants to download code from Codeberg for training, it seems like it'd be way more efficient to just clone the git repositories or even just download tarballs of the most-recent releases for software hosted on Codeberg than to even touch the Web UI at all.
I mean, maybe you need the Web UI to get a list of git repos, but I'd think that that'd be about it.
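Something like this is all it would take. It leans on the Gitea-style API that Codeberg (Forgejo) exposes; the endpoint and field names here are taken from the public Gitea API docs, so treat it as a sketch:

```python
# Sketch: enumerate repos via the Gitea-style search API, then clone with
# git instead of crawling the web UI. Endpoint/field names per Gitea docs.
import subprocess
import requests

API = "https://codeberg.org/api/v1/repos/search"

page = 1
while True:
    resp = requests.get(API, params={"limit": 50, "page": page}, timeout=30)
    resp.raise_for_status()
    repos = resp.json().get("data", [])
    if not repos:
        break
    for repo in repos:
        # One shallow clone per repo: no rendered HTML, no diff pages, no blame views.
        subprocess.run(["git", "clone", "--depth", "1", repo["clone_url"]], check=False)
    page += 1
```

One API call per 50 repos plus one git transfer each, instead of millions of page hits.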
Then they'd have to bother understanding the content and downloading it as appropriate. And you'd think that if anyone could understand and parse websites in real time to make download decisions, it'd be the giant AI companies. But ironically they're only interested in hoovering up everything as plain web pages to feed into their raw training data.
The same morons scrape Wikipedia instead of downloading the archive files, which can trivially be rendered as web pages locally.
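The dumps are sitting right there on dumps.wikimedia.org; the "latest" alias below is the well-known public URL (fair warning: the full English dump is on the order of tens of gigabytes):

```python
# Fetch the official English Wikipedia dump instead of crawling article pages.
import urllib.request

DUMP = "https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2"
urllib.request.urlretrieve(DUMP, "enwiki-latest-pages-articles.xml.bz2")
```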
Are those blocklists publicly available somewhere?
I would hope not. Kinda pointless if they become public
On the contrary. Open, community-based blocklists can be very effective. Everyone can contribute to them and asphyxiate people with malicious intent.
If you're thinking something like "if the blocklist is available, then malicious agents simply won't use those IPs", I don't think that makes a lot of sense, as the malicious agent will find out their IPs are blocked as soon as they use them anyway.
Just to give an example of public lists that are working: I have an IRC server and it's getting bombarded with spam bots. It's horrible around the Super Bowl for some reason, but it just continues year-round.
So I added a few public anti-spam lists like DroneBL to the config, and the vast majority of the bots are automatically G-Lined/banned.
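For anyone who wants to do the same, a DNSBL lookup is tiny: reverse the IPv4 octets and resolve the name under the list's zone. A minimal sketch against DroneBL's zone (any answer means listed, NXDOMAIN means clean):

```python
# Minimal DNSBL check: an A record under the zone means the IP is listed.
import socket

def is_listed(ip: str, zone: str = "dnsbl.dronebl.org") -> bool:
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        socket.gethostbyname(query)
        return True        # any answer = listed
    except socket.gaierror:
        return False       # NXDOMAIN = not listed

print(is_listed("127.0.0.2"))  # the conventional DNSBL test address
```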
I blocked almost all the big players in hosting, plus China, Russia, and Vietnam, and now they're bombarding my site with residential IP addresses from all over the world. They must be using compromised smart home devices or phones with malware.
Soon everything on the internet will be behind a wall.
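The coarse-blocking half is the easy part, which is what makes the residential-IP workaround so effective. A sketch of the approach, using placeholder documentation ranges rather than real allocations:

```python
# CIDR-based blocking sketch; the ranges are documentation placeholders.
import ipaddress

BLOCKED = [ipaddress.ip_network(n) for n in ("203.0.113.0/24", "198.51.100.0/24")]

def blocked(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED)

print(blocked("203.0.113.7"))  # True: inside a blocked datacenter range
print(blocked("192.0.2.1"))    # False: a residential proxy sails through
```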
Not necessarily compromised; I saw a VPN provider (don’t remember the name) that offered a free tier where the client accepts being used for this.
And I suspect that in the future some VPN companies will be exposed doing the same but with their paid customers.
This isn't sustainable for the AI companies; when the bubble pops, it will stop.
In the meantime, sites are getting DDoSed by scrapers. One way to stop your site from getting scraped is having it be inaccessible... which is what the scrapers are causing.
Normally I would assume DDoSing is performed in order to take a site offline. But AI scrapers require the opposite: they need their targets online and willing. One would think they'd be a bit more careful about the damage they cause.
But they aren't, because capitalism.
If they had the slightest bit of survival instinct they'd share an archive.org / Google-ish scraper and web-cache infrastructure, pull from those caches, and everything would be scraped just once and refreshed only occasionally.
Instead they're building maximally dumb (as in literally counterproductive and self-harming) scrapers that don't know what they're interacting with.
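Even without shared infrastructure, just honoring ETag/Last-Modified would cut most of the damage; a page only transfers again when it actually changed. A rough sketch of a cache-respecting fetcher (the cache layout and names are invented for illustration):

```python
# Conditional-GET fetcher: revalidates with ETag/Last-Modified and reuses
# the cached body on 304 Not Modified. Cache layout is illustrative.
import json
import os
import requests

CACHE_DIR = "scrape-cache"
os.makedirs(CACHE_DIR, exist_ok=True)

def fetch(url: str) -> bytes:
    key = url.replace("/", "_").replace(":", "_")
    meta_path = os.path.join(CACHE_DIR, key + ".meta")
    body_path = os.path.join(CACHE_DIR, key + ".body")

    headers = {}
    if os.path.exists(meta_path):
        with open(meta_path) as f:
            meta = json.load(f)
        if meta.get("etag"):
            headers["If-None-Match"] = meta["etag"]
        if meta.get("last_modified"):
            headers["If-Modified-Since"] = meta["last_modified"]

    resp = requests.get(url, headers=headers, timeout=30)
    if resp.status_code == 304:              # unchanged: serve from cache
        with open(body_path, "rb") as f:
            return f.read()

    with open(body_path, "wb") as f:         # changed (or first fetch): refresh cache
        f.write(resp.content)
    with open(meta_path, "w") as f:
        json.dump({"etag": resp.headers.get("ETag"),
                   "last_modified": resp.headers.get("Last-Modified")}, f)
    return resp.content
```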
At what point will people start to track down and sabotage AI datacenters IRL?
I just thought that a client-side proof-of-work (or even just a delay) bound to the IP might push the AI companies to choose to behave instead (because single-visit-per-IP crawlers get too expensive/slow, and you can just block the normal abusive crawlers). But they already have mind-blowing computing and money resources and only want your data.
But what if there were a simple-to-use, integrated solution and every single webpage used this approach?
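For the record, the mechanism itself is simple. A minimal hashcash-style sketch, assuming a SHA-256 challenge tied to the client IP; the names and difficulty value are illustrative, not from any real deployment:

```python
# Hashcash-style proof-of-work sketch (illustrative, not a real deployment).
import hashlib
import os

DIFFICULTY = 16  # required leading zero bits; tune so one visit is cheap, millions aren't

def make_challenge(client_ip: str) -> str:
    # Server side: bind the challenge to the requesting IP plus fresh randomness.
    return client_ip + ":" + os.urandom(8).hex()

def leading_zero_bits(digest: bytes) -> int:
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def solve(challenge: str) -> int:
    # Client side: brute-force a nonce whose hash clears the difficulty bar.
    nonce = 0
    while True:
        d = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(d) >= DIFFICULTY:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int) -> bool:
    # Server side: a single cheap hash checks the client's work.
    d = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return leading_zero_bits(d) >= DIFFICULTY

ch = make_challenge("198.51.100.7")  # documentation IP, stand-in for a visitor
print(verify(ch, solve(ch)))         # True
```

The asymmetry is the point: the client burns tens of thousands of hashes while the server spends one.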
Believe me, these AI corporations have way too many IPs to make this feasible. I've tried per-IP rate limiting. It doesn't work on these crawlers.
A good solution would be to serve a virus to the PCs connecting from the AI IPs, one that overloads the computer and makes it explode.
I run my own Gitea instance on my own server and within the past week or so I've noticed it just getting absolutely nailed. One repo in particular, a Wayland WM I built, just keeps getting hammered over and over by IPs in China.
Just keeps getting hammered over and over by IPs in China.
Simple solution: Block Chinese IPs!
This type of large-scale crawling should be considered a DDoS, and the people behind it should be charged with cybercrimes and sent to prison.
Applying the Computer Fraud and Abuse Act to corporations? Sign me up! Hey, they're also people, aren't they?
Put the entire datacenter buildings into prison
I think they call that a "job" already
If it’s disrupting their site, it is a crime already. The problem is finding the people behind it. This won’t be some guy on his dorm PC, and they’ll likely be in places Interpol can’t reach.
they’ll likely be in places Interpol can’t reach
Like some Microsoft data center
They're getting hammered again this morning.