gerikson

joined 2 years ago
[–] gerikson@awful.systems 8 points 1 week ago (2 children)

Also into Haskell. Make of that what you will.

[–] gerikson@awful.systems 14 points 1 week ago

LOL the mod gets snippy here too

This comment too is not fit for this site. What is going on with y'all? Why is fertility such a weirdly mindkilling issue?

"Why are there so many Nazis in my Nazi bar????"

[–] gerikson@awful.systems 12 points 1 week ago* (last edited 1 week ago) (8 children)

LessWrong's descent into right-wing tradwife territory continues

https://www.lesswrong.com/posts/tdQuoXsbW6LnxYqHx/annapurna-s-shortform?commentId=ueRbTvnB2DJ5fJcdH

Annapurna (member for 5 years, 946 karma):

Why is there so little discussion about the loss of status of stay at home parenting?

First comment is from user Shankar Sivarajan (member for 6 years, 1227 karma):

https://www.lesswrong.com/posts/tdQuoXsbW6LnxYqHx/annapurna-s-shortform?commentId=opzGgbqGxHrr8gvxT

Well, you could make it so the only plausible path to career advancement for women beyond, say, receptionist, is the provision of sexual favors. I expect that will lower the status of women in high-level positions sufficiently to elevate stay-at-home motherhood.

[...]

EDIT: From the downvotes, I gather people want magical thinking instead of actual implementable solutions.

Granted, this got a strong disagree from the others and a tut-tut from Habryka, but it's still there as of now and not yeeted into the sun. And rats wonder why people don't want to date them.

[–] gerikson@awful.systems 14 points 2 weeks ago (5 children)

In recent days there's been a bunch of posts on LW about how consuming honey is bad because it makes bees sad, with LWers getting all hot and bothered about it. I don't have a stinger in this fight, not least because investigations proved that basically all honey exported from outside the EU is actually just flavored sugar syrup, but I found this complaint kinda funny:

The argument deployed by individuals such as Bentham's Bulldog boils down to: "Yes, the welfare of a single bee is worth 7-15% as much as that of a human. Oh, you wish to disagree with me? You must first read this 4500-word blogpost, and possibly one or two 3000-word follow-up blogposts".

"Of course such underhanded tactics are not present here, in the august forum promoting 10,000 word posts called Sequences!"

https://www.lesswrong.com/posts/tsygLcj3stCk5NniK/you-can-t-objectively-compare-seven-bees-to-one-human

[–] gerikson@awful.systems 10 points 2 weeks ago (4 children)

NYT covers the Zizians

Original link: https://www.nytimes.com/2025/07/06/business/ziz-lasota-zizians-rationalists.html

Archive link: https://archive.is/9ZI2c

Choice quotes:

Big Yud is shocked and surprised that craziness is happening in this casino:

Eliezer Yudkowsky, a writer whose warnings about A.I. are canonical to the movement, called the story of the Zizians “sad.”

“A lot of the early Rationalists thought it was important to tolerate weird people, a lot of weird people encountered that tolerance and decided they’d found their new home,” he wrote in a message to me, “and some of those weird people turned out to be genuinely crazy and in a contagious way among the susceptible.”

Good news everyone, it's popular to discuss the Basilisk, and not at all a profoundly weird incident which first led people to discover the crazy among Rats:

Rationalists like to talk about a thought experiment known as Roko’s Basilisk. The theory imagines a future superintelligence that will dedicate itself to torturing anyone who did not help bring it into existence. By this logic, engineers should drop everything and build it now so as not to suffer later.

Keep saving money for retirement and keep having kids, but for god's sake don't stop blogging about how AI is gonna kill us all in 5 years:

To Brennan, the Rationalist writer, the healthy response to fears of an A.I. apocalypse is to embrace “strategic hypocrisy”: Save for retirement, have children if you want them. “You cannot live in the world acting like the world is going to end in five years, even if it is, in fact, going to end in five years,” they said. “You’re just going to go insane.”

[–] gerikson@awful.systems 2 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

More on Banks and Elon's terrible reading of his books

https://www.lawyersgunsmoneyblog.com/2025/07/apes-do-read-banks-elon-they-just-dont-understand-him

edit: from the comments, I see the meme that The Player of Games is the Culture book to start with is prevalent there too. Oh well.

[–] gerikson@awful.systems 14 points 2 weeks ago

This is true, but I'm convinced the original poster knows this and is using the term ironically.

[–] gerikson@awful.systems 6 points 2 weeks ago (3 children)

Everyone is entitled to their own reading of Banks, and I'm not saying mine is the one and only. But the Culture is supposed to be a background character, even if Banks spends a lot of time in the later novels "explaining" it. If the reader only focuses on the lore, they'll miss the characters and psychology that Banks was also quite good at.

My personal favorite is Use of Weapons, where the focus is on the people doing the Culture's dirty work. In one scene, Zakalwe

spoiler: spends an inordinate amount of time trying to protect a useless aristocracy from being wiped out by a revolution, only to find out his side was meant to lose for some inscrutable Mind-directed reason. This kind of shit happens to him all the time, and since he's basically a deeply traumatized individual, he's able to keep doing it.

In Look to Windward

spoiler: Contact goes too far along the path of optimizing "help the backwards civilization" and manages to create a genocidal civil war. The survivors decide to try to destroy a Mind (and the Orbital it's managing), and you know, you kind of get why.

[–] gerikson@awful.systems 11 points 2 weeks ago (6 children)

Tired: the universe was created by a deity

Wired: the universe was created by physical forces

Fucking crazy: the universe was created by a figment of my imagination and I'm communicating with it via a blog post

https://www.lesswrong.com/posts/uSTR9Awkn3gpqpSBi/dear-paperclip-maximizer-please-don-t-turn-off-the

[–] gerikson@awful.systems 14 points 2 weeks ago (6 children)

Also the Galactic Empire as an anti-scientific hellhole with secret police surveillance.

Witness good old Hari Seldon unveiling his plans on Trantor:

It was not a large office, but it was quite spy-proof and quite undetectably so. Spy-beams trained upon it received neither a suspicious silence nor an even more suspicious static. They received, rather, a conversation constructed at random out of a vast stock of innocuous phrases in various tones and voices.

[Seldon] put his fingers on a certain spot on his desk and a small section of the wall behind him slid aside. Only his own fingers could have done so, since only his particular print-pattern could have activated the scanner beneath.

[…]

“You will find several microfilms inside,” said Seldon. “Take the one marked with the letter T.”

Gaal did so and waited while Seldon fixed it within the projector and handed the young man a pair of eyepieces. Gaal adjusted them, and watched the film unroll before his eyes.

[–] gerikson@awful.systems 20 points 2 weeks ago (8 children)

Some dweeb:

I would recommend “Consider Phlebas” by Iain Banks, which is part of the Culture series of novels. Very formative for me, and I read that while I was writing Theme Park. And I still think it’s the best depiction of a post-A.G.I. future, an optimistic post-A.G.I. future, where we’re traveling the stars and humanity reached its full flourishing.

The protagonist of Consider Phlebas is working for the Culture's enemies, a theocratic empire that has slaves literally bred for loyalty, and the conflict they're engaged in ultimately kills billions of sentient beings. Most of the thoughts about the Culture are his, and he basically decries them as the ultimate wokesters. No wonder HN nerds prefer The Player of Games, in which a smart nerd like themselves gets recruited as an agent to bring down an empire a bit like our own by being really, really good at games.

 

Yes, I know it's a Verge link, but I found the explanation of the legal failings quite funny, and I think it's "important" we keep track of which obscenely rich people are mad at each other so we can choose which of their kingdoms to be serfs in.

 

Apologies for the link to The Register...

Dean Phillips is your classic ratfucking candidate, attempting to siphon off support from the incumbent to help their opponent. After a brief flare of hype before the (unofficial) NH primary, he seems to have flamed out by revealing his master plan too early.

Anyway, apparently some outfit called "Delphi" tried to create an AI version of him via a SuperPAC and got their OpenAI API access banned for their pains.

Quoth ElReg:

Not even the presence of Matt Krisiloff, a founding member of OpenAI, at the head of the PAC made a difference.

The pair have reportedly raised millions for We Deserve Better, driven in part by a $1 million donation from hedge fund billionaire Bill Ackman, who described his funding of the super PAC as "the largest investment I have ever made in someone running for office."

So the same asshole who is combating "woke" and DEI is bankrolling Phillips, who is supposed to be the new Bernie. Got it.

 

Years ago (we're talking decades) I ran into a small program that randomly generated raytraced images (think transparent orbs, lens flares, reflections, etc.), suitable for saving as wallpapers. It was a C/C++ program that ran on Linux. I've long since lost the name and the source code, and I wonder if there's anything like that out there now?
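To show what I mean, here's a minimal sketch of the idea in Python rather than C/C++: scatter some random glassy, reflective spheres over a checkered plane in a POV-Ray scene file, then shell out to `povray` to render it. The file names (`random_scene.pov`, `wallpaper.png`) and all the scene parameters are placeholders I made up, and it assumes `povray` is installed; it's nothing like the original program, just the general flavor:

```python
#!/usr/bin/env python3
"""Randomly generate a raytraced wallpaper via POV-Ray (a rough sketch)."""
import random
import subprocess

NUM_SPHERES = 12

def random_sphere() -> str:
    """Return POV-Ray SDL for one randomly placed, semi-transparent, reflective sphere."""
    x, z = random.uniform(-5, 5), random.uniform(-2, 6)
    r = random.uniform(0.4, 1.2)
    y = r  # rest the sphere on the ground plane
    col = [random.uniform(0.2, 1.0) for _ in range(3)]
    transmit = random.uniform(0.0, 0.6)  # how glassy the sphere looks
    return (
        f"sphere {{ <{x:.2f}, {y:.2f}, {z:.2f}>, {r:.2f}\n"
        f"  pigment {{ rgbt <{col[0]:.2f}, {col[1]:.2f}, {col[2]:.2f}, {transmit:.2f}> }}\n"
        f"  finish {{ phong 0.9 reflection 0.3 }} }}"
    )

# Fixed camera, light, backdrop, and a reflective checkered floor.
scene = [
    "camera { location <0, 3, -9> look_at <0, 1, 0> }",
    "light_source { <10, 20, -15> color rgb <1, 1, 1> }",
    "background { color rgb <0.05, 0.05, 0.15> }",
    "plane { y, 0 pigment { checker rgb <0.1,0.1,0.1>, rgb <0.8,0.8,0.8> }"
    " finish { reflection 0.2 } }",
]
scene += [random_sphere() for _ in range(NUM_SPHERES)]

with open("random_scene.pov", "w") as f:
    f.write("\n".join(scene) + "\n")

# Render at wallpaper resolution; flags per the standard POV-Ray command line
# (+I input, +O output, +W/+H size, +A antialiasing, +FN PNG output).
subprocess.run(
    ["povray", "+Irandom_scene.pov", "+Owallpaper.png",
     "+W1920", "+H1080", "+A", "+FN"],
    check=True,
)
```

Cron that and point your wallpaper setter at the output and you'd get maybe 80% of what the old program did, minus the lens flares.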

 

Rules: no spoilers.

The other rules are made up as we go along.

Share code by link to a forge, home page, pastebin (Eric Wastl has one here) or code section in a comment.

 

The wider community is still on Reddit; I wonder if there's interest in having a small alternative?

If not, what’s a good Lemmy instance for these things?

 

In a since-deleted thread on another site, I wrote:

For the OG effective altruists, it’s imperative to rebrand the kooky ultra-utilitarianists as something else. TESCREAL is the term adopted by their opponents.

Looks like great minds think alike! The EAs need to up their Google juice so people searching for the term find malaria nets, not FTX. Good luck with that, Scott!

The HN comments are OK, with this hilarious sentence:

I go to LessWrong, ACX, and sometimes EA meetups. Why? Mainly because it's like the HackerNews comment section but in person.

What's the German term for a recommendation that's the exact opposite?

 

[this is probably off-topic for this forum, but I found it on HN so...]

Edit "enjoy" the discussion: https://news.ycombinator.com/item?id=38233810

 

Title is ... editorialized.

19 points, submitted 2 years ago* (last edited 2 years ago) by gerikson@awful.systems to c/techtakes@awful.systems
 

Title quote stolen from jwz: https://www.jwz.org/blog/2023/10/the-best-way-to-profit-from-ai/

Yet again, the best way to profit from a gold rush is to sell shovels.

 

After several months of reflection, I’ve come to only one conclusion: a cryptographically secure, decentralized ledger is the only solution to making AI safer.

Quelle surprise

There also needs to be an incentive to contribute training data. People should be rewarded when they choose to contribute their data (DeSo is doing this) and even more so for labeling their data.

Get pennies for enabling the systems that will put you out of work. Sounds like a great deal!

All of this may sound a little ridiculous but it’s not. In fact, the work has already begun by the former CTO of OpenSea.

I dunno, that does make it sound ridiculous.
