Architeuthis

joined 2 years ago
[–] Architeuthis@awful.systems 12 points 2 years ago* (last edited 2 years ago)

There's technobabble as a legitimate literary device, and then there's having randomly picked up that comments and compilers are a thing in computer programming and proceeding to write an entire ~~parable~~ ~~anti-wokism screed~~ interminable goddamn manifesto around them without ever bothering to check what they actually are or do beyond your immediate big brain assumptions.

[–] Architeuthis@awful.systems 10 points 2 years ago* (last edited 2 years ago) (5 children)

Seeing as incel apologetics occasionally appear even in mainstream rat/EA outlets like Slate Star or Shtetl (nerds are an oppressed people unfairly stigmatized by the fairer sex, and other Good Guy grievances), one has to wonder what the EA calculus is with regard to doing the most good with your genitals.

Like, keeping your effective altruist higher-ups in peak effectiveness by shielding them from sexual frustration has got to be worth a few million billion far future simulated lives, at the trifling cost of a little of your time and some lube, right?

I can't imagine the peer pressure dynamics in such a dating environment being too healthy.

[–] Architeuthis@awful.systems 10 points 2 years ago* (last edited 2 years ago) (2 children)

> How did Sam and Caroline get into taking high doses of ADHD medication? We think it was via Scott Alexander Siskind, the psychiatrist behind the rationalist blog Slate Star Codex.
>
> Siskind occasionally writes up particular psychiatric drugs as public education. One popular piece was "Adderall Risks: Much More Than You Wanted To Know" from December 28, 2017.

Not to cast further aspersions or anything, but Siskind did write a sort of follow-up (titled 'psychopharmacology of FTX' or something like that, if you feel like googling it) where he explicitly denies ever having met the FTX psychiatrist/dealer, even though a) he admits they actually worked in the same hospital for a time and, perhaps more tellingly, b) no one asked.

Also, according to the birdsite, the FTX psychiatrist may have in fact been a huge creep.

[–] Architeuthis@awful.systems 20 points 2 years ago* (last edited 2 years ago) (9 children)

This reads very, uh, addled. I guess collapsing the wavefunction means agreeing on stuff? And the uncanny valley is when the vibes are off because people are at each other's throats? Is 'being aligned' like having attained spiritual enlightenment by way of Adderall?

Apparently the context is that he wanted the investment firms under FTX (Alameda and Modulo) to coordinate completely, despite their being run by different ex-girlfriends at the time (most normal EA workplace), which I guess paints Ellison's comment about Chinese harem rules of dating in a new light.

edit: I think the 'being aligned' thing is them invoking the 'great minds think alike' adage as absolute truth, i.e. since we both have the High IQ feat you should be agreeing with me; after all, we share the same privileged access to absolute truth. That you aren't must mean you are unaligned/need to be further cleansed of thetans.

[–] Architeuthis@awful.systems 5 points 2 years ago* (last edited 2 years ago) (1 children)

Like I said, you aren't missing anything:

image of tweet

Also, I just added the image text in the description in case Lemmy is weird with inline images.

[–] Architeuthis@awful.systems 4 points 2 years ago* (last edited 2 years ago)

It's supposed to be from the recently released book the Moneyball guy wrote about him, according to several seconds of googling 'SBF on Shakespeare'.

[–] Architeuthis@awful.systems -1 points 2 years ago

Enshittification

> Once [a company] can make more money by screwing its customers, that screw-job becomes a fait accompli.

[–] Architeuthis@awful.systems 9 points 2 years ago* (last edited 2 years ago) (1 children)

Note that near the end of the original post the writer claims to have unilaterally decided to pay his sources a whistleblower fee of $5K each, which will probably muddy the waters a lot if this were ever to get traction outside EA/LW circles.

> I'm very grateful to the two staff members involved for coming forward and eventually spending dozens of hours clarifying and explaining their experiences to me and others who were interested. To compensate them for their courage, the time and effort spent to talk with me and explain their experiences at some length, and their permission to allow me to publish a lot of this information, I (using personal funds) am going to pay them each $5,000 after publishing this post.

[–] Architeuthis@awful.systems 11 points 2 years ago

Seriously, the mandatory forced equanimity of the text went from merely off-putting to pretty gross as it became increasingly apparent that the Nonlinear people are basically sociopaths who make it a point of pride to flagrantly abuse anyone who finds themselves on the other end of a business arrangement with them. Not to mention that their employment model and accounting practices as described seem wildly illegal anywhere that isn't a libertarian dystopia, even without going into the allegations about workplace romance.

Except they are EAs doing unspecified x-risk work, aka literally God's work, so they are afforded every lenience and every benefit of the doubt, I guess.

[–] Architeuthis@awful.systems 4 points 2 years ago (1 children)

If they open their APIs so I can coordinate different brands without downloading a bazillion different apps, and as long as I can do it without my data leaving the house, I'll think about it.

[–] Architeuthis@awful.systems 7 points 2 years ago* (last edited 2 years ago) (2 children)

Not sure if it's an NSFW assertion, but to me the p-zombie thought experiment seems like the result of a discourse that went off the rails very early and very hard into angels-on-the-head-of-a-pin territory, this LW post notwithstanding.

Like, as far as I can tell, imagining a perfectly cloned reality except with the phenomenon in question assumed away is supposedly (metaphysical) evidence that the phenomenon exists, except in a separate ontology? Isn't this basically like using reverse Occam's razor to prove that the extra entities are actually necessary, at least as long as they somehow stay mostly in their own universe?

Plus, the implicit assumption that consciousness can be defined as some sort of singular and uniform property you either have or don't seems inherently dodgy, and also seems to be at the core of the contradiction; like, is taking p-zombies too seriously a reaction specifically to a general sense of disappointment that a singular consciousness organelle is nowhere to be found?

[–] Architeuthis@awful.systems 0 points 2 years ago (1 children)

Effective Utopianism or: how I learned to stop worrying about the type of people attracted to low-regulation environments with poor human rights records and love ~~seasteading~~ charter cities.
