BigWeed
Imagine if capitalists could capture the additional economic benefit from a more connected society. The railroads would be able to sue individuals who started a business as a result of taking the train. Of course these people should have insurance to safeguard them, as long as they properly reported how the idea came up.
fwiw, I hear kratom hits different if you've never been on dope
I definitely think it should be legal and accessible because it's a valuable tool for people on opioids. I don't think we have a kratom epidemic. But in a conversation about alcohol alternatives with people who aren't as informed about drug experimentation, I'd point out that kratom has a lot more potential for consequences than, say, kava, cannabis, etc.
Not kratom in isolation. But if there is one drug that I'd say to not fuck around with it's opioid and opioid-like analgesics. I have some friends who have been struggling for the past 15 years to get off of opioids, and many who didn't make it. No other drug category has contributed more to decimating my friend group.
kratom
I don't know how to say this nicely. Don't do kratom unless you're already an opioid addict and you need to come off. It's not a safe drug to experiment with recreationally because it's so easy to get addicted to. I'm a heavy drug user and promote experimenting with drugs except for opioids (and kratom) because of how many people I've seen absolutely ruin their lives on them.
I hoped the takeaway was to not evangelize the tech because it hurts people, and to withhold your labor from furthering this type of work, rather than "don't use it". People no longer have the option of not using it; the expectation of productivity has gone up, and it's either use it or be replaced by someone who will.
Sure but the argument is that we shouldn't be so quick to accept technology that has negative consequences. This thread is all about job layoffs and loss of positions for those first entering the labor market because of AI speculation and labor replacement for low productivity tasks. This specific technology has consequences and maybe we shouldn't be so quick to fervently accept it with open arms.
One big theme of the book is that we have a moral obligation to withhold labor from developing technology that uniquely benefits governments and large corporations. Similarly, you're defending using AI to 'stylize text' even though it disproportionately benefits a Fortune 500 news firm and hurts new labor entrants. The technology is not neutral, so which side are you on?
The creator of Eliza found that people would sneak into his office late at night in order to have secret conversations with it. They would say that the chatbot understood them, even those who knew it was a program. He would be asked to leave the room so they could have private conversations with it. He found this very alarming, and it was one of the reasons he wrote the book. These stories are in that book.
The first chatbot's (ELIZA's) creator, Joseph Weizenbaum, wrote a book called Computer Power and Human Reason where he argued that we shouldn't be so ready to accept technology that has extremely negative moral and ethical consequences. It's a good book and very relevant for something written in 1976.
The one down there is the boot. Then there's the thimble and the top hat.
You'll have to pay the duty fee for your package to be released to you. Shippers will usually offer to cover this on your behalf, which is sold as DDP (Delivered Duty Paid), in contrast to DDU (Delivered Duty Unpaid). If the package needs to be inspected, the carrier might charge $80-$200 (as per the image). If your package was shipped DDP, the shipper covers this for you; shippers generally amortize that cost over many customers, so you'll see shipping prices go up depending on how often these fees are levied.
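For anyone curious what that amortization looks like, here's a minimal back-of-the-envelope sketch. The average duty, inspection fee, and inspection rate below are purely illustrative assumptions (only the $80-$200 inspection range comes from the comment above), not real carrier numbers:

```python
# Rough sketch of the per-package surcharge a shipper might bake into DDP pricing.
# All inputs are illustrative assumptions, not actual carrier data.

def ddp_surcharge(avg_duty: float, inspection_fee: float, inspection_rate: float) -> float:
    """Expected extra cost per package the shipper absorbs under DDP:
    the duty on every package, plus the inspection fee weighted by how
    often packages actually get inspected."""
    return avg_duty + inspection_fee * inspection_rate

# Example: $15 average duty, $140 (midpoint of $80-$200) inspection fee,
# and an assumed 5% of packages getting inspected.
print(ddp_surcharge(avg_duty=15.0, inspection_fee=140.0, inspection_rate=0.05))  # -> 22.0
```

So under those assumptions the shipper would need to raise shipping by roughly $22 per package to break even, which is why DDP shipping looks pricier up front.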