will_a113

joined 2 years ago
[–] will_a113@lemmy.ml 3 points 4 hours ago

I like it for content discovery, but it feels weird to upvote bot posts. When I see something interesting enough to comment on I do try to see if there’s a similar article in a better community already, or make a cross-post.

[–] will_a113@lemmy.ml 13 points 8 hours ago
[–] will_a113@lemmy.ml 11 points 17 hours ago

I doodle incessantly. Like whole pages that are 20% notes, 80% doodles. I had a teacher in middle school who got mad because it seemed like I was constantly distracted (true) and had me stop, but then discovered it was 10x worse if I was not doing something with my hands. Meanwhile my most recent boss looked over at my pad during our first in-person meeting and said, “ah, so you have ADHD too?”

[–] will_a113@lemmy.ml 1 point 17 hours ago

I love Technology Connections, but I do have to watch at 1.75x or else it’s too long-form for me.

[–] will_a113@lemmy.ml 1 point 2 days ago (2 children)

I've been exclusively using a K2 for my cheap-but-decent-enough setup at work (along with a Breville Dedica) and have found it to be pretty consistent. You do occasionally get some larger particles that slip through, but not enough to create any channeling issues, and it's easy enough to use. Compared to my 1ZPresso J Max, I find that for 1/3 the price it does a great job, especially since I don't have the super-sophisticated palate that would notice any really subtle differences.

Unrelated, is it possible to follow @coffee@a.gup.pe from Lemmy? I've tried from both the web UI and Voyager and haven't been able to get it to work -- this ticket suggests it didn't work as of September, but I wanted to check in.

[–] will_a113@lemmy.ml 1 point 2 days ago (1 child)

Have you had any luck importing even a medium-sized codebase and doing reasoning on it? All of my experiments start to show subtle errors past 2k tokens, and at 5k tokens the errors become significant. Any attempt to ingest and process a decent-sized project (say 20k SLOC plus tooling/overhead/config) has been useless, even on models that "should" have a good-enough sized context window.

[–] will_a113@lemmy.ml 3 points 2 days ago

Yes, that’s a valid distinction. Though practically speaking, I don’t really know how different it is from Anthropic sharing with Amazon, for example.

[–] will_a113@lemmy.ml 29 points 2 days ago* (last edited 2 days ago) (5 children)

Gee, I wonder if Grok shares data with the rest of Twitter, Gemini shares with the rest of Google, or Llama… actually I haven’t used Facebook in ages; I don’t even know if there’s a ChatGPT-equivalent service on Facebook.

I do actually wonder if Anthropic shares data with Amazon, or OpenAI with Microsoft (their largest investors). That would be a direct 1:1 comparison with what’s happening between DeepSeek and ByteDance (though at least in the latter case you can host your own since the model is open source).

[–] will_a113@lemmy.ml 2 points 3 days ago

My desk is littered with post-it notes and papers, as the act of writing something down is usually what helps to "lock" it into my memory for a time. Unfortunately, the ridiculous number of to-dos I usually have at any given time makes me look like some kind of crazy note-hoarder.

[–] will_a113@lemmy.ml 2 points 3 days ago

Was going to say “Muppet”, but yours is better!

 

File this under "small wins". I had been banging my head against a technical problem for most of the day yesterday. As I slipped into bed around midnight, I suddenly knew the solution. Despite the call of the pillows, I dragged myself out of bed, down to the laptop and took a full 30 seconds to write it down -- and thank goodness, because by this morning I had forgotten about it again!

[–] will_a113@lemmy.ml 2 points 5 days ago

This gives me Old Internet vibes. Glorious.

[–] will_a113@lemmy.ml 5 points 5 days ago

We need to start writing this type of “Christian” with a K. Most non-evangelicals don’t want any of this bullshit. Every Catholic pope of the last 70 or more years has come out multiple times saying as much.

 

Crowds and water have more in common than you'd think - they both flow like a fluid, with predictable patterns that can turn perilous if not properly managed. Looks like the physics of human herds is no bull, as researchers have uncovered the fluid dynamics behind dangerous crowd crushes.

 

Using Reddit's popular ChangeMyView community as a source of baseline data, OpenAI had previously found that 2022's ChatGPT-3.5 was significantly less persuasive than random humans, ranking in just the 38th percentile on this measure. But that performance jumped to the 77th percentile with September's release of the o1-mini reasoning model and up to percentiles in the high 80s for the full-fledged o1 model.

So are you smarter than a Redditor?

 

When even Cory Doctorow starts to sound like an optimist I have to give myself a reality check, as it usually means I'm heading off the deep end. But in this case it just rubs me the wrong way that he talks about Mastodon and Bluesky in the same breath -- one is not like the other.

 

Originality.AI looked at 8,885 long Facebook posts made over the past six years.

Key Findings

  • 41.18% of current Facebook long-form posts are Likely AI, as of November 2024.
  • Between 2023 and November 2024, the average percentage of monthly AI posts on Facebook was 24.05%.
  • This reflects a 4.3x increase in monthly AI Facebook content since the launch of ChatGPT. In comparison, the monthly average was 5.34% from 2018 to 2022.
 

Yet another entry from the truth-is-stranger-than-fiction department, as drug-addicted rats have turned Houston’s police evidence storage into their personal stash house.

 

And I just assumed they called Rainbolt

 

The Breakthrough Starshot program has gotten a lot of press for planning to accelerate tiny (a few grams) probes to 10-20% of c with powerful laser pulses, but a new idea proposes building a giant particle accelerator in space to push much larger probes (up to 1,000 kg) up to speed with a relativistic electron beam. Such a setup might get a large-ish probe to Alpha Centauri within 40 years.
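
For a rough sense of where that 40-year figure comes from, here's a minimal back-of-the-envelope sketch (assuming the ~4.37-light-year distance to Alpha Centauri and ignoring time spent accelerating; the speeds are just the fractions of c quoted above, not anything from the proposal itself):

```python
# Cruise-time check: distance in light-years divided by speed as a fraction of c
# gives travel time in years (acceleration/deceleration phases ignored).

DISTANCE_LY = 4.37  # approximate distance to Alpha Centauri in light-years

for fraction_of_c in (0.10, 0.12, 0.20):
    cruise_years = DISTANCE_LY / fraction_of_c
    print(f"At {fraction_of_c:.0%} of c: ~{cruise_years:.0f} years in transit")
```

At roughly 11-12% of c the cruise alone takes about 40 years, so the electron beam would have to push even a 1,000 kg probe into that speed range for the headline number to hold.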

 

Well, maybe not, but a new study suggests drinking it only in the morning can reduce the risk of heart disease and all-cause mortality. So that's something, right?

 

As far as we've been able to tell, all matter in the universe comes in just two distinct types: fermions (particles that make stuff) and bosons (particles that mediate forces).

However, physicists from Rice University have developed a mathematical framework for a third type, known as paraparticles, whose behavior could imply the existence of elementary particles nobody has ever considered. While they don't yet have a means of experimentally identifying such particles (or even predicting exactly what they'd do), this is the first time a model has been found that allows for a new particle family.

 

Dubbed Mazarron II, she was extracted from the sea in twenty parts and taken to the laboratories of the Cartagena National Museum of Underwater Archaeology for reconstruction. Laden with a cargo of lead ingots, she will offer insight not only into the shipbuilding techniques of the Phoenicians but also into their metallurgical sophistication.
