[–] Tartas1995@discuss.tchncs.de 2 points 1 day ago (2 children)

Sorry, my phrasing was bad and made it confusing. Let me explain in detail.

They correctly chose an unsigned int for the time, but they based it on Unix time, and Unix time is signed. So they chose a scheme that requires a conversion between Unix time and Bitcoin time (or the other way around) anyway. And you don't need to be able to represent a timestamp in 1970, which their scheme supports, because instead of counting from 2008 (the invention of Bitcoin) they count from 1970. That wastes 38 years. As you know, signed Unix time hits its limit in 2038, 68 years after its epoch; Bitcoin time is unsigned, so it reaches 2106. 2106 - 1970 = 136 years, and they are wasting 38 of them!!! Why? You need a conversion between the two after 2038 anyway. And if they really cared about cheap conversion, a signed 64-bit value would be much better, because after 2038 that will probably be the standard. So they chose to waste 38 years for compatibility that breaks in 2038, instead of choosing compatibility after 2038 for roughly 292 billion years.
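Back-of-the-envelope check of those spans (assuming second-resolution counters and roughly 365.25-day years):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    const double secs_per_year = 365.25 * 24 * 60 * 60;

    // Signed 32-bit Unix time overflows 2^31 - 1 seconds after 1970.
    double int32_years  = static_cast<double>(INT32_MAX)  / secs_per_year; // ~68  -> ~2038
    // An unsigned 32-bit counter from the same 1970 epoch lasts twice as long.
    double uint32_years = static_cast<double>(UINT32_MAX) / secs_per_year; // ~136 -> ~2106

    std::printf("int32  from 1970: ~%.0f years, until ~%d\n",
                int32_years,  1970 + static_cast<int>(int32_years));
    std::printf("uint32 from 1970: ~%.0f years, until ~%d\n",
                uint32_years, 1970 + static_cast<int>(uint32_years));
    std::printf("years spent on 1970-2008: %d\n", 2008 - 1970);
    return 0;
}
```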

And if size was the reason and 64-bit timestamps would have been too big, just start counting from 2008 (or better, 2009, when the network started) and get all those juicy 136 years instead of 98.
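A minimal sketch of what a genesis-based epoch could look like; `BITCOIN_EPOCH` and the helpers below are hypothetical, not anything the protocol actually defines:

```cpp
#include <cstdint>

// Hypothetical epoch: 2009-01-03 18:15:05 UTC (the genesis block), as a Unix timestamp.
constexpr int64_t BITCOIN_EPOCH = 1231006505;

// Illustrative conversions between ordinary Unix time and a 32-bit
// counter that starts at the genesis block instead of 1970.
constexpr uint32_t to_genesis_time(int64_t unix_seconds) {
    return static_cast<uint32_t>(unix_seconds - BITCOIN_EPOCH);
}

constexpr int64_t to_unix_time(uint32_t genesis_seconds) {
    return BITCOIN_EPOCH + static_cast<int64_t>(genesis_seconds);
}

// An unsigned 32-bit counter starting in 2009 rolls over around 2009 + 136 = 2145,
// instead of 1970 + 136 = 2106.
```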

It is stupid.

[–] NateNate60@lemmy.world 1 points 1 day ago* (last edited 1 day ago) (1 children)

The choice of a uint32_t for time saves 4 bytes per transaction. That doesn't sound like much, but with 1.2 billion transactions recorded, it adds up to almost 5 GB of space saved.
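The rough arithmetic behind that figure (the 1.2 billion transaction count is the assumption here):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    const uint64_t transactions = 1'200'000'000ULL; // assumed total transaction count
    const uint64_t bytes_saved  = 4;                // uint32_t instead of uint64_t
    std::printf("~%.1f GB saved\n",
                static_cast<double>(transactions * bytes_saved) / 1e9); // ~4.8 GB
    return 0;
}
```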

They could, ultimately, just replace it with a uint64_t some time in the 22nd century without much fuss. In the late 2000s, when Bitcoin was created, storage space came at a significant cost; now it is quite cheap, and in the 2100s it will undoubtedly be cheaper still.

[–] Tartas1995@discuss.tchncs.de 1 points 1 day ago* (last edited 1 day ago)

5 GB, on a blockchain that is about 670 GB. Those 5 GB are clearly super important.

And again, size would be an OK argument if they hadn't gone for uint32 instead of int32, because at that point they broke compatibility with Unix time for no reason. Unless they wanted to min/max every bit, but then why start counting from 1970 and not 2008/2009?
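A small illustration of that breakage, assuming a post-2038 block timestamp ends up in a classic signed 32-bit time_t:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // Hypothetical block timestamp shortly after the signed 32-bit rollover (~year 2039).
    uint32_t block_time = 2'200'000'000u;

    // Narrowed into a signed 32-bit value, it wraps to a large negative number
    // on two's-complement machines, i.e. a date long before 1970.
    int32_t as_signed = static_cast<int32_t>(block_time);

    std::printf("unsigned: %u\nsigned:   %d\n", block_time, as_signed);
    return 0;
}
```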

It makes no sense.

Also, in 2008, 5 GB of storage would have cost you around 50 cents. Of course, each node would have needed those 5 additional GB, so each node would be 50 cents more expensive. And of course there weren't that many transactions in the chain back then, so it wouldn't actually have cost that much, but OK.

[–] nitrolife@rekabu.ru 1 points 1 day ago

Or they could simply have inherited Unix time, with the zero shifted from 1970 to 2038, and added one extra "time epoch" flag. Think about what's easier: creating your own time representation or inheriting from Unix time?
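A minimal sketch of that idea; the struct and names are made up, not an existing library:

```cpp
#include <cstdint>

// Hypothetical wrapper around Unix time: keep the familiar 32-bit value and add
// one "epoch" flag that shifts the zero point forward by one signed-32-bit span
// (~68 years) per epoch: epoch 0 counts from 1970, epoch 1 from 2038, and so on.
struct EpochTime {
    uint8_t epoch;    // which ~68-year era the value belongs to
    int32_t seconds;  // ordinary Unix-style offset within that era
};

// Widen to a 64-bit Unix timestamp when the full range is needed.
constexpr int64_t to_unix64(EpochTime t) {
    return static_cast<int64_t>(t.epoch) * (int64_t{1} << 31) + t.seconds;
}
```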