In a nutshell, it increases the range of brightness values (luminance, specifically) that can be sent to a display. That lets content get brighter and display colours more accurately, since there are far more brightness levels to work with. Content can look more lifelike, or have more "pop" by having certain elements be brighter than others. There's more to it too, and it's up to the game/movie/device to decide what to do with all the extra information it can send to the display. It's especially noticeable on an OLED or QD-OLED display, since those can dim or brighten every pixel individually. Nits here refers to display brightness (one nit is one candela per square metre) - 1000 nits is far brighter than most conventional displays, which usually sit in the 300-500 nit range.
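If you're curious how that extra range actually gets encoded, here's a little illustrative Python sketch of the PQ (SMPTE ST 2084) transfer function that HDR10 uses. The constants are the published ones from the standard, but treat this as a back-of-the-envelope illustration, not a reference implementation:

```python
# Rough sketch of the PQ (SMPTE ST 2084) EOTF used by HDR10.
# Maps a normalized code value (0.0-1.0) to absolute luminance in nits.

M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32     # ~18.85
C3 = 2392 / 4096 * 32     # ~18.69

def pq_to_nits(code: float) -> float:
    """Decode a PQ-encoded value in [0, 1] to luminance in nits (cd/m^2)."""
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0) / (C2 - C3 * e)) ** (1 / M1)

# The bottom half of the code range covers only ~0-92 nits; everything
# above that is headroom for highlights, up to 10,000 nits at code 1.0.
for code in (0.25, 0.5, 0.75, 1.0):
    print(f"code {code:.2f} -> {pq_to_nits(code):,.0f} nits")
```

The takeaway is that the encoding spends most of its precision where your eyes are most sensitive (the dark end), which is how HDR fits that huge brightness range into 10 bits without visible banding.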
No, they just call Joker "senpai" all the time, which sounds like sin pi.
What sort of system are you on, and what have you tried so far? The best setup is an AMD GPU with a more up-to-date distro (Fedora, Arch, and so on). I can give you some help if you need it.
Oh boy, I should have caught that. Ironic, considering saying things like "ATM machine" is a pet peeve of mine.
In a small room where it's the only light source, it's still a crazy amount of light. My eyes genuinely took a couple of minutes to adjust to the brightness when I first set it up, and it sometimes lit up the walls like the ceiling light was on.
If you ever get the opportunity, try out HDR ITM tone mapping (inverse tone mapping, essentially an SDR-to-HDR upconversion you can do with Gamescope on Linux) playing Persona 3 Reload on a QD-OLED monitor (for that extra brightness) in a dark room. Even though it's not a native HDR game, with ITM it looks so good, especially because it's a game with a lot of dark graphics mixed in with super-bright ones. The text pops, and combat is next-level.
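If you want to give it a go, it's just a couple of Gamescope flags. Something like this in the game's Steam launch options should do it (flag names from memory of Gamescope's HDR options, so double-check them against `gamescope --help` on your version):

```
gamescope -f --hdr-enabled --hdr-itm-enable --hdr-itm-target-nits 1000 -- %command%
```

`--hdr-itm-target-nits` is the peak brightness the upconversion aims for, so set it to whatever your monitor can actually hit.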
If you're interested, RIP Felix did a pretty good video on YouTube about the YLOD failures. TL;DW: most early models have defective GPUs, and the ones that survived now have aging capacitors, but there are tools now that can pull the exact cause from the system controller (SYSCON). A GPU swap is pretty involved though, so it takes a skilled technician to pull off. Still, if you have one of those early backwards compatible models, getting it repaired isn't a bad idea nowadays, since those consoles are only getting rarer.
The PS3 competes with the original Xbox One for the most undercooked and overpriced console launch ever. I guess that's what made the recovery all the more astounding.
I guess they had to remove backwards compatibility at some point, considering the solution was to shove an entire PS2 CPU and GPU onto the motherboard, massively driving up a production cost that was already so stratospheric they lost money even at the high launch prices. Still, it's unfortunate it went away, since it meant PS2 games played perfectly, as intended, on the PS3, with a proper HDMI output too. And because it was dropped early, every model with backwards compatibility has one of the defective GPUs that can cause a yellow light of death. A PS3 Slim with PS2 compatibility would have been amazing. I also completely agree about the XMB being peak UI design; almost every console since has had a worse UI in my opinion, except maybe the Switch, and that's only because the Switch is barely more than a game selector.
I'm quite excited about the prospect of Valve making a successor to the Steam Machine, since the Deck shows a Linux gaming device can be the perfect do-anything device. The first Steam Machine wasn't ready for prime time when it came out, but it also predated Proton.
We don't talk about ~~Bruno~~ the Microsoft POSIX subsystem
I haven't experienced issues with oranges on my setup (AW3423DWF, 7900 XTX). Maybe it's something to do with your hardware?