heythatsprettygood

joined 2 years ago
[–] heythatsprettygood@feddit.uk 5 points 1 month ago (3 children)

Microsoft are likely going to keep on trying anyway; the Deck is too big a threat to their Windows gaming monopoly for them to ignore. Just look at how many Windows handheld models they got out of the door seemingly minutes after the Deck released with its Linux based SteamOS. Their worst nightmare is people having a viable alternative to their platforms.

[–] heythatsprettygood@feddit.uk 8 points 1 month ago

Fair enough. Steam sales are a pathway to deals some consider unnatural, or so the saying goes. Especially with things like Humble Bundle.

[–] heythatsprettygood@feddit.uk 21 points 1 month ago (4 children)

People around me have a different sort of take. Most of them already have big powerful desktops, and the Deck doesn't really appeal to them, so they all ended up buying Switch 1s eventually for the exclusives and portable use, and will likely follow suit with the Switch 2. The Deck is still amazing value though, considering it's a full Linux PC you can take anywhere and use however you wish.

[–] heythatsprettygood@feddit.uk 3 points 1 month ago (1 children)

I take it you might be interested in my little shitposting community !okbuddyborders@feddit.uk if you want more Ace Combat memes.

[–] heythatsprettygood@feddit.uk 2 points 1 month ago

Ah, I should have made that clearer in the meme. Both NVIDIA and ATI messed up badly in this era. Sony's efforts with the Cell are always so fascinating - there was so much potential in it (just look at the late PS3 era games), but they could never get it to the point of supplanting the GPU.

[–] heythatsprettygood@feddit.uk 1 points 1 month ago (1 children)

If it's a reflow, your PS3 is running on borrowed time. A reflow heats the chip up enough that it expands and the GPU works again temporarily (the cracked solder bumps attaching the silicon to the interposer line up again), but eventually you'll be back to square one. The real fix is to replace the 90nm GPU with a later 65 or 45nm variant that has the fixed design (search up "PS3 Frankenstein mod" for more). There is thermal paste both under and above the IHS - the layer underneath takes heat from the silicon up to the IHS, then the top layer takes it on to the heatsink. Here's an image of a delidded RSX and Cell to show (apologies for the quality, it was the best one I could easily find).

PS3s did cook themselves, but not to the extent of the 360.

It is funny to think how many misdiagnosed 360s are probably out there with bad power supplies that have been subjected to the bolt mod (shudder) or something similar. It doesn't help that the three red lights just mean "general hardware fault" unless you do the button combination to get further information. I guess that's at least more helpful than the PS3, whose diagnostics were only made available recently after a key was cracked.

[–] heythatsprettygood@feddit.uk 1 points 1 month ago (3 children)

Your description of the Starlet is more accurate, yes. However, its heat output contributed to some of the issues with the ATI designed parts of the Hollywood, as it exacerbated the thermal stress the 90nm variants suffered from - stress that a better designed chip would have been able to handle.

The PS3's IHS was not the problem. There was decent contact and heat transfer - maybe not as good as it could have been (there's thermal paste instead of the IHS being soldered on, which is why a delid and relid is fairly essential if you have a working 90nm PS3, due to aging paste) - but definitely not a big enough problem for a properly designed chip to cook itself at the PS3's operating temperatures (a 75-80°C target on the RSX in an early variant at full load). The Cell B.E. next to the RSX draws more power (and consequently outputs more heat) and has a similar IHS setup, but IBM did not make the same design mistakes as NVIDIA, so we see very few reports of the CPU cooking itself even in those early PS3s.
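
If anyone wants to play with the numbers, here's a rough back-of-the-envelope sketch in Python of the heat path (die to under-IHS paste to IHS to top paste to heatsink). Every thermal resistance value in it is a made-up ballpark figure for illustration, not a measured RSX or Cell number - the point is just that aging paste raises the die temperature, but not by enough that a properly designed chip should cook.

```python
# Back-of-the-envelope die temperature for a die -> paste -> IHS -> paste -> heatsink stack.
# All resistance values are illustrative guesses, NOT measured RSX/Cell figures.

def die_temp(power_w, ambient_c, resistances_c_per_w):
    """Series thermal resistances: temperature rise = power * total resistance."""
    return ambient_c + power_w * sum(resistances_c_per_w)

fresh = [0.10,  # die -> IHS paste, fresh
         0.05,  # spreading through the IHS
         0.15,  # IHS -> heatsink paste, fresh
         0.50]  # heatsink -> internal air

aged = [0.30,   # under-IHS paste dried out and cracked
        0.05,
        0.35,   # top layer aged too
        0.50]

for label, stack in [("fresh paste", fresh), ("aged paste", aged)]:
    print(f"{label}: ~{die_temp(35, 40, stack):.0f}°C die at 35 W, 40°C case air")
```

Even the aged stack only lands in the low 80s under these assumptions, which a sound bump and underfill design should shrug off - the RSX's problem was the cycling, not the absolute temperature.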

[–] heythatsprettygood@feddit.uk 3 points 1 month ago

Those leather jackets won't buy themselves!

[–] heythatsprettygood@feddit.uk 3 points 1 month ago

Agreed, thermals were increasing faster than most manufacturers could handle. The only real exceptions in this era I can think of were IBM (because they had to - the PowerPC G5 was such a power hog it pissed off Apple enough to switch architectures) and Intel (because they also had to - the Pentium 4 was a disaster).

[–] heythatsprettygood@feddit.uk 2 points 1 month ago (6 children)

Wii was mostly okay, but boards with a 90nm Hollywood GPU are somewhat more likely to fail than later 65nm Hollywood-A boards (so RVL-CPU-40 boards and later), especially if you leave WiiConnect24 on, as it keeps the Starlet ARM chip active even in fan-off standby. Most 90nm consoles will be fine thanks to low operating temperatures, but some (especially as thermal paste ages and dust builds up) are more likely to die from bumpgate related problems.

PS3s did crap out with yellow lights of death, although not as spectacularly as 360 red rings (a lower proportion, thanks to beefier cooling and a different design that made the flaws less immediately obvious, but still a problem). NVIDIA made the same mistakes on the RSX as ATI did on the Xenos - poor underfill and bump choices that could not withstand the thermal cycles (see the sketch at the end of this comment), which should have been caught. NVIDIA and bumpgate is a whole wild story in and of itself, considering it plagued their desktop and mobile chips too. The Cell CPU on there is very reliable though, even though it drew more power and consequently output more heat - it was just the GPU that could not take the heat.

360s mostly red ringed due to faulty GPUs - see previous comments about the PS3 RSX. ATI had a responsibility to choose the right materials, design, and packaging partner before shipping to Microsoft for final assembly, so they must shoulder some of the blame (like NVIDIA, they also had trouble with their other products at this time, leading to high failure rates in devices like the early MacBook Pros). However, whether they are fully to blame is unclear, as we don't know who made the call on the final package design.
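
For anyone wondering what "could not withstand the thermal cycles" actually means, here's a toy Coffin-Manson style power law in Python. The exponent and reference lifetime are invented ballpark values rather than real RSX or Xenos reliability data - it's just to show how brutally sensitive bump life is to the temperature swing of each on/off cycle.

```python
# Toy Coffin-Manson power law: solder bump cycles-to-failure scales as (dT)^-n.
# Constants are invented for illustration, not real console reliability data.

def cycles_to_failure(delta_t, n=2.5, ref_cycles=10_000, ref_delta_t=40):
    """Scale a reference lifetime by the Coffin-Manson power law."""
    return ref_cycles * (ref_delta_t / delta_t) ** n

for dt in (30, 40, 60, 80):
    print(f"{dt}°C swing per cycle -> ~{cycles_to_failure(dt):,.0f} cycles")
```

Doubling the swing cuts the lifetime by a factor of five or so under these assumptions, which is why a hot-running console with marginal bumps and underfill dies in a couple of years instead of a couple of decades.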

[–] heythatsprettygood@feddit.uk 1 points 1 month ago

Yeah, pricing is not the greatest at the moment, most likely because there's no reference card to keep other prices in check. Still, at least here in the UK, they are well below the stratospheric NVIDIA prices for a 5070 Ti and are easily available.

[–] heythatsprettygood@feddit.uk 6 points 1 month ago (2 children)

AMD have been amazing lately. The 9070 XT makes buying most other cards in its price range pointless, especially with NVIDIA's melting connectors being genuine hazards. ATI (bought out by AMD in 2006, with the brand retired in 2010) and NVIDIA in the mid to late 2000s, however, were dumpster fires in their own ways.

I've been doing some experimentation lately with my Switch, and have replaced the thermal paste on all layers (under the copper heat spreader, under the heat sink, and on top of the heat pipe). I used a combination of Thermal Grizzly PhaseSheet (for the heat spreader and heat sink) and K5 Pro (for the heat pipe), if anyone is curious. I am running a Switch with the original SOC (higher heat and power) that I bought in about late 2018.

I wasn't expecting much, if anything, but games seem to have had a bit of a performance improvement, especially in docked mode. The two games where I especially noticed it were Fire Emblem: Three Houses and Pokemon Violet. In Three Houses, the loading sections in explore mode (fully in game, with everything still showing on screen) took far less time than I remember (1-3 seconds compared to 5-10), with a higher frame rate during the loading, and in Violet the frame rate seems a decent bit more stable.

Unfortunately, since my Switch is not modded, I do not have any hard performance numbers to back up these perceived improvements. I have heard that the Switch will prioritise CPU clocks over GPU when loading due to thermal restrictions, but I cannot find anything about it being able to boost both when it has the thermal headroom. The outside of the Switch is definitely exhausting more heat, which could suggest more heat is being pulled away from the SOC and cooling it more effectively.

Does anyone have any experience with doing this, and have you gotten a similar result? I am curious to find out whether this is just placebo, or whether it is genuinely a way to gain a little extra performance.
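
For anyone who wants to sanity check their own repaste without modding, a stopwatch and a handful of timed load screens go a long way. Here's a minimal Python sketch of the comparison - the sample times are hypothetical stand-ins based on my rough recollections, not logged data:

```python
# Crude before/after comparison of stopwatch-timed load screens.
# These sample times are hypothetical stand-ins, not real measurements.
from statistics import mean, stdev

before = [5.2, 7.8, 9.5, 6.1, 8.4]  # seconds per load, pre-repaste
after = [1.4, 2.7, 1.9, 2.2, 3.0]   # seconds per load, post-repaste

for label, times in [("before", before), ("after", after)]:
    print(f"{label}: mean {mean(times):.1f} s, stdev {stdev(times):.1f} s")

# If the gap between the means dwarfs the spread of both samples,
# the improvement is probably not placebo.
print(f"gap: {mean(before) - mean(after):.1f} s")
```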
