Probably consistency.
Zelda was 20 fps, but it was 20 fps all the time so your brain adjusted to it. You could get lost in the world and story and forget you were playing a game.
Modern games fluctuate so much that it pulls you right out.
Well, a good chunk of it is that older games only had ONE way they were played: ONE framerate, ONE resolution. They optimized for that.
Nowadays they plan for 60 or 120, and if your hardware can't hit that, too bad. Upgrade for better results.
Game design is a big part of this too. First-person games, or anything with fine camera control, feel very bad when mouse movement lags.
I agree with what the other commenters are saying too: if it feels awful at 45 fps, your 0.1% low frame rate is probably more like 10 fps.
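For anyone wondering what a "0.1% low" actually means: it's roughly the average FPS of your slowest frames over a run. Tools differ in exactly how they compute it, but a rough sketch (with invented frame times) looks like this:

```python
# Rough sketch of "1% low" / "0.1% low" FPS from logged per-frame times.
# The frame times below are invented just to illustrate the idea.

def percentile_low_fps(frame_times_ms, percent):
    """Average FPS over the slowest `percent` of frames."""
    worst_first = sorted(frame_times_ms, reverse=True)
    count = max(1, int(len(worst_first) * percent / 100))
    avg_ms = sum(worst_first[:count]) / count
    return 1000.0 / avg_ms

# Mostly ~22 ms frames (45 FPS) with the occasional 100 ms hitch:
frame_times = [22.0] * 990 + [100.0] * 10

print(round(percentile_low_fps(frame_times, 100), 1))  # overall average: ~43.9
print(round(percentile_low_fps(frame_times, 1), 1))    # 1% low: 10.0
print(round(percentile_low_fps(frame_times, 0.1), 1))  # 0.1% low: 10.0
```

The average says "basically 45 FPS" while the lows say "you're eating 100 ms hitches", and the hitches are the part you actually feel.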
I think it has something to do with frame times.
I'm not an expert, but I feel like I remember hearing that a low framerate with long delays before frames actually reach the screen feels worse than a low framerate where frames are delivered promptly and evenly. Something to do with the lag between rendering a frame and actually displaying it?
As some of the other comments mentioned, it's probably also the framerate dropping in general too.
Stuttering, but mostly it's the FPS changing.
Lock the FPS to below the lowest point where it lags, and suddenly it won't feel as bad since it's consistent.
The display being at a higher resolution doesn't help either. Running retro games on my fancy, massive hi-def flatscreen TV makes them look and feel so much worse than on the smaller, fuzzier CRT screens of the time.
I can't stand modern games with lower frame rates. I had to give up on Avowed and a few other recent titles on the Series S because it makes me feel sick when turning the camera. I assume most later titles on Xbox will be doing this as they're starting to push what the systems are capable of, and the Series S can't really cope as well.
Are you using an OLED screen?
I had to tinker with mine a fair bit before my PS1 looked good on it.
Stability of that frame rate is even more important. Modern games have problems with that.
Because they all fuck with the frame timing in order to try to make the fps higher (on paper)
Also latency! Playing with the high input latency that's typical when modern game engines choke is awful.
It's a few things, but a big one is the framerate jumping around (inconsistent frame time). A consistent 30fps feels better than 30, 50, 30, 60, 45, etc. Many games have a frame cap feature, which is helpful here. Cap the game at whatever framerate you can consistently hit and your monitor can display. If you have a 60hz monitor, start with the cap at 60.
Also, many games add motion blur, AI generated frames, TAA, and other things that really just fuck up everything. You can normally turn those off, but you have to know to go do it.
If you are on console, good fucking luck. Developers rarely include such options on console releases.
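If anyone's curious what a frame cap is actually doing, it's conceptually just this (a minimal sketch, not code from any particular engine):

```python
# Minimal frame limiter sketch: do the frame's work, then sleep off whatever
# is left of the frame budget so every frame takes the same amount of time.
# Not from any real engine; just the general idea behind an FPS cap.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def do_frame():
    time.sleep(0.005)  # stand-in for rendering work that takes ~5 ms

deadline = time.perf_counter()
for _ in range(10):
    do_frame()
    deadline += FRAME_BUDGET
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)            # finished early: wait out the budget
    else:
        deadline = time.perf_counter()   # blew the budget: resync instead of spiraling
```

A console game locked to 30 is doing essentially the same thing internally; the difference is you usually don't get a setting for it.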
30, 50, 30, 60, 30... That's FPS. Frame time means the time between each individual frame within that second.
It depends on what you're running, but often if the frame rate is rock solid and consistent, it feels a lot less stuttery. Fallout games are not known for their stability or smooth functioning, unfortunately.
For comparison, Deltarune came out a few days ago, and that's locked to 30 fps. Sure, it's not a full 3D game or anything, but there's a lot of complex motion in the battles and it's not an issue at all. Compare that to something like Bloodborne or the recent Zeldas: even after getting used to the frame rate they feel awful, because they're stuttering all the damn time.
I think player expectations play a big role here. It's because you grew up with 20fps on Ocarina of Time that you accept how it looks.
I'm pretty sure that game is not locked at 20 FPS and can jump around a bit between 15 and 20, so the argument that it's locked at 20 and therefore feels smooth doesn't really convince me.
If that game came out today as an indie game it would be getting trashed for its performance.
Funny
I played a lot of Lunistice some time back. It's a retro 3D platformer that has an option to cap the framerate at 20 for a "more authentic retro feel". Fun lil' game, even if I eventually uncapped the framerate because it's also a high-speed and precision platformer and doing that at 20FPS is dizzying.
And yes absolutely Zelda 64 chokes on its 20 frames from time to time. I played it enough (again, yearly tradition, which started when I first finished the duology in the mid-aughts) to know that.
But it wouldn't change the fact that its absolute maximum is 20 and it still doesn't feel bad to play.
Couple things. Frame timing is critical and modern games aren’t programmed as close to the hardware as older games were.
Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).
Lastly, with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It's why TVs have all that post processing and why there are no "dumb" TVs anymore. Removing the post processing improves input delay, but also removes everything that makes the image smoother, so higher frame rates are your only option there.
First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell that's a selling point that even some non-OLED displays emulate with backlight strobing, not something displays try to get rid of.
Also the inherent LCD latency thing is a myth; modern gaming monitors have little to no added latency even at 60 Hz, and at high refresh rates they are faster than 60 Hz CRTs.
Edit: to be clear, this is the screen's refresh rate, the game doesn't need to run at hfr to benefit.
I don’t understand all the technicals myself but it has to do with the way every pixel in an OLED is individually self-lit. Pixel transitions can be essentially instant, but due to the lack of any ghosting whatsoever, it can make low frame motion look very stilted.
Also the inherent LCD latency thing is a myth, modern gaming monitors have little to no added latency even at 60hz, and at high refresh rates they are faster than 60hz crts
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
The LCD latency has to do with input polling and timing based on display latency and polling rates. Also, there’s added latency from things like wireless controllers as well.
The actual frame rate of the game isn't necessarily relevant: if you have a game running at 60 Hz on a 120 Hz display and enable black frame insertion, you will have reduced input latency at 60 fps due to doubling the refresh rate on the display, increasing the polling rate since it's tied to frame timing. Black frame insertion or frame doubling doubles the frame, cutting input delay roughly in half (not quite that, because of overhead, but hopefully you get the idea).
This is why, for example, the Steam deck OLED has lower input latency than the original Steam Deck. It can run up to 90Hz instead of 60, and even at lowered Hz has reduced input latency.
Also, regarding LCD, I was more referring to TVs since we’re talking about old games (I assumed consoles). Modern TVs have a lot of post process compared to monitors, and in a lot of cases there’s gonna be some delay because it’s not always possible to turn it all off. Lowest latency TVs I know are LG as low as 8 or 9ms, while Sony tends to be awful and between 20 and 40 ms even in “game mode” with processing disabled.
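For the refresh-rate side of this, the back-of-envelope numbers (assuming, as described above, that this slice of the latency scales with the refresh interval; real pipelines add more on top):

```python
# Back-of-envelope: how long you wait for the next refresh at various rates.
# Assumes this portion of latency scales with the refresh interval, per the
# comment above; it ignores everything else in the pipeline.
for hz in (60, 90, 120, 240):
    interval_ms = 1000.0 / hz
    print(f"{hz:3d} Hz: refresh interval {interval_ms:5.2f} ms, "
          f"average wait ~{interval_ms / 2:5.2f} ms, worst case {interval_ms:5.2f} ms")
```

Going from 60 Hz to 120 Hz cuts that wait from ~16.7 ms to ~8.3 ms worst case, which is where the "roughly half" comes from.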
That’s a misunderstanding. CRTs technically don’t have refresh rates, outside of the speed of the beam. Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
Essentially, the speed of the beam determined how many lines you could display, and the more lines you tried to display, the slower the screen was able to refresh. So higher resolutions would have lower max refresh rates. Sure, a monitor could do 120 Hz at 800x600, but at 1600x1200, you could probably only do 60 Hz.
Standards were settled on based on power frequencies, but CRTs were equally capable of 75, 80, 85, 120Hz, etc.
That's why I specified 60hz :)
I see that you meant TVs specifically but I think it is misleading to call processing delays 'inherent' especially since the LG TV you mentioned (which I assume runs at 60hz) is close to the minimum possible latency of 8.3ms.
True, but even that is higher than the latency was on the original systems on CRT. My previous comments were specific to display tech, but there’s more to it.
Bear in mind I can’t pinpoint the specific issue for any given game but there are many.
Modern displays, even the fastest ones, have frame buffers for displaying color channels. That's one link in the latency chain. Even if the output were otherwise exactly as fast as a CRT, this would cause more latency in 100% of cases, as CRT was an analogue technology with no buffers.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
I mentioned TVs above re: post processing.
In modern games, delays are sometimes added to synchronize data between the CPU and GPU, which adds further latency.
Older consoles were simpler and didn’t have shaders, frame buffers, or anything of that nature. In some cases the game’s display output would literally race the beam, altering display output mid-“frame.”
Modern hardware is much more complex and despite the hardware being faster, the complexity in communication on the board (CPU, GPU, RAM) and with storage can contribute to perceived latency.
Those are some examples I can think of. None of them alone would be that much latency, but in aggregate, it can add up.
The latency numbers for displays, i.e. the 8-9 ms or 40 ms figures, include any frame buffer the display might or might not have. If the number is less than one frame time, it's safe to assume the display isn't buffering whole frames before displaying them.
Your GPU has a frame buffer that’s essentially never less than one frame, and often more.
And sometimes less, like when vsync is disabled.
That's not to say the game is rendered from top left to bottom right as it is displayed, but since the render time has to fit within the frame time, one can be certain the render started one frame time before it finished, and it's displayed on the next vsync (if vsync is enabled). That's 22 ms for 45 fps, another 16 ms for a worst-case vsync miss, and 10 ms of display latency, making about 48 ms. Majora's Mask at 20 fps would be 50 ms render + 8 ms display = 58 ms of latency, assuming it doesn't miss vsync either.
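That arithmetic written out (the 10 ms and 8 ms display figures are the assumptions from this thread, not measurements):

```python
# Latency budget from the comment above. The display figures (10 ms modern
# monitor, 8 ms CRT-era path) are this thread's assumptions, not measurements.

def latency_ms(fps, display_ms, vsync_hz=None):
    render = 1000.0 / fps                                # render fits in one frame time
    vsync_wait = 1000.0 / vsync_hz if vsync_hz else 0.0  # worst-case wait for next vsync
    return render + vsync_wait + display_ms

print(round(latency_ms(45, 10, vsync_hz=60)))  # ~49 ms (the ~48 ms above, with less rounding)
print(round(latency_ms(20, 8)))                # 58 ms for a locked 20 fps
```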
Old games' animations were sometimes made frame by frame, like the guy who drew the character pixel by pixel deciding "and in the next frame of this attack the sword will be here."
Part of it is about how close you are to the target FPS. They likely made the old N64 games to run somewhere around 24 FPS, since that was an extremely common "frame rate" for the CRT TVs of the time. Therefore, the animations of basically everything that moves in the game can be tuned to that frame rate. It would probably look like jank crap if they made the animations have 120 frames for 1 second of animation, but they didn't.
On to Fallout 4... Oh boy. Bethesda jank. Creation Engine game speed is tied to frame rate. They had several problems at the launch of Fallout 76 because if you had a really powerful computer and unlocked your frame rate, you would be moving 2-3 times faster than you should have been. It's a funny little thing to do in a single-player game, but absolutely devastating in a multiplayer game. So if your machine is chugging a bit and the frame rate slows down, it isn't just the rate of new images appearing that slows down, it's the speed at which the entire game does stuff. It feels bad.
And also, as others have said, frame time, dropped frames, and how stable the frame rate is make a huge difference too in how it "feels".
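The "game speed tied to frame rate" pitfall, in miniature. This is obviously not Creation Engine code, just the generic mistake: moving a fixed amount per frame instead of scaling by how long the frame actually took.

```python
# Generic illustration of framerate-dependent vs framerate-independent movement.
# Not Creation Engine code; the numbers are invented.

def distance_fixed_step(fps, seconds, step_per_frame=1.0):
    # Framerate-dependent: a fixed step every frame, so more FPS = more speed.
    return step_per_frame * fps * seconds

def distance_delta_time(fps, seconds, speed_per_second=60.0):
    # Framerate-independent: each frame's step is scaled by its delta time.
    dt = 1.0 / fps
    return sum(speed_per_second * dt for _ in range(int(fps * seconds)))

for fps in (30, 60, 144):
    print(fps, distance_fixed_step(fps, 1.0), round(distance_delta_time(fps, 1.0), 1))
```

At 144 fps the fixed-step version covers 2.4x the distance of the 60 fps version in the same real-world second, which is the "moving 2-3 times faster" effect described above, and the same reason everything slows down when the frame rate tanks.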
I have never come across a CRT whose native "frame rate" was 24
Yeah it's actually around the 30s (or 60s, depending on whether you consider interlaced frames to be 'true' or just 'halves')
A CRT television runs at 60Hz because it uses the alternating current from the wall as a reference, but in every half cycle it only actually draws half of the image. "60i" as they call it.
So you can say it's 60 interlaced frames a second, which is about comparable to 30 progressive frames.
Two things that haven't quite been mentioned yet:
1) Real life has effectively infinite FPS, so you might expect that the closer a game is to reality, the higher your brain wants the FPS to be in order for it to make sense. This might not be true for everyone, but I imagine it could be for some people.
More likely: 2) If you've played other things at high FPS you might be used to it on a computer screen, so when something is below that performance, it just doesn't look right.
These might not be entirely accurate on their own and factors of these and other things mentioned elsewhere might be at play.
Source: Kind of an inversion of the above: I can't focus properly if games are set higher than 30 FPS; it feels like my eyes are being torn in different directions. But then, the games I play are old or deliberately blocky, so they're not particularly "real" looking, and I don't have much trouble with real life's "infinite" FPS.
My favorite game of all time is Descent, PC version to be specific, I didn't have a PlayStation when I first played it.
The first time I tried it, I had a 386SX at 20 MHz, and Descent, with the graphics configured at the absolute lowest size and quality, would run at a whopping 3 frames per second!
I knew it was basically unplayable on my home PC, but did that stop me? Fuck no, I took the 3-floppy-disk installer to school and installed it on their 486DX 66 MHz computers!
I knew it would just be a matter of time before I got a chance to upgrade my own computer at home.
I still enjoy playing the game even to this day, and have even successfully cross compiled the source code to run natively on Linux.
But yeah I feel you on a variety of levels regarding the framerate thing. Descent at 3 frames per second is absolutely unplayable, but 20 frames per second is acceptable. But in the world of Descent, especially with modern upgraded ports, the more frames the better 👍
Descent broke my brain. I'm pretty good at navigating in FPSes, but Descent's 4 axes of movement just didn't click for me. I kept getting lost. Recently I tried it again after many years, and I still can't wrap my head around it.
Same with space sims. I'm dog awful in dogfights.
Indeed, it's not quite a game for everyone, especially if you're prone to motion sickness. Initially it only took me about a half hour to get a feel for the game, but configuring the controls can still be a headache.
Every time I set the game up on a new or different system, I tend to go for loosely the same sort of controls, but with each new setup I might change them up a bit, like an endless guess-and-test game to see what controls might be ideal, at least for me.
By the way, Descent is considered a 6 Degrees Of Freedom game, not 4. But hey, at least they have a map feature, I'd go insane without the map sometimes..
I meant 6, not sure why I typed 4.
Great games. FreeSpace was almost mind-blowing when I first played it as well.
If you're not playing on a VRR display, 45fps *will* be horrible and stuttery. 30fps locked would feel significantly better.
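A quick way to see why 45 fps judders on a plain 60 Hz screen while a locked 30 doesn't: frames finish at a steady rate but can only be shown on vsync boundaries, so some frames sit on screen twice as long as others. A toy simulation (assuming the game can keep rendering ahead, e.g. with triple buffering; not a measurement of any real game):

```python
# Toy simulation: frames rendered at a steady rate, but only presented on the
# vsync boundaries of a fixed 60 Hz display (no VRR). Prints how long each
# frame actually stays on screen.
import math
from fractions import Fraction

def on_screen_times_ms(game_fps, display_hz=60, frames=9):
    frame_time = Fraction(1, game_fps)   # time to render one frame
    vsync = Fraction(1, display_hz)      # time between display refreshes
    presented = []
    for i in range(1, frames + 1):
        done = i * frame_time                     # frame finishes rendering here
        shown = math.ceil(done / vsync) * vsync   # held until the next vsync
        presented.append(shown)
    # Gap between consecutive presentations = how long each frame is displayed.
    return [round(float((b - a) * 1000), 1) for a, b in zip(presented, presented[1:])]

print(45, on_screen_times_ms(45))  # [16.7, 16.7, 33.3, 16.7, 16.7, 33.3, ...] -> judder
print(30, on_screen_times_ms(30))  # [33.3, 33.3, 33.3, ...] -> perfectly even
```

A VRR display sidesteps this by refreshing whenever the frame is actually ready.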
Some old games are still pretty rough at their original frame rate. I recently played four-player GoldenEye on an N64, and that frame rate was pretty tough to deal with. I had to retrain my brain to process it.
Out of curiosity, did you have an actual N64 hooked up to a modern TV? A lot of those old games meant to be played on a CRT will look like absolute dog shit on a modern LCD panel. Text is harder to read, it is harder to tell what a shape is supposed to be, it's really bad.
Trinitron, baby
FPS counters in games usually display an average across multiple frames. That makes the number legible if the FPS fluctuates, but if it fluctuates really hard frame-by-frame, the number can seem inaccurate. If I have a few frames here that were output at 20 FPS and a few there at 70 FPS, the average of those would be 45 FPS. However, you could still very much tell that the framerate was either very low or very high, which would be perceived as stutter. The old games you mention were probably frame-capped to 20 while still having lots of processing headroom to spare for more intensive scenes.
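To make the averaging point concrete, here's a tiny example with invented frame times, alternating between ~14 ms (~70 FPS) and 50 ms (20 FPS) frames:

```python
# Invented frame times alternating between ~70 FPS and 20 FPS frames.
frame_times_ms = [14.3, 50.0] * 4

instantaneous_fps = [1000.0 / t for t in frame_times_ms]

# Two common ways a counter might summarize this second:
mean_of_fps = sum(instantaneous_fps) / len(instantaneous_fps)            # ~45
frames_per_second = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)   # ~31

print([round(f) for f in instantaneous_fps])        # [70, 20, 70, 20, ...]
print(round(mean_of_fps), round(frames_per_second))
```

Either way you get one fairly ordinary-looking number, but every other frame is a 50 ms chunk, and that's the stutter you actually feel.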
Bro, when Majora's Mask came out nothing was 60 fps lol. We weren't used to it like we are today. I'm used to 80 fps, so 60 feels like trash to me sometimes.
Ackshuli -- By late 2000 there were a couple games on PC that could get there.
.... If you were playing on high-end hardware. Which most PC gamers were not. (despite what Reddit PCMR weirdos will tell you, PC gaming has always been the home for janky hand-built shitboxes that are pushed to their crying limits trying to run games they were never meant to)
Regardless that's beside the point -- The original MM still doesn't feel bad to go back to (it's an annual tradition for me, and I alternate which port I play) even though it never changed from its 20FPSy roots.
Yeah, but even now you can go back and play Majora's Mask and it doesn't feel bad.
But as mentioned, the real thing is consistency, as well as the scale of the action, the pace of the game, etc. Zelda games weren't sharp, pinpoint-control games like, say, a modern FPS; gameplay was fairly slow. And yeah, the second factor is simply that games that were 20 FPS were made to be a 100% consistent 20 FPS. A game locked at 20 will feel way smoother than one that alternates between 60 and 45.