Oh, man, imagine thinking that minimum requirements weren’t a thing before.
I once deleted the operating system just to fit a single game on my hard drive, booted from a floppy while I was playing it, and reversed the process when I was finished. Some games were written for one specific speed of computer, and if yours didn’t run at that exact number of megahertz the game played like a slideshow or in fast forward. I didn’t realize for years that some of my favourite games had been running under their intended speed. In the early 3D era we didn’t even have the concept of things running at your screen’s refresh rate until the APIs fully standardized. Sometimes you upgraded your GPU and the hardware-accelerated version of your old software-rendered game actually ran slower.
Also, game developers “then” made arcade games that literally charged you money for dying, then charged you more money for effectively cheating at the game, and actively asked you to pay to win. We used to think that was normal.
Also, also, we used to OBSESS over games being bigger. The size a game took up was heavily advertised and promoted, especially on consoles. Bigger was better. We were only kinda glad that CDs could do 500 MB so we could keep getting bigger on a single disc, but by the time FMV games got popular, triple-A games were back to coming in books with discs instead of pages. This was still seen as a selling point.
Also, also, also, the assembly code of a whole bunch of old games is sheer spaghetti. Half of the mechanics in NES games are just bugs. There are a couple of great YouTube channels that break these down and tweak them. In fairness, those devs didn’t have development tools so much as a notepad and a pencil, but still.
There’s some nostalgia goggles for sure.
I mean, the demo for RollerCoaster Tycoon (Mr. “Hand-coded in assembly” there) bricked our Windows 98 machine when I installed it as a kid. My dad was pissed: we had to reformat the hard drive, reinstall Windows, all that.
Seems like it was a golden era of running everything in ring 0, although it wasn’t called that back then, afaik.
I remember having three or four games that you had to boot the computer into directly. As in, insert floppy and ctrl-alt-delete to launch the game.
Yeah, I remember the specific clock speed thing! I had a game that I loved on a friend’s computer and didn’t get to play it much. Some sort of space sim / combat game. Years later I had my own much more powerful machine and was hyped to check it out. Installed it via DOSBox or whatever, loaded it up, and it ran at fucking 10x speed! It took seconds to walk around a city and the combat was completely unplayable. So sad but also pretty funny. No idea why they attached the FPS directly to the hardware. If you want an easier game, just get a worse computer apparently.
It’s the most trivial and straightforward thing to do. The game is a simple loop of:

get user input (can be nothing)
calculate the new game state based on the old state and the input
draw the new game state

The speed of the game is now 100% dependent on the speed of computation. NOT attaching FPS to hardware is the hard part, as you need to detach the game-state loop from the drawing loop and then synchronize them. Doing that yourself is extremely complicated. Today developers don’t even need to think about it, because the whole drawing loop is abstracted away by things like DirectX/Vulkan and the game engine. But without those tools, FPS tied to CPU speed is basically the default.
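Here’s a minimal C sketch of both loops, for the curious. The `update_state`/`draw` names are just stand-ins for a real game, and the fixed 60 Hz tick rate is an arbitrary choice for illustration:

```c
#include <stdio.h>
#include <time.h>

/* Stand-ins for a real game's steps. */
static int ticks;
static void update_state(void) { ticks++; }   /* advance the simulation */
static void draw(void)         { /* render the current state */ }

/* C11 wall clock in seconds. */
static double now_seconds(void) {
    struct timespec ts;
    timespec_get(&ts, TIME_UTC);
    return (double)ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    /* The naive loop from above: one update per drawn frame, so game
       speed IS CPU speed:
           while (1) { update_state(); draw(); }                      */

    /* Decoupled version: simulate at a fixed 60 Hz no matter how fast
       or slow the machine draws. */
    const double dt = 1.0 / 60.0;
    double prev = now_seconds(), acc = 0.0;
    while (ticks < 60) {                 /* run ~1 simulated second */
        double now = now_seconds();
        acc += now - prev;               /* wall time owed to the sim */
        prev = now;
        while (acc >= dt) {              /* catch up one tick at a time */
            update_state();
            acc -= dt;
        }
        draw();                          /* draw whenever there's time */
    }
    printf("simulated %d fixed ticks\n", ticks);
    return 0;
}
```

On a faster machine `draw()` just runs more often between ticks; on a slower one the inner loop runs several ticks per frame to catch up, so the game plays at the same speed either way.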
And in fairness, a lot of microcomputers at the time were closed specs. Even on PC, for a while you were theoretically aiming at a 4.77MHz XT or, at worst, also wanted to account for an 8MHz AT. By the time IBM clones had become… you know, just PCs, a lot of devs either didn’t get the memo or chose to ignore it for the reasons you list.
Most of the time “lazy devs” are just “overworked and underfunded devs”, but the point is, that didn’t start this century.
Also, games have gotten way more complicated since the Game Boy Color era. I’ve coded a basic 2D physics engine from scratch (literally just circles with soft collisions) and it’s not enough just to set up the vector math correctly. You can make a true-to-real-life physics model (as far as the math of infinitely rigid perfect spheres on a perfectly flat plane goes, anyway) and still have all sorts of problems crop up, because computers aren’t the universe and order of computation is a bitch.
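To make “order of computation” concrete: floating-point addition isn’t even associative, so summing the same forces in a different order can give a different state. A tiny C demo:

```c
#include <stdio.h>

int main(void) {
    /* The same three "forces", summed in two different orders. */
    float a = 1e8f, b = -1e8f, c = 1.0f;
    printf("(a + b) + c = %f\n", (a + b) + c);  /* prints 1.000000 */
    printf("a + (b + c) = %f\n", a + (b + c));  /* prints 0.000000:
        c gets rounded away when added next to 1e8 */
    return 0;
}
```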
Even the first Dark Souls had game ticks tied to the FPS because consoles had been standardized to 30 FPS for decades.
On the PC port, it was locked to 30 FPS, but a super popular mod unlocked the framerate, and at 60 FPS DoT effects ticked twice as fast; at even higher FPS they could kill you before you had time to react.
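A sketch of that failure mode with made-up numbers: if a damage-over-time effect ticks once per rendered frame, its total damage scales with framerate, while a delta-time-scaled version stays constant:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical poison effect designed around a 30 FPS lock:
       5 damage per tick, lasting 10 seconds. */
    const double dmg_per_tick = 5.0;
    const double duration_s   = 10.0;

    for (int fps = 30; fps <= 120; fps *= 2) {
        double frames       = fps * duration_s;
        /* Frame-locked: one tick per rendered frame. */
        double frame_locked = dmg_per_tick * frames;
        /* Time-scaled: damage defined per second, multiplied by the
           frame delta, so total damage is framerate-independent. */
        double dt           = 1.0 / fps;
        double time_scaled  = (dmg_per_tick * 30.0) * dt * frames;
        printf("%3d FPS: frame-locked %5.0f dmg, time-scaled %5.0f dmg\n",
               fps, frame_locked, time_scaled);
    }
    return 0;
}
```

At 30 FPS both come out to 1500 damage; at 60 FPS the frame-locked version doubles to 3000 while the time-scaled one stays at 1500.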
GTA San Andreas has an option to uncap the framerate on PC, which outright breaks certain mechanics.
Sound Blaster.
So glad things like that are a thing of the past.
Hah. In fairness, sound cards weren’t “minimum requirements”. It’s just that depending on the hardware you had, the game would have a completely different soundtrack, 75% of which sounded completely broken. If you were lucky, the “minimum spec” was silence. If you were unlucky, it was your beeper sounding like somebody had tripped a car alarm.
People these days are out there emulating Roland MT-32s on Raspberry Pis. I didn’t have a sound card until the Pentium era. Every DOS game in my memory sounds like a Furby got a bad case of hiccups.
I leave this as an example, but please understand this is the absolute best case scenario. Michael Land and the rest of the Lucas guys were wizards and actually cared to tune things for multiple options, including really impressive beeper music.
https://www.youtube.com/watch?v=Fr-84mjV3CI
I had a TI-99/4A (It’s part of the reason I’m a Texas Drunk) with the speech synthesizer peripheral. Everything sounded wild.
I have heard the difference between sound cards before, in a video explaining it, but it is still just wild to me to hear it, and nearly a bit difficult to imagine it actually being that way. Like, I KNOW that’s how sound on computers was at the time, but it is still hard to imagine my games sounding so completely different depending on what PC I play them on.
I have the opposite problem, where I have to remind myself that a lot of people making these memes just don’t have a frame of reference for any of this. I’m used to having been there for the vast majority of home computing; it’s hard for me to parse having been born with computers just mostly working the way they do now.
Oh, and while I’m at it, it also looked completely different:
https://www.youtube.com/watch?v=xo2_ksqxbiQ
Changing GPUs these days mostly just changes your framerate. That wasn’t always the case.
I can somewhat comprehend the difference in appearance and sometimes gameplay, but at the same time not really. I have seen the same game be different on PC vs console, with a third version on handheld, and while I know these were all computers, I still very much think of them as game consoles you could also do computer things on, even though I know they were computers that you could play games on.
I blame it a bit on terminology: every time I hear about old computers, they are referred to much more like how we refer to game consoles today than how we refer to computers. It is an Amiga 500, Amiga 1000, or an Atari 7800 or Atari ST. In my head that is much more similar to, like, Nintendo Wii, Nintendo Switch, PlayStation 2 or PlayStation 4.
I have never really heard computers referred to in that manner nowadays. They probably are to some extent in some circles, but I have completely managed to miss it, and I do have some interest in computers. Like, I can tell you I have owned a Dell, an HP and a Lenovo among others, but I would really have to do some digging to maybe be able to tell you what model each of them was.
I know my current one starts with G and what follows looks somewhat like “lam”, but I only know that because it kinda looked like “Glam”, so I named it that, and because I have needed to Google that exact model to look up some stuff.
I can, however, understand how you feel, having grown up with the computers and now talking to and interacting with a lot of people that never experienced the older ones. Realising that people have a completely different frame of reference for something is a very weird thing to experience and somewhat difficult to navigate.
I am, however, happy that you and other people do have different experiences, as that way I could learn about how sound cards made games sound completely different, or how changing your GPU or computer manufacturer could completely change the game.
Man I loved the hell out of my SB16. I still play a lot of old DOS games in emulation and work pretty hard to get them to sound like I remember vs the higher fidelity versions.
set blaster=a220 i7 d1 h5 t6
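(For the uninitiated: that’s the BLASTER environment variable DOS games read to find the card. If I remember the format right, A220 is the port address 220h, I7 is IRQ 7, D1 is the 8-bit DMA channel, H5 is the 16-bit “high” DMA channel, and T6 is the card type, a Sound Blaster 16.)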
The last time I had that bug was with Oblivion.
It was the first time I played it and found the combat frustratingly difficult because of the increased speed. Especially in dungeons where I had to bait enemies one by one just to not get overwhelmed. One hand was always holding a healing spell as well.
Sounds like Commander Keen?
Edit: I meant Wing Commander
Damn there’s a throwback. Annnnd I feel old now hah.
That’s it! Wing Commander!
At my buddy’s house, he had a game called something like “Wings of Glory” that was meant for an older clock speed. We were messing with the turbo button, and it quickly became unplayable when not in the slower mode.
If you try it again, emulators like DOSBox let you slow the game down to be playable. I don’t remember the exact setting, but I’ve had to do it on things like Freddy Pharkas, iirc.
Ecco the Dolphin was made specifically hard to ensure people couldn’t beat it on a rental over a weekend.
Right. That was common, too. Games were tiny and very expensive, so broken balance was often used to pad out length. And yeah, it got crazy once Americans started popularizing rental and publishers got desperate to make the games less economical to beat without purchasing them.
I did finish Ecco 1 legit, though. Once.
I’ve tried the last couple of stages a few times. I still don’t understand how tween me managed that. Even on a CRT with original hardware and zero lag that’s a stupid thing to try to do.
Pink Floyd’s “Welcome to the Machine” still gives me flashbacks to the last stage of Ecco 1.
Bigger was better, since a larger game meant they packed in more content. Now the bloat is out of control since all game content is delivered over the Internet.
Bloat is out of control because games are HUGE and you can often trade size for performance if you have enough memory to do so.
Also, memory used to be extremely expensive, especially cartridge ROMs. Outside of the Switch this is less of a concern now, that’s true, but the tradeoff is you get pin-sharp high-resolution assets and tons of performance optimizations instead of… you know, chopping just enough frames of animation to fit your sprites in 16 megabits and then charging a hundred bucks for the extra-sized cart. You can buy a terabyte of extremely fast storage now for the money a single cartridge game used to cost.
When I got my brand new 486 PC, I paid over $800 for a 4 MB SIMM. That is 4 MEGS, not GIGS: 4 MB. That brought my memory up to 8 MB.
I was also king of the hill when I added a second hard drive for a total of 40 MB!
The hard drive I had to wipe the OS from, as I mentioned above, was a whole 20 gig. 386-ish era. It seemed so huge when I got it (and so expensive), and by the time that PC was done it was… well, a “wipe the OS to fit stuff in” drive.
But that’s not necessarily the point; the more relevant thing is how big things are relative to storage and how cheap storage is to upgrade. It’s true that storage sizes and prices plateaued for a while, so a bunch of people are still running on 1-2 TB while games got into the hundreds of GB. But storage had gotten proportionately so cheap before then, and very fast storage is such overkill now. A 1TB Gen 3 NVMe is 75 bucks, and most games will run fine on it, Sony propaganda notwithstanding.
20 GB during the 386 era does not check out for home PCs.
20 MB is more realistic for that era.
20 megs maybe.
Hah. Yeah, I meant 20 megs. My muscle memory just doesn’t want to type a number that low, it seems.
You can fit loads of X360/PS3-era games in the same space CoD Warzone takes. The irony is that, for PC players with lower specs, that’s a lot of wasted storage, since they’ll never use/load the higher-res textures.
You can buy a terabyte of extremely fast storage now
That line of thinking is what leads to extreme, unnecessary bloat. “Just buy more storage, brah”
You can absolutely do that. You can also fit nine full 720p frames of the Xbox 360 game into a single 4K frame of the Xbox Series X game.
Sometimes people forget how much bigger a 4K target is compared to a 720p image, so I added a bit of a visual aid below. Those two screenshots are to scale, displayed at the native resolutions of their respective platforms. Just keep in mind that the big one is from a 1440p 21:9 monitor, so on a 4K TV the picture would have two of those stacked on top of each other.
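The arithmetic behind that, at standard 16:9 resolutions:

$$\frac{3840 \times 2160}{1280 \times 720} = \frac{8{,}294{,}400}{921{,}600} = 9$$

An uncompressed 32-bit frame grows accordingly, from about 3.5 MB at 720p to about 33 MB at 4K.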
It’s good that this is smaller, because if you squint you can also notice the Xbox 360 game is extremely blurry and looks like it’s in black and white. That’s because it is. The 360 had 512 MB of RAM to share between the CPU and the GPU. The Xbox Series X has 16 gigs, so 32 times more, and it’s running a cool 300 times faster. 360 games were compressing textures within an inch of their lives to fit them into that tiny slab of memory, stripping color data among other things.
Computers are not magic. If you want to draw the eight million pixels of a 4K wall and not have it look like soup, you need data for each of those pixels. If you want that data to fit in less space, you have to either spend resources compressing and decompressing it or you need more storage to put it in. Or you can draw it procedurally, I guess, but then you’re back to the performance problem.
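To put rough numbers on that compress-or-store tradeoff, here’s a back-of-the-envelope sketch in C, using BC1/DXT1 (the classic GPU block-compression format) and an arbitrarily chosen texture size:

```c
#include <stdio.h>

int main(void) {
    /* One 4096x4096 texture (size picked arbitrarily). */
    const long w = 4096, h = 4096;
    long rgba8 = w * h * 4;   /* uncompressed, 4 bytes per pixel */
    long bc1   = w * h / 2;   /* BC1/DXT1 block compression, 0.5 B/px */
    printf("uncompressed RGBA8: %ld MiB\n", rgba8 >> 20);  /* 64 MiB */
    printf("BC1 compressed:     %ld MiB\n", bc1 >> 20);    /*  8 MiB */
    return 0;
}
```

That 8:1 saving is why block compression is everywhere, but it isn’t free: it costs quality and decode work, which is the tradeoff described above.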
On the other thing, it’s not “just buy more storage, brah”; it’s that storage has to ramp linearly with memory. If you are trying to build huge worlds running at hundreds of frames and streaming data at gigabytes per second out of an SSD, you’re going to need to put those assets somewhere. The problem isn’t (just) that games are big; it’s that the ability to move those big assets has grown a bit faster than the ability to make bigger, faster storage at the same price.
Games aren’t big because developers are lazy; they’re big because physics and engineering are hard and not every piece of technology improves at the same rate. But hey, on the plus side, storage HAS gotten cheaper. By the end of its life the PS3 was shipping with 500 GB. The PS5 and Xbox Series ship 32 times more RAM but only 1.5 to 2x more storage, because storage is where everybody is skimping to contain costs. That’s not commensurate with the increase in visual fidelity or asset size, but at least you can add more for relatively little money, especially on PC.
EDIT: Sorry, this client didn’t like the picture going in. Link to an example below from a random image hosting site. Follow it at your peril, I make no claims about its safety.
https://ibb.co/Ss7RfzW
I remember reading an early-2000s book on game dev. It did mention that some game (I want to say one of the Unreal games, but I can’t recall for sure) had to code their level loading in assembler because it was taking upwards of 10 minutes in C++.
Yeah, I definitely think the OP has super rose-colored glasses on. The free shareware was pretty awesome, though. I had one called “80 mega-hits” or something like that with a ton of games (many of which my poor old PC couldn’t run).
I do think that optimization has slacked off more as hardware prices generally trended down. Disk space I don’t mind so much, but memory and CPU are still expensive.
This is one of those things where I’m not sure what people mean when they say it.
There are bugs that affect performance, and yeah, we’re generally more likely to see bugs for several reasons now. But there’s also games just being heavy. We’re not in a cycle where top-of-the-line hardware just maxes out most games, because… well, we’re doing real-time path tracing, we have monitors that go up to 400 Hz and resolutions up to 4K. The days of “set it to Ultra and forget about it” on a 1080 are gone and not coming back for a really long time. Plus everything has to scale wider now, because on the other end we have actual handhelds, which is nuts.
So yeah, I’m not sure which one people are complaining about these days. I’ll say that if you can play a game on a handheld PC and then crank it up to look like an offline-rendered path-traced movie, that’s way more thought given to scalability than older games ever had, but maybe that’s a slightly different conversation.
The “free shareware” thing is kind of back. I’ve been noticing more and more games putting out demos; check out Steam Next Fest, for example.
I also remember playing a ton of games from a CD. I had a Mac at the time, but it was “DOS compatible”, which meant it had a 486 in addition to the Mac processor, so you could switch over into DOS, though you could only allocate half the RAM to it.
We ended up installing Windows 95 to play a lot of the games, which ran great on the available 4 MB of RAM.
I used to have a meta-game where I tried to fit X-Wing and Windows 3.1 on the same 40MB hard drive. Just barely made it.
I remember that for Mega Man we needed to turn off the turbo (yes, the CPU button) or it ran really fast.
Maybe the opposite: the turbo button actually slowed down the CPU so you could play games that had a speed limit.
It was actually 700 MB
Oh, no. It was not.
The smallest standard for CDs was 63 minutes and 550-ish MB. For most of the life of the medium you’d mostly get the 74 min, 650 MB one. The stretch 700-and-up standards came fairly late in the day. I tend to default to 500 in my head because it was a decent way to figure out how many discs you’d need to store a few gigs of data back in the day, though, not because I spent more time with the 63 min CDs.
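For reference, those capacities fall straight out of the sector math: Mode 1 data CDs store 2048 bytes per sector at 75 sectors per second, so

$$63 \times 60 \times 75 \times 2048\ \text{B} \approx 554\ \text{MiB}, \qquad 74 \times 60 \times 75 \times 2048\ \text{B} \approx 650\ \text{MiB}$$

which is where the “550-ish” and 650 figures come from.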
I think I came along around 2 years after burners were commercially available, so I never saw that. And the 700 MB discs came along very shortly after. So I never had a concept of a 550 MB CD (btw, you said 500 MB). This is the first I’ve heard of it.
It did exist, I promise. But again, I just default to 500 because it was such common shorthand to think of it in terms of needing two discs to store a gig. And to this day I still have 650 MB CDs lying around; even when the 700 MB ones were available, both were around at once.
I think some of the mismatch may also be that you’re thinking about it in terms of writable storage only (i.e. CD-Rs) because of your age, and I’m probably a bit older and was mostly talking about them as read-only media. It was years between the first CDs in the late 80s and writers being widespread at all, so assuming whatever game or application came on a single CD was going to take 500 meg-ish to duplicate or install was, again, pretty useful.
In any case, this is obsolete trivia. The point is we went from games being tens of megs to hundreds of megs overnight, and we were all extremely pleased about it.
You really seem to be misremembering again, since the original CD spec could hold 650 MiB of data.
You are half right. I was misremembering 63 min as the original standard of the Red Book audio CD; that was 650 already, although apparently 63 min CDs were used for audio mastering at some point? Info about that is sparse. As a side note, man, modern search engines suuuck.
Anyway, 63min/550MB was the low capacity standard of the CD-R instead.
People are aware of them, but man, it took me a while to find a contemporary technical reference to them being available. I ended up having to pull it from the Wayback Machine:
https://web.archive.org/web/20070110232445/http://www.mscience.com/faq55.html
And also this, from an eBay auction selling a box and labelling them “incredibly rare”, which apparently is accurate. I came just shy of digging through my pile of old CDs to see if I have any left. I may still do that next time I have them on hand.
Well now you’re changing the conversation to CD-R, not just CD.
No, I’m… correcting myself. That’s how correcting a statement works: you make a new statement. Read my previous comment carefully.
Same, in my case as a European. PAL is weird.
Oh, that’s a whole other subject. “Old games were so polished and fully finished”. Meanwhile, half of the planet was either playing games squished down, in slow motion or both. And most of them didn’t even know.
It’s not as simple as that, either. Many people think all games just ran 15% slower. Many games did have some retiming somewhere, but it was definitely not great, and people didn’t complain because, with no internet, they often didn’t realize what was going on.
I once deleted the operating system just to fit a single game into my hard drive
I remember doing this for Battle of Britain and TIE Fighter! Man, memories.