I don’t know, it’s been another year of overpriced GPUs that didn’t advance as much as some past generations have.
Overpriced, for sure, but the 4080 and 4090 absolutely blow previous gens out of the water. Everything else does seem to fall on the side of incremental improvements, though; nothing big.
but the 4080 and 4090 absolutely blow previous gens out of the water
Just the 4090; the 4080 is not that big a jump compared to the 3090 Ti. Considering the 4080 trades blows with the 7900 XTX, I can't say it blows previous gen out of the water.
The 4090 isn’t that much better than the 4080, and the 4080 is much more energy efficient. I guess it all depends on what you consider “blowing previous gens out of the water,” but that’s sort of marketing nonsense.
If you have a 3090 Ti you aren’t going to be missing out on anything in current-gen games, and developers aren’t in a hurry to build exclusively for hardware most of their potential customers aren’t running. So basically, it’s just “blowing previous gens out of the water” on flair at the moment.
The 4090 isn’t that much better than the 4080, and the 4080 is much more energy efficient
The 4090 often beats the 4080 by a margin of 25% or more, and its 1% lows are often better than the average frame rates the 4080 delivers. If that is not enough to be considered much better, I don’t know what is.
Regarding your comments on efficiency: the high end is generally not as efficient as lower-tier GPUs, because those chips are pushed past the ideal point of the voltage-frequency curve to extract the most performance. Even so, in GPU-bound scenarios the 4090 does well: it uses 25-35% more power to deliver 20-30% better fps, which leaves it roughly 10% behind in fps-per-watt at worst. That is not enough, in my opinion, to claim that the 4080 is much more efficient.
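Quick back-of-envelope check on those ranges, treating efficiency as fps per watt. The power and fps deltas are the ones quoted above; real figures vary by game and by review, so take the exact outputs as a sketch:

```python
# Efficiency deficit of the 4090 vs. the 4080, given relative gains.
# fps_gain / power_gain are fractions (0.25 == "25% more").
def efficiency_deficit(fps_gain, power_gain):
    """How far behind the 4090 is in fps-per-watt, as a fraction.
    Negative means the 4090 is actually ahead."""
    return 1 - (1 + fps_gain) / (1 + power_gain)

best = efficiency_deficit(0.30, 0.25)   # fps at the high end, power at the low end
worst = efficiency_deficit(0.20, 0.35)  # fps at the low end, power at the high end
print(f"best case:  {best:+.1%}")
print(f"worst case: {worst:+.1%}")
```

The spread runs from about 4% ahead in the best case to about 11% behind in the worst, which is why “much more efficient” is a stretch.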
Article based on Hardware Unboxed’s review: https://www.techspot.com/review/2569-nvidia-geforce-rtx-4080/
I guess I just don’t see 25% as that much better when I’m pleased with my 4K setup and already getting 120+ fps in most games on a card that isn’t even current gen.
That 25-35% has been enough to cause a significant number of 4090 failures from loose contacts and microdebris in the power connectors. You aren’t just paying around 50% more for that 25%; you’re paying for the added risk, the higher power consumption, and the higher-capacity PSU to match, and as the PC ages it’s going to wear out sooner as well. All for what’s really an RTX selling point that barely any game dev uses, in a generation that all major coverage has criticized as particularly expensive.
I frankly still believe that if there was any generation to skip at launch, it’s this one. Intel is slowly but surely joining the GPU market, even if only on the low end, and AMD’s cards are competing with NVIDIA where it matters. For all the threats NVIDIA has made about dropping out because of its AI nest egg, it knows it needs to keep the reins on the 5000 series if it doesn’t want to get sidelined for good, and it will do so having seen how overpricing this generation may have dropped its sales significantly.
So basically, it’s just “blowing previous gens out of the water” on flair at the moment.
Barring major changes in how things are done, like the shift from rasterization to ray tracing, the top end of the GPU market has always been about flair as far as gaming goes, no?
Unfortunately, the 4090 is the cheap man’s alternative for AI hardware, and with the China ban, it’s going to stay overpriced for the foreseeable future, well past the blockchain mining craze.
128-bit bus… that is pretty bad.
Even the non-TI RTX 3060 had a 192-bit bus and 12 GB of memory. The fact that a lower tier and older product sounds more appealing is a major error in Nvidia’s judgment for this generation of budget cards.
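Rough memory-bandwidth comparison to put those bus widths in perspective. The bus widths are from the thread; the 17 Gbps (4060) and 15 Gbps (3060) GDDR6 data rates are the commonly reported specs, so treat the exact figures as assumptions:

```python
# Peak memory bandwidth: bytes per transfer (bus width / 8) times
# the effective data rate in gigatransfers per second.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"RTX 4060: {bandwidth_gbs(128, 17):.0f} GB/s")  # 272 GB/s
print(f"RTX 3060: {bandwidth_gbs(192, 15):.0f} GB/s")  # 360 GB/s
```

Even with faster memory chips, the narrower bus leaves the newer card with noticeably less raw bandwidth than its predecessor.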
I don’t understand why they couldn’t keep pushing the 3060 until they were truly ready for a 4060 that’s better. You don’t have to release new products on every single segment every single time. Especially if they confuse the buyers of that segment.
I don’t understand why they couldn’t keep pushing the 3060 until they were truly ready for a 4060 that’s better. You don’t have to release new products on every single segment every single time. Especially if they confuse the buyers of that segment.
I think that’s exactly why: confusing the less “hardcore” into thinking the 4060 is a massive leap from the 3060. It’s pure corporate greed, and it’s annoying as hell.
IMO the base 3060 is the best price/performance Nvidia card out right now and it’s over $100 cheaper than the 4060 Ti.
We seem to have hit a wall with CPU and GPU development, and it’s not going to be easy to overcome. My best guess is that future generations will have to focus on heat dissipation, because that’s basically the only thing left to improve.
I have a spreadsheet tracking all my personal builds since 2000. I used to average a new build (or upgrade) every 20 months. (It was my only hobby… I’m pretty boring.)
Last 10 years it’s been more like every 36 months. But that’s after having “the itch” for at least a year.
I’m 40 months into this build, and an upgrade is hardly worth it. Maybe next year, when it’s been a whole two generations of CPU/GPU?
Heh… I started building in the late ’90s with a custom 486-SX. Went from that to a Pentium 100, then to an AMD K6-II at 233, then to another AMD I forget… then to an i5-2500K in 2011. That build hit the sweet spot, and with just graphics card, memory, and drive upgrades it stayed roughly the same until I bought this i7-10700K last year… that’s now got 4 TB of NVMe and 8 TB of SSD, no spinning rust, and a 3070. I would like to upgrade the graphics card since I’ve got 2x 4K monitors and gaming is a bit sluggish at times at 4K, but… I just can’t justify the upgrade price for a graphics card that would be a real improvement. So I’ll wait another gen and probably get either a 4080 when the 5x series is out, or wait until the 5x-series 5070 equivalent is at a reasonable price. Other than increasing storage, I don’t see any other demands for CPU or GPU coming this console generation…