AMD exists, people.
Please, put your money where your mouth is?
Not at the higher end, it doesn’t.
Very few people actually need, or even make use of, the power that Nvidia’s high-end cards provide.
Dunno about this generation ’cause I’m not in the market for new parts, but AMD usually comes within spitting distance at much lower prices.
It makes less and less sense to buy Nvidia as time passes, yet people still do.
Have you seen Nvidia’s 50 series? AMD is accidentally higher-end now.
Or, you know, buy an AMD card and quit giving your money to the objectively worse company.
They tried this with the 4080 12 GB… This time they dug in and went whole hog with all the 5080s.
Shame AMD isn’t going to be competitive.
For the people looking to upgrade: always check the used market in your area first. For now it’s quite obvious the best thing to do is try to get a 40 series card from the drones who must have the 50 series.
Lol, this reminds me of whatever that card was back in the 2000s or so, where you could literally draw a trace with a pencil to upgrade the lower-tier version to the higher one.
I remember using the pencil trick on my old AMD Duron processor to unlock it. 🤣
Yeah, those were the days when cost control meant using the same PCB but with some of the traces left out. There were also quite a few cards that used the exact same PCB, traces intact, where you could simply flash the next tier card’s BIOS and get significant performance bumps.
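For the curious: the flash itself was usually done with NVIDIA’s nvflash utility. A minimal sketch of that workflow, wrapped in Python purely for illustration (the flag names here are from memory and varied a lot between nvflash versions, so treat them as assumptions):

```python
# Rough sketch of the old BIOS cross-flash workflow, assuming NVIDIA's
# nvflash CLI is on PATH. Flags varied by version; "-6" historically
# overrode the PCI subsystem ID mismatch check that flashing a
# higher-tier card's BIOS would trip.
import subprocess

def backup_and_flash(new_bios: str, backup: str = "backup.rom") -> None:
    # Always dump the current BIOS first so the card can be restored.
    subprocess.run(["nvflash", "--save", backup], check=True)
    # Flash the next tier card's BIOS, overriding the ID mismatch check.
    subprocess.run(["nvflash", "-6", new_bios], check=True)

if __name__ == "__main__":
    backup_and_flash("next_tier_card.rom")
```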
Did a few of those mods myself back in the day, those were fun times.
OK, now how do I turn my 2070S into a 5090? 😅
Get 500 dollars then use AI to generate the other 3/4 of the money and buy a 5090.
Well, if you ask Nvidia, it’s now just the driver making frames for you.
Ya, I’m sad to see AMD exit. Maybe in a year or two I’ll get that Sapphire 7900 XTX or whatever it is.
Liquid nitrogen cooling
My 2070 is still treating me pretty well!
Mine too, except at 3440x1440 with 144 Hz. I can hardly hit that resolution and refresh rate in more demanding games.
Yeah, same, but I only have a 1440p monitor, and I can barely tell the difference past 90 Hz anyhow.
There were several GPUs and CPUs where that was true
Yeah. Those Durons were a stupidly good deal at the time since you could overclock the snot out of them and get a CPU on par with a top of the stack one for absolute pennies.
Unless they caught fire. But that mostly usually didn’t happen all that often sometimes.
Vote with your wallets. DLSS and Ray Tracing aren’t worth it to support this garbage.
Wish more gamers would, but that ship has long since sailed, unfortunately. I mean, look at what the majority of gamers tolerate now.
Yeah, unfortunately you’re probably right. Brand image is also still good, somehow
Just don’t buy a new card unless you really need to.
I bought a 3080 because that gen seemed like a pretty solid bump in performance for the price. The new Nvidia gen is underwhelming so just wait.
Or if you really need one now buy a 30 or 40 series card.
That’s fair. I’m on an all AMD system with no need to change, tbh. 7800XT.
This is the way. Have a 3090, 980ti before that. Really no need to get anything new yet.
Similar for me: had a 980ti, then got a 6800xt a few years ago, and I’m probably fine until about 2030.
I got a 3070ti but my old one was something like a 780? I don’t remember exactly. Before that I had a Radeon 5970. That’s an upgrade roughly every 7 years I guess.
I waited like 6 months to get a 5080.
I’m glad I waited; now I can get a 4080 for a good price.
I’ve got the feeling that GPU development is plateauing: new flagships consume an immense amount of power and their sizes are humongous. I do give DLSS, local AI, and similar technologies the benefit of the doubt, but they’re just not there yet. GPUs should be more efficient and improve in other ways.
And then you pick up a Steam Deck and play games that were originally meant to run on cards the size of the Steam Deck.
I’ve said for a while that AMD will eventually eclipse all of the competition, simply because their design methodology is so different compared to the others. Intel has historically relied on simply cramming more into the same space. But they’re reaching theoretical limits on how small their designs can be; they’re being limited by things like atom size and the speed of light across the distance of the chip. AMD, on the other hand, has historically used the same dies for as long as possible, and relied on improving their efficiency to get gains instead. They were historically a generation (or even two) behind Intel in terms of pure hardware power, but still managed to compete because they used their chips more efficiently. As AMD also begins to approach those theoretical limits, I think they’ll do a much better job of actually eking out more computing power.
And the same goes for GPUs. Nvidia recently resorting to the “just make it bigger and give it more power” design philosophy likely means they’re also reaching theoretical limitations.
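To put a rough number on the speed-of-light point above, here’s a quick back-of-the-envelope check (plain arithmetic; the 5 GHz clock is just an illustrative assumption):

```python
# How far light can travel during one clock cycle at 5 GHz.
c = 3.0e8                  # speed of light in a vacuum, m/s
clock_hz = 5.0e9           # assumed 5 GHz core clock
cycle_s = 1.0 / clock_hz   # 0.2 ns per cycle
print(f"{c * cycle_s * 100:.1f} cm per cycle")  # ~6.0 cm
# Signals in silicon propagate at only a fraction of c, so the practical
# reach per cycle is far shorter -- already comparable to die and package
# dimensions, which is exactly the limit described above.
```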
AMD never used chips “more efficiently”. They struck gold with the Ryzen design, but everything before it, going back to the Athlon, was horrible and more useful as a room heater. And before the Athlon it was even worse. The K6/K6-2 were funny little buggers that extended the life of Socket 7, but they lacked a lot of features, and don’t get me started about their DX4/5 stuff, which frequently died in spectacular fashion.
Ryzen works because of chiplets and cache stacking. Add some very clever stuff in the pipeline, which I don’t presume to understand, and the magic is complete. AMD is beating Intel at its own game: its ticks and tocks are way better and, most importantly, actually executed. And that is something Intel hasn’t really been able to do for several years. It only now seems to be returning.
And let’s not forget the USB problems with Ryzen 2/3, and the memory compatibility woes of Ryzen’s past and, some say, present. Ryzen is good, but it’s not “clean”.
In GPU design AMD clearly does the same, but executes worse than Nvidia. The 9070 can’t even match its own predecessor, and the 7900 XTX is again a room heater, anything but efficient. And let’s not talk about what came before: the 6xxx series was good enough but troublesome for some, and the Radeon VII was a complete shitfest.
Now, with the 9070, AMD once again, for the umpteenth time, promises that the generation after will fix all its woes, that it will finally compete with Nvidia.
Trouble is, they’ve been saying that for over a decade.
Intel is the one looking at GPU design differently. The only question is: will they continue, or axe the division now that Gelsinger is gone? That would be monumentally stupid, but if we can count on one thing, it’s the horrible shortsightedness of corporate America. Especially when Wall Street is involved. And with Intel, Wall Street is heavily involved. Vultures are circling.
Just like I rode my 1080ti for a long time, it looks like I’ll be running my 3080 for a while, lol.
I hope in a few years, when I’m actually ready to upgrade, the GPU market isn’t so dire… All signs are pointing to no, unfortunately.
Just like I rode my 1080ti for a long time
You only skipped a generation. What are you talking about?
I had the 1080ti well after the 3080 release. I got a great deal and had recently switched to 1440p, so I pulled the trigger on a 3080 not long before the 4000 series cards dropped.
I’m still riding my 1080ti…
My 1080ti was probably peak Nvidia for me. It was so good, and I was super happy with it. Truly felt like an upgrade.
If you’re still playing at 1080p, it’s still a capable card, tbh.
I gave mine to a friend who still had a 660 when I upgraded to the 3080 lol.
Oh yeah, it still gets the job done.
Keeping my eyes open for used ones to upgrade with now that the new series is out though. Gives me an excuse to get the 1080 in my server.
Me too
1060 6GB gang here… I will probably get a 3060 or 4060 next time I upgrade, unless I ditch Nvidia (thinking of moving to Linux).
4070ti super on linux here. It just works but I’m looking at amd cards to lower fan noise.
Yeah, I heard that issue is mostly fixed nowadays. But still, I’m not very fond of Nvidia and their business practices. Though I’m not really up to date on how well AMD does ray tracing.
Same here.
Only upgraded when my 1080 died, so I snagged a 3080 for an OK price. Not buying a new card until this one dies. Nvidia can get bent.
Maybe team red next time….
I’m extremely petty and swore off Nvidia for disabling ShadowPlay on my 660 unless I made an account; I had to downgrade my driver to get it back without one. I only have an Nvidia card now because I got a 3080ti for free after my RX 580 died.
Fuck yeah man, I love that conviction. If only everyone were so inclined, the world would be a better place.
I can get behind petty if it’s for the right reasons, and yours for sure is!
I hate everyone and their uncle requiring accounts for even the most menial of tasks. We all know it’s only for gathering and selling data, so kindly fuck off.
Philips requires an account just to turn on my Bluetooth nightstand light, a Hue Go. I’d rather spend 10 hours finding a workaround.
I’m still rocking a 2070 and doing great. Turns out the games that I like rarely depend on graphical fidelity, but rather on good visual design and game design.
But yeah if graphical fidelity is your bag, or if you need every possible frame for competitive reasons, then your options are much more limited and far more expensive. Sucks.
I can agree to that.
I’m super happy with 3440x1440, so I guess that helps out a bit too
Nvidia is just straight up conning people.
Where’s the antitrust regulation?
Jensen Huang is meeting with Trump to make sure there is none.
Trump’s America has none. But the FCC is suing public broadcast services. So that’s what we get.
For what exactly?
Your mom
Everyone trusts their mum.
I thrust your mums
Wow, it looks like it is really a 5060 and not even a 5070. Nvidia definitely shit the bed on this one.