The_Great_Sephiroth
New member
I originally used a GTX 550 Ti with 1GB. Played just fine back in those days. I still have the card. I also did not realize that the 2080 only had 8GB. I have the Ti version which has 11GB.
"Also, Ryzen is trading blows with Intel single-threaded-wise as well. They're literally neck and neck with each other. AMD does win in overall performance though. Also, good luck getting an Intel CPU that can trade blows with Ryzen given their slow production rate."

Fox, you are correct. I was thinking of the game "Raft". That game can eat cores, even on my 6950X. As for per-core performance, it is why I pay a premium for Intel chips unless the box will be doing something that benefits from more cores, such as 4K60 encoding.
Yeah, I've been fed that line before. We have two Ryzens at my big client in town. Turns out Intel is faster. After having two Ryzens with identical issues, we relegated those PCs to a less demanding department and ordered Intel systems with similar specs. Intel has no issues with the software the client runs.
I also remember when the XP1800+ was way cheaper than the 1.8GHz P4 despite being 400MHz slower. Turns out a wider bus doesn't make up for 400MHz in-game either. I can name a lot of instances like these. I know what people claim and I know what the fake benchmarks show, which don't account for bus width or design, among other things. AMD keeps talking smack (ray-tracing was a pipe-dream, then ray-tracing was common and they'd have it soon, then they finally released it years after nVidia and it is not even 50% as fast) and always fails to deliver in real-world scenarios, but I digress. I do respect AMD for pushing the core limit up, but let's be honest, not many games even use four cores, much less more.
"Stock market doesn't paint the full picture. Most of that stock market is based on industrial use, like supercomputers / servers, not residential gamers. Besides, AMD is currently at 50.8% market share, passing Intel in desktop CPUs according to this article: https://www.techradar.com/news/amd-overtakes-intel-in-desktop-cpu-market-share-for-the-first-time-in-15-years"
I own and have beaten 2077. On an i7-6950X. With everything maxed. Ran fine for me. The biggest buff I have seen as the owner of a decent CPU is that when DLSS is enabled you get a MAJOR FPS boost. In fact, the way I understand it, many players cannot play without it, likely due to low core counts or sub-par PCs.
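If anyone wonders where that boost comes from: DLSS renders internally at a lower resolution and upscales, so the shading cost drops roughly with the pixel count. Here's a quick sketch of the math. The per-mode scale factors below are the commonly cited ones (Quality ~67%, Balanced ~58%, Performance 50% per axis), so treat them as approximate, not official numbers:

```cpp
// Rough arithmetic for why DLSS boosts FPS: fewer pixels shaded per frame.
// Scale factors are the commonly cited per-axis values; approximations only.
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;  // 4K output as an example
    const struct { const char* mode; double scale; } modes[] = {
        {"Native",      1.0},
        {"Quality",     2.0 / 3.0},
        {"Balanced",    0.58},
        {"Performance", 0.5},
    };
    for (const auto& m : modes) {
        int w = static_cast<int>(outW * m.scale);
        int h = static_cast<int>(outH * m.scale);
        // Fraction of native shading work, assuming cost scales with pixels.
        double pixelLoad = double(w) * h / (double(outW) * outH);
        printf("%-12s renders %4dx%-4d -> %3.0f%% of native pixel load\n",
               m.mode, w, h, pixelLoad * 100.0);
    }
}
```

Performance mode shades a quarter of the pixels, which is why the FPS jump is so dramatic on weaker rigs.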
I'm not at war with AMD. If it was so much better their stock would show it. The simple fact is that you have all this fake fanfare out there, something AMD is awesome at generating, but I have yet to see it matter in the real world. Intel has always out-performed in the real world, and my company does yearly testing so we can update our offerings based on customer needs. Yes, we even do gaming rig builds, and our top performers in actual games have always been Intel chips, but we don't buy bad chips. We buy their X series and similar, which is probably why we have had such good experiences with them in comparison to AMD. Not that we don't purchase and sell AMD, but they are generally relegated to budget builds, encoding setups, SOHO servers, etc.
A final thing I will note is that while AMD has focused on shrinking their lithography, they had chips at 7nm while Intel was still on 14nm and only beginning to roll out its more thoroughly tested 10nm chips. AMD's 7nm chips were barely ahead of Intel's aging 14nm chips at the time. That speaks volumes about the differences between the two. I am sure AMD was embarrassed and has since made more efficient use of 7nm lithography, but that alone keeps me from buying their chips the day they arrive.
Oh, and let's not forget the "each core shares one cache which slows us way down" fiasco. I believe that was only fixed on AMD chips within the last year, with the current-gen Ryzens. Intel learned that lesson on the Core 2 Duos eons ago.
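Not AMD-specific, and not their actual CCX layout, but if you want to see what cores fighting over shared cache does, here's a minimal C++ sketch of cache-line contention (false sharing, a related shared-cache effect). It assumes 64-byte cache lines, which holds on most x86 parts, and the timings will vary by machine:

```cpp
// Minimal demo of two cores contending for one cache line ("false sharing").
// Counters on the same 64-byte line get bounced between cores; padding gives
// each counter its own line. Illustration only, not a benchmark suite.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

constexpr long kIters = 50'000'000;

// Both counters land on the same cache line: the cores fight over it.
struct SameLine { std::atomic<long> a{0}; std::atomic<long> b{0}; };
// Padded to 64-byte boundaries: each counter sits on its own line.
struct OwnLines { alignas(64) std::atomic<long> a{0}; alignas(64) std::atomic<long> b{0}; };

template <typename Counters>
double timedRun() {
    Counters c;
    auto start = std::chrono::steady_clock::now();
    std::thread t1([&] { for (long i = 0; i < kIters; ++i) c.a.fetch_add(1, std::memory_order_relaxed); });
    std::thread t2([&] { for (long i = 0; i < kIters; ++i) c.b.fetch_add(1, std::memory_order_relaxed); });
    t1.join();
    t2.join();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
}

int main() {
    printf("same cache line: %.2f s\n", timedRun<SameLine>());
    printf("separate lines:  %.2f s\n", timedRun<OwnLines>());
}
```

Compile with -pthread; on most multi-core boxes the padded version is several times faster, which is the whole point.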
Don't know how old you are, but I have a guy I formed my clan with in 1999 who is still with me. He's a Marine and he travels, but we both have a good laugh about the time he bought a Radeon 9800 (AGP, years ago) and the ATI driver did reverse-rendering to gain a few FPS and keep up with the nVidia cards of that day, which I had. The thing is, drawing player models (and plants, furniture, etc) last meant that in most games, he had a permanent, LEGAL wallhack (I'll sketch how that worked below). He sent me screenshots from UT99 and the Infiltration mod we played and he saw all players and such all the time. ATI released a fix, but it cost him some FPS. Still, those were fun times. ATI and AMD have always had driver issues with their GPUs though. Remember the Omega drivers? I do!

"While I like the fact that AMD is catching up to the greedy Nvidia in GPUs, I still prefer to buy Nvidia cards just because of gaming support dominance. If AMD can get their drivers and software to not suck and if AMD can gain dominance in gaming support, then I would happily switch over to AMD GPUs. In my opinion, support matters more than cost, hence why I tolerate Nvidia's pricing."
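Side note on why that 9800 bug worked: with a working depth test, draw order doesn't matter, but skip the test and whatever is drawn last wins, walls included. A toy software sketch of the idea (illustrative only, nothing to do with ATI's actual driver code):

```cpp
// Toy one-pixel "rasterizer" showing why drawing players last without a
// depth test makes them visible through walls. Purely illustrative.
#include <cstdio>
#include <string>

struct Pixel { std::string color = "sky"; float depth = 1e9f; };

// Draw a surface at the given depth; the depth test can be toggled.
void draw(Pixel& p, const std::string& color, float depth, bool depthTest) {
    if (!depthTest || depth < p.depth) {  // without the test, last draw always wins
        p.color = color;
        p.depth = depth;
    }
}

int main() {
    // A wall at depth 5 sits in front of a player at depth 10.
    Pixel correct, buggy;

    draw(correct, "wall",   5.0f,  true);
    draw(correct, "player", 10.0f, true);   // fails the depth test: hidden, as it should be
    printf("with depth test:    %s\n", correct.color.c_str());  // "wall"

    draw(buggy, "wall",   5.0f,  true);
    draw(buggy, "player", 10.0f, false);    // drawn last, test skipped: wallhack
    printf("without depth test: %s\n", buggy.color.c_str());    // "player"
}
```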
I'll disagree with their GPUs. They may be closing in on the 2080, but the 3090 is just WAY beyond anything else out there, and the 3080 is not far behind. CPUs? AMD is literally pricing Intel out of the market. I will say this: the comparison linked above is Intel on old lithography versus a new 7nm AMD chip, and the AMD is only 5% better overall. That speaks volumes to how sloppy AMD is. They should be 50% better or more since they can cram twice as many transistors onto the same die as Intel can, but they aren't.
"I'm 36. And yes, I remember Omega drivers. It was often the only way I could get AMD GPUs to work properly."

Then you were born in 1985. I was born in 1980. Glad you remember those drivers. No idea what happened to them though.
Heck, I do remember my first assembly of a PC where I had to choose between an S3 Trio and a Voodoo Rush ))) I made a bad choice then)), and my first PC was a 33MHz IBM with a TURBO mode to 66MHz which did exactly nothing))) 10MB hard drive! And before that I was frequently visiting one of my friends, as his father was some sort of computer scientist at the time and he had a "portable" computer using a spool of magnetic tape)))
Meganoth, I believe they are only 5% better because this is how they are reducing power and heat. Half the lithography but the same transistor count means more space to breathe. AMD has always run hotter and drawn more power in the past. Perhaps while Intel twiddles its thumbs AMD will close that gap, and by the time Intel realizes something is wrong, AMD will make better use of their stuff for more speed also. Time will tell.
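The "space to breathe" part is just area math. If the node names were literal (they're marketing labels, so treat this as idealized), halving the feature size quarters each transistor's footprint:

```cpp
// Idealized area scaling for a full node shrink. "14nm" and "7nm" are
// nominal marketing names, not measured feature sizes, so this is the
// back-of-the-envelope version only.
#include <cstdio>

int main() {
    const double oldNode = 14.0;  // nm, nominal
    const double newNode = 7.0;   // nm, nominal
    const double linear = newNode / oldNode;  // 0.5x in each dimension
    const double area = linear * linear;      // 0.25x footprint per transistor
    printf("Same transistor count fits in %.0f%% of the old area,\n", area * 100);
    printf("leaving %.0f%% of the die as breathing room (or space for %.0fx the transistors).\n",
           (1 - area) * 100, 1 / area);
}
```

On paper that's 4x the transistors in the same area, so spending the shrink on power and heat instead of raw speed would fit the small 5% gap we're seeing.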