PC Specs?

I originally used a GTX 550 Ti with 1GB. Played just fine back in those days. I still have the card. I also did not realize that the 2080 only had 8GB. I have the Ti version which has 11GB.

 
Fox, you are correct. I was thinking of the game "Raft". That game can eat cores, even on my 6950X. As for per-core performance, it is why I pay a premium for Intel chips unless the box will be doing something that benefits from more cores, such as 4K60 encoding.
Also, Ryzen is trading blows with Intel in single-threaded performance as well. They're literally neck and neck with each other. AMD does win in overall performance, though. And good luck getting an Intel CPU that can trade blows with Ryzen, given their slow production rate.

 
Yeah, I've been fed that line before. We have two Ryzens at my big client in town. Turns out Intel is faster. After having two Ryzens with identical issues we relegated those PCs to a less demanding department and ordered Intel systems with similar specs. Intel has no issues with the software the client runs.

I also remember when the XP1800+ was way cheaper than the 1.8GHz P4 despite being 400MHz slower. Turns out a wider bus doesn't make up for 400MHz in-game either. I can name a lot of instances like these. I know what people claim and I know what the fake benchmarks show, which don't account for bus width or design, among other things. AMD keeps talking smack (ray-tracing is a pipe-dream, then ray-tracing is common and we'll have it soon, then they finally released it years after nVidia and it is not even 50% as fast) but always fails to deliver in real-world scenarios, but I digress. I do respect AMD for pushing the core limit up, but let's be honest, not many games even use four cores, much less more.

 
Some software like Adobe Premiere supports Intel more due to popularity... that will eventually change though if things keep going the way they're going. Everything takes time.

Gaming wise though, pretty sure AMD is the clear winner when it matters. Old games like CS:GO favor Intel, but who needs 400+ fps?

 
Yeah, I've been fed that line before. We have two Ryzens at my big client in town. Turns out Intel is faster. After having two Ryzens with identical issues we relegated those PCs to a less demanding department and ordered Intel systems with similar specs. Intel has no issues with the software the client runs.

I also remember when the XP1800+ was way cheaper than the 1.8GHz P4 despite being 400MHz slower. Turns out a wider bus doesn't make up for 400MHz in-game either. I can name a lot of instances like these. I know what people claim and I know what the fake benchmarks show, which don't account for bus width or design, among other things. AMD keeps talking smack (ray-tracing is a pipe-dream, then ray-tracing is common and we'll have it soon, then they finally released it years after nVidia and it is not even 50% as fast) but always fails to deliver in real-world scenarios, but I digress. I do respect AMD for pushing the core limit up, but let's be honest, not many games even use four cores, much less more.


Oh, we are back at the Intel-AMD wars.

You can always find software that runs better on Intel. Sometimes all that is necessary is that the company behind that software uses Intel compilers to do the work. That works very well for the 80% of PCs out there (Intel compilers are very, very good on Intel hardware) and somewhat less well on the other 20% of PCs.

That doesn't seem to hold for games though. Maybe because AMD is relatively popular with private buyers, there are relatively few games where it makes a difference. I'm not talking about "tuned" benchmarks by Intel and AMD themselves; gaming magazines (print and online) do real-world benchmarks with typical AAA games. And AMD's recent Ryzens (3xxx and up) are on par with the best CPUs Intel can field. (And for a time, until Intel released moderately priced 10xxx CPUs, magazines declared AMD the clear winner on all counts.)

Currently 4 cores are enough except for a few games, but the number of games that can use more is slowly growing. The same can be said about raytracing.

UPDATE: I checked out Cyberpunk 2077 because it is one of the most hardware-intensive games and looked for information on the cores it uses. I found a website that said a current 6-core from AMD or Intel is the best in price-performance, and the results indicated that 6 or even more cores are utilized. Then I found an article in German that surprised even me: it said that to run Cyberpunk with high details at good performance you NEED at least 6 cores WITH HT, and if you want raytracing then 8 or more cores WITH HT are advisable. Furthermore, sizable performance increases could be seen up to 12-core CPUs (i.e. 24 threads with HT), suggesting that Cyberpunk is able to use as many cores as you let it: https://www.computerbase.de/2020-12/cyberpunk-2077-benchmark-test/4/
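If anyone wants to sanity-check that kind of claim on their own machine rather than trust an article, one quick way is to sample per-core load while the game is running. A minimal sketch, assuming Python with the psutil package installed; the 50% "busy" threshold and sample count are arbitrary choices, not anything from the benchmark:

```python
# Rough sketch: sample per-core utilization for ~30 seconds while the game
# runs, then report how many logical cores it kept meaningfully busy.
# Requires the psutil package; threshold and sample count are arbitrary.
import psutil

SAMPLES = 30
BUSY_THRESHOLD = 50.0  # percent load needed to count a core as "busy"

busy_per_sample = []
for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks ~1s per sample
    busy_per_sample.append(sum(1 for load in per_core if load >= BUSY_THRESHOLD))

busy_per_sample.sort()
print(f"Logical cores: {psutil.cpu_count()}")
print(f"Median cores above {BUSY_THRESHOLD:.0f}% load: "
      f"{busy_per_sample[len(busy_per_sample) // 2]}")
```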

I would guess this is surely not the only AAA game that can already use any number of cores, as the era of speed increases through higher MHz is over and the gaming industry has known that for quite some time now.

Why are we suddenly talking about GPUs? Yes, if you want ray-tracing or DLSS-style enhancement, AMD is still a large step behind on the former and a medium step behind on the latter. In all other cases the current line-up is good enough that I don't have to buy anything from the cutthroats at nvidia.

 
I own and have beat 2077. On an i7-6950X. With everything maxed. Ran fine for me. The biggest buff I have seen as the owner of a decent CPU is that when DLSS is enabled you get a MAJOR FPS boost. In fact the way I understand it, many players cannot play without it, likely due to low core counts or sub-par PCs.

I'm not at war with AMD. If it was so much better their stock would show it. The simple fact is that you have all this fake fanfare out there, something AMD is awesome at generating, but I have yet to see it matter in the real world. Intel has always out-performed in the real world, and my company does yearly testing so we can update our offerings based on customer needs. Yes, we even do gaming rig builds, and our top performers in actual games have always been Intel chips, but we don't buy bad chips. We buy their X series and similar, which is probably why we have had such good experiences with them in comparison to AMD. Not that we don't purchase and sell AMD, but they are generally relegated to budget builds or encoding setups, or SOHO servers, etc.

A final thing I will note is that while AMD has focused on shrinking their lithography, they had chips at 7nm compared to Intel chips at 14nm as Intel began rolling out more thoroughly tested 10nm chips. AMD's 7nm chips were barely ahead of Intel's aging 14nm chips at the time. That speaks volumes about the differences between the two. I am sure AMD was embarrassed and has since taken advantage of 7nm lithography in a more efficient manner, but that alone keeps me from buying them the day they arrive. Oh, and let's not forget the "each core shares one cache which slows us way down" fiasco. I believe that was only fixed on AMD chips within the last year, with the current-gen Ryzens. Intel learned that lesson on the Core 2 Duos eons ago.

 
I'm not at war with AMD. If it was so much better their stock would show it.
The stock market doesn't paint the full picture. Most of that valuation is based on industrial use, like supercomputers / servers, not residential gamers. Besides, AMD is currently at 50.8% market share, passing Intel in desktop CPUs, according to this article: https://www.techradar.com/news/amd-overtakes-intel-in-desktop-cpu-market-share-for-the-first-time-in-15-years

And as far as I'm aware, Intel still hasn't really released any 10nm chips yet, as they struggle to produce any (or they haven't been able to tune their 10nm to compete with anything yet). And the nm figure is just a feature size that mainly improves power efficiency, which is why Intel runs so hot compared to AMD now. Running more efficiently means AMD can find new limits as they continue to fine-tune it with each generation. It takes time to fine-tune new technologies. Intel, on the other hand, is so desperate they sliced a layer off their chips just to get better contact with the heatsink so they can overclock ever so slightly more in order to keep up with AMD again. I imagine that makes them even more delicate too. AMD is already working on 5nm and 3nm chips while Intel still struggles to release any 10nm chips. At some point, Intel will need to spend some of their blood money and evolve in order to catch up; until then, they're still being lazy and cheap.

 
I own and have beat 2077. On an i7-6950X. With everything maxed. Ran fine for me. The biggest buff I have seen as the owner of a decent CPU is that when DLSS is enabled you get a MAJOR FPS boost. In fact the way I understand it, many players cannot play without it, likely due to low core counts or sub-par PCs.


Which isn't surprising, as the i7-6950X is a 10-core CPU, and that shows exactly what I was driving at. If you buy a 4-core CPU today you can still play a lot of games without problems, but I would call that not future-proof, or even a bad idea for the present already. The games industry is in the middle of changing and already supports a lot more cores.

I'm not at war with AMD. If it was so much better their stock would show it. The simple fact is that you have all this fake fanfare out there, something AMD is awesome at generating, but I have yet to see it matter in the real world. Intel has always out-performed in the real world, and my company does yearly testing so we can update our offerings based on customer needs. Yes, we even do gaming rig builds, and our top performers in actual games have always been Intel chips, but we don't buy bad chips. We buy their X series and similar, which is probably why we have had such good experiences with them in comparison to AMD. Not that we don't purchase and sell AMD, but they are generally relegated to budget builds or encoding setups, or SOHO servers, etc.

A final thing I will note is that while AMD has focused on shrinking their lithography, they had chips at 7nm compared to Intel chips at 14nm as Intel began rolling out more thoroughly tested 10nm chips. AMD's 7nm chips were barely ahead of Intel's aging 14nm chips at the time. That speaks volumes about the differences between the two. I am sure AMD was embarrassed and has since taken advantage of 7nm lithography in a more efficient manner, but that alone keeps me from buying them the day they arrive.


At another time, when AMD called their CPUs Athlon and Intel called theirs Pentium, it was exactly the other way round: AMD was trailing behind Intel's lithography and their CPUs were still more than competitive. So what does that tell us? Lithography is just one of hundreds of parameters that influence the power of a CPU.

Secondly, numbers like 7nm and 10nm are just labels now and not easily comparable, as that depends very much on what exactly is measured in the different lithographic techniques. Here is the info in layman's terms: https://www.techcenturion.com/7nm-10nm-14nm-fabrication

Quote: "What Intel calls as 10nm, is similar to what TSMC calls as 7nm".

So at the moment they produce roughly equally good CPUs (if we use gaming benchmarks) with transistors about equally densely packed (see the Transistor Density Comparison table; Intel's 10nm process is even slightly denser). There is no reason for AMD to be embarrassed; maybe the 10(?) times bigger Intel should be embarrassed that they can't get a lead with a research budget several times higher.

Oh, and let's not forget the "each core shares one cache which slows us way down" fiasco. I believe that was only fixed on AMD chips within the last year, with the current-gen Ryzens. Intel learned that lesson on the Core 2 Duos eons ago.


So you are saying Intel made the same mistake? And learned from it like AMD did now? Well, here is 1 point for both 😉

 
We should split this off into a CPU technical thread. I feel like we're OT now.

Anyway, I can only find a single story about the very brief time AMD got ahead of Intel late last year/early this year due to Intel having production issues. You know, due to the world being closed for business. AMD has suffered also, don't get me wrong, but they had better output, so it was a matter of people needing to have something.

This is from Jan 5, 2021

On the other hand, AMD lags way behind, as usual. Also as usual, AMD is still "closing the gap". Been told that since like, 2001? At the rate of closing that gap they should overtake Intel in about a thousand years.

Worldwide CPU Share

Passmark results updated daily

AMD still closing that gap

Again, both manufacturers make good chips, but Intel still dominates in every case we have put them through, including gaming. I do own a few AMD systems, but they are generally for other things. In fact I have a quad-core AMD laptop running PCLinuxOS for the wife to do web-stuff, email, streaming video, and even basic games on. Good system, but it runs hotter than my eight-core i5 laptop.

Speaking of which, we just had a discussion on here about AMD running hotter. A user was advised to check temps and another, with the same setup, mentioned the after-market fan/heatsink he was using on the chip to keep it cool. AMD may be cooler than prior generations but our Ryzens run hotter than the Intels when under load. I assume games would stress them the same way as production software. Especially 7 Days or Raft.
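For anyone who wants to do the same spot-check rather than go by memory, here is a minimal sketch for reading CPU temperatures under load. It assumes Linux, where psutil exposes lm-sensors data; the chip and label names vary by CPU and board (AMD typically shows up as "k10temp", Intel as "coretemp"):

```python
# Minimal temperature spot-check (Linux only; psutil reads lm-sensors data).
# Chip/label names differ by platform, so treat the output as a rough guide.
import psutil

temps = psutil.sensors_temperatures()
if not temps:
    print("No temperature sensors exposed on this platform.")
for chip, readings in temps.items():
    for r in readings:
        print(f"{chip:10s} {r.label or 'temp':12s} current={r.current:.1f}C "
              f"high={r.high} critical={r.critical}")
```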

Fox is right about Intel in one thing. They need to get their butts in gear. AMD is listening to their users and tossing on more cores so gamers can stream and play (or simply play 2077 and pray they don't overload the CPU), while Intel has not, and when they have added cores, they charge INSANE prices for them now. My 6950X was ridiculous, but the new chips are just stupid. It is the primary reason that I have NOT upgraded yet.

In fact, when two things happen I will likely build an AMD-based rig. First, I need a big jump in power and cores. I would like a 16-core, 32-thread setup with at least 50% more performance per core. Second, games need to work properly with AMD. Ark is a shining example of where, at least on Windows (not sure about Linux), AMD users have more issues simply for being AMD users, more crashes being a big one. Unreal Engine is and always has been set up for Intel/nVidia systems, and Epic now needs to give some love to AMD. I suppose a third thing would be for AMD to catch up to nVidia in ray-tracing. Once you have used it, you can NEVER go back. It makes things so much better! I imagine with Vulkan getting bigger and bigger, AMD can close this gap soon enough. I also suppose I could do the AMD CPU/nVidia GPU combo, but I'm not sure yet.

Now, probably to your surprise, we're about to build two HEAVY servers. We chose Threadrippers over Xeon-W despite the performance gap. Why? Price. Xeon-Ws are what we wanted, but they are well beyond the client's budget. Intel needs to stop their stupid "price a Xeon high so gamers don't buy them" crap. My motherboard (ASRock X99 Extreme4) can handle Xeons with more cores and even ECC RAM, but Xeons are just priced too high. I do hope AMD can put a hurt on Intel and get them to wake the heck up. Intel needs AMD as much as AMD needs Intel.

*EDIT*

Fastest AMD versus fastest Intel X. AMD has a whopping 5% overall better performance, but the Intel overclocks better. I do not OC. The big elephant in the room? The AMD is nearly half the price. This is why AMD ***IS*** going to kill Intel.

5% better, 50% cheaper
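Put in performance-per-dollar terms, that "5% better, nearly half the price" claim works out roughly like this; the scores and prices below are placeholders chosen only to show the arithmetic, not measured results:

```python
# Toy performance-per-dollar comparison using the rough figures from the post
# ("~5% faster overall, roughly half the price"). Placeholder numbers only.
def perf_per_dollar(score: float, price: float) -> float:
    return score / price

cpus = {
    "Intel flagship (hypothetical)": {"score": 100.0, "price": 1000.0},
    "AMD flagship (hypothetical)":   {"score": 105.0, "price": 550.0},
}

for name, c in cpus.items():
    print(f"{name}: {perf_per_dollar(c['score'], c['price']):.3f} points per dollar")
```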

 
And let's not forget the abysmal performance that is Intel's 10th and 11th gen CPUs. It's pretty sad when some of the 9th-gen equivalents actually end up performing better in real-world scenarios.

For most people right now, and since Gen 3 Ryzen, AMD is the choice for performance-per-dollar in the CPU market. After over 20 years of having the dust kicked in their face by Intel, they have risen above and are truly ahead of the game for workload and performance.  It used to be that if you wanted to do little things fast (like games), you went AMD. If you needed to handle a workload of serious computing, you went Intel. Now AMD is doing both, and they're doing it better than Intel. 

And with Nvidia @%$#ing over the consumer market the past couple of years, AMD is getting a foothold there as well. Sure, they don't have the lead on ray tracing or DLSS, but how big of a market is that really? Performance-wise, their latest generation of GPUs is on par with or ahead of Nvidia's.

We're talking consumer market here. PCs for normies. Not servers. :)

 
I'll disagree with their GPUs. They may be closing in on the 2080, but the 3090 is just WAY beyond anything else out there, and the 3080 is not far behind. CPUs? AMD is literally pricing Intel out of the market. I will say this. The comparison linked above is Intel on old lithography versus a new 7nm AMD chip. The AMD is only 5% better overall. That speaks volumes to how sloppy AMD is. They should be 50% better or more since they can cram twice as many transistors onto the same die as Intel can, but they aren't. Still, half the price even if it was the same performance says a lot to people on a budget. Like I said, I am looking at AMD for my next build unless Intel does something meaningful.

I would like to see Intel try to justify double the price for the same performance. That would be interesting.

 
While I like the fact that AMD is catching up to the greedy Nvidia in GPUs, I still prefer to buy Nvidia cards just because of their dominance in gaming support. If AMD can get their drivers and software to not suck, and if AMD can gain dominance in gaming support, then I would happily switch over to AMD GPUs. In my opinion, support matters more than cost, which is why I tolerate Nvidia's pricing.

 
With my newest computer, I went with the AMD Ryzen 7 3800X. I looked at the comparable Intel (Core i7-9700K), which, based on benchmarks, was better. However, the AMD gave me what I wanted and I got a really good deal on it. I am not a hardcore computer person like all of you, but I typically build my computers to last (7-10 years), and with the exception of adding new RAM or replacing a burnt-out component, I usually do a complete new build when the time comes and my computer is seriously lagging in performance.

It makes it easy to get my wife on board to spend money on a new computer for me when I point out that every time I build one, it lasts for a minimum of 7 years.

I originally used a GTX 550 Ti with 1GB. Played just fine back in those days. I still have the card. I also did not realize that the 2080 only had 8GB. I have the Ti version which has 11GB.


Mine is the 2080 Super. I don't think they had the Ti or the regular ones available when I was building it, but I lucked out on getting this one. I hope I can get a lot of years out of it, but I don't think I play games as demanding as you all do (with the exception of 7D2D). As I have gotten older, I have gotten into more casual games that I can enjoy when I am not spending time with the family 🙂

 
While I like the fact that AMD is catching up to the greedy Nvidia in GPUs, I still prefer to buy Nvidia cards just because of their dominance in gaming support. If AMD can get their drivers and software to not suck, and if AMD can gain dominance in gaming support, then I would happily switch over to AMD GPUs. In my opinion, support matters more than cost, which is why I tolerate Nvidia's pricing.
Don't know how old you are, but I have a guy I formed my clan with in 1999 who is still with me. He's a Marine and he travels, but we both have a good laugh about the time he bought a Radeon 9800 (AGP, years ago) and the ATI driver did reverse-rendering to gain a few FPS and keep up with the nVidia cards of that day, one of which I had. The thing is, drawing player models (and plants, furniture, etc.) last meant that in most games he had a permanent, LEGAL wallhack. He sent me screenshots from UT99 and the Infiltration mod we played, and he saw all players and such all the time. ATI released a fix, but it cost him some FPS. Still, those were fun times. ATI and AMD have always had driver issues with their GPUs though. Remember the Omega drivers? I do!

BFT, that is what drives most gamers. Price. If an AMD CPU is in the same ballpark as the Intel, but costs less, it WILL be sold. I hope Intel learns this lesson soon, or they WILL lose me to AMD for CPUs.

 
I'm 36. And yes, I remember Omega drivers. It was often the only way I could get AMD GPUs to work properly.

 
I'll disagree with their GPUs. They may be closing in on the 2080, but the 3090 is just WAY beyond anything else out there, and the 3080 is not far behind. CPUs? AMD is literally pricing Intel out of the market. I will say this. The comparison linked above is Intel on old lithography versus a new 7nm AMD chip. The AMD is only 5% better overall. That speaks volumes to how sloppy AMD is. They should be 50% better or more since they can cram twice as many transistors onto the same die as Intel can, but they aren't.


If AMD had so many possibilities with double the transistors and could only do 5% better, then NOW, when Intel has the same number of transistors as AMD, Intel should be ahead by 45%, right?

But they are not. Which means that you may be putting too much value on the effect of a shrink. Ten or more years ago every shrink of the lithography was accompanied by a sizable increase in MHz as well as better IPC. All of that together made a new generation of CPUs so much better than the previous one. But now the frequency stays constant, and the small improvements in IPC and transistor count are not that big of a deal.
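As a back-of-the-envelope illustration of that point, single-thread performance scales roughly with IPC times clock; the numbers below are made up purely to contrast a flat-clock IPC bump with an old-style shrink that also brought a clock increase:

```python
# Rough model: single-thread performance ~ IPC * clock. Made-up numbers,
# only to contrast a modern flat-clock generation with a shrink-era one.
def relative_perf(ipc: float, ghz: float) -> float:
    return ipc * ghz

baseline       = relative_perf(ipc=1.00, ghz=4.0)
modern_gen     = relative_perf(ipc=1.07, ghz=4.0)  # ~7% IPC gain, same clock
shrink_era_gen = relative_perf(ipc=1.10, ghz=4.8)  # IPC gain plus 20% clock gain

print(f"Modern gen-over-gen gain:     {modern_gen / baseline - 1:.0%}")
print(f"Shrink-era gen-over-gen gain: {shrink_era_gen / baseline - 1:.0%}")
```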

Also, both companies probably noticed that the notebook market is the current growth market (even before Corona) and have used the shrink to draw less power rather than to get more performance out of it.

 
I'm 36. And yes, I remember Omega drivers. It was often the only way I could get AMD GPUs to work properly.
Then you were born in 1985. I was born in 1980. Glad you remember those drivers. No idea what happened to them though.

Meganoth, I believe they are only 5% better because this is how they are reducing power and heat. Half the lithography but the same transistor count means more space to breathe. AMD has always run hotter and drawn more power in the past. Perhaps while Intel twiddles its thumbs AMD will close that gap, and when Intel begins to realize something is wrong, AMD will make better use of their stuff for more speed also. Time will tell.

 
Then you were born in 1985. I was born in 1980. Glad you remember those drivers. No idea what happened to them though.

Meganoth, I believe they are only 5% better because this is how they are reducing power and heat. Half the lithography but the same transistor count means more space to breathe. AMD has always run hotter and drawn more power in the past. Perhaps while Intel twiddles its thumbs AMD will close that gap, and when Intel begins to realize something is wrong, AMD will make better use of their stuff for more speed also. Time will tell.
Heck, I do remember my first PC assembly, where I had to choose between an S3 Trio and a Voodoo Rush ))) I made a bad choice then)). And my first PC was a 33 MHz IBM with a TURBO mode to 66 MHz that did exactly nothing))) 10MB hard drive! And before that I was frequently visiting one of my friends, as his father was some sort of computer scientist at that time and had a "portable" computer using a spool of magnetic tape)))

 
My first was an Atari 400. 6502 CPU, 1.8MHz. I still own it, it is in factory shape, and it still works. I also own a Tandy 2100, IBM PS/2, and other relics which work. Maybe I can open a museum one day.

 
My first real experience with games was Pong. Later I wrote software on a TRS-80. Then the first PC I built was the first 16MHz home computer. Dad was VP of marketing for one of the big tech companies and got a Heathkit about a month before they hit the shelves. He tossed the instructions and just laid out the parts on the game room table. Took me about a week to sort it all out. Around '86-7 if I remember correctly. Still have that here somewhere. Have an old Mac 512k that still works too, but I lost the keyboard for it.

Been fixing electronics since I was like 8, and then just naturally went into PC's. Graduated high school and tried to get a job at Tandy, but they turned me down because "no experience". Then the day I was shipping out for the Army I got a call out of the blue from American Megatrends. Hadn't even applied there, but they had seen some of the stuff I did. They wanted me to work on code for BIOS software. Was really hard turning that down. I was literally getting into my truck to go report for duty when they called.

I really do need to clean out the closet. It's got a fair amount of old systems I should document, and a lot of e-waste I should just get rid of. LOL

 