PC 7 Days To Die optimization

Watching a grandspartan stream: he has a 3080 Ti graphics card with 64 GB of RAM and still gets massive lag. I don't want anyone telling me I don't know what I'm saying. The game is not optimized one bit, and I'm sick and tired of people telling me otherwise. Don't tell the player base we don't know what we are talking about.
You don't know what you are talking about. 😁

More seriously: the game is CPU-bound (and probably cache- or datapath-bound in some situations as well), not GPU-bound, and it depends on a graphics engine that is not optimized for its use case.

"A chain is only as strong as its weakest link." You probably know that saying, right? In the case of 7D2D (unlike most other games) the CPU is the weakest link. grandspartan could have two overclocked 3099 Tis in his computer (if such a beast were available) and the game would not run one FPS faster, because if the 3080 Ti was using 20% of its power, the two 3099 Tis would simply run at 10% utilization. But the same single (!) core of the CPU would still be running at 100%, just as it does now with the 3080 Ti.

Whatever you do to the GPU, the CPU is the weak link, and it is already running at full speed.
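The bottleneck argument above can be sketched with a toy frame-time model (all the millisecond numbers below are made up for illustration, not measurements of 7D2D): the frame rate is set by whichever of CPU and GPU takes longer per frame, so a faster GPU changes nothing while the CPU is the slower part.

```python
# Toy model: a frame is finished only when BOTH the CPU (single render
# thread) and the GPU are done, so frame time = max of the two parts.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 25.0        # assumed: the render thread needs 25 ms per frame
gpu_3080ti_ms = 5.0  # assumed: the GPU is only ~20% busy
gpu_double_ms = 2.5  # an imaginary GPU twice as fast -> ~10% busy

print(fps(cpu_ms, gpu_3080ti_ms))      # 40.0 FPS
print(fps(cpu_ms, gpu_double_ms))      # 40.0 FPS -- faster GPU, same result
print(fps(cpu_ms / 2, gpu_3080ti_ms))  # 80.0 FPS -- only a faster CPU helps
```

Doubling the GPU halves its utilization but moves the FPS not at all; halving the CPU time doubles it.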

Now, could TFP change that? Not according to its developers: the graphics engine (Unity) seems to feed data to the GPU in a single thread running on one core of the CPU, and that core is already fully occupied with that task.

Since players with top-end graphics cards expect to turn all options to ultra, their games often have low FPS, and that FPS does NOT depend on their graphics card but on the CPU. Every graphics option they turn on also makes the CPU work slightly harder in the single thread that feeds the GPU with data. But that thread is already at its limit, so the game gets slower at ultra settings even though the GPU is bored out of its wits.

In recent years GPU and CPU performance have improved as expected. But while the GPU was massively parallel from the beginning and could fully exploit those improvements, CPU speed (for games at least) has stayed almost equal to single-core speed, and the gains from adding more cores are half wasted by many games today. Since the i7-4770K (six generations ago), single-core performance has improved only by about 30% (at least according to a comparison website). And graphics engines like Unity are only slowly adapting to a world where all tasks have to be balanced across cores.
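The "half wasted cores" point is just Amdahl's law: if a fixed fraction of each frame runs on one serial thread (like the thread feeding the GPU), adding cores only speeds up the rest. A sketch, where the 60% serial fraction is an assumption for illustration, not a measured value for any game:

```python
# Amdahl's law: speedup from n cores when a fraction s of the work
# is stuck on a single thread and cannot be parallelized.
def speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

s = 0.6  # assumed: 60% of the frame happens on one serial render thread
for cores in (1, 2, 4, 8, 16):
    print(cores, "cores ->", round(speedup(s, cores), 2), "x")
```

With 60% serial work, even infinitely many cores can never exceed about a 1.67x speedup, which is why a 30% single-core improvement matters more to such games than doubling the core count.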

I don't have experience with Unity myself, and I don't know what the current status is. Eventually all graphics engines will have to be fully balanced across cores, or they will be superseded by better engines, because at the moment CPU improvements largely come from adding more cores.

 
I looked back at the comments on his video and I may be wrong, as he is playing a heavily customized map with only 2 biomes.

 
Well, you are right: the game is CPU-bound. However, I don't have that bad a CPU; I should be able to play it if it were only CPU-bound. But it might also be my GPU just being very bad for now.

 
I'm running arguably the best "stock" rig available at the moment: a 5950X CPU, a factory-overclocked RTX 3090, and 32 GB of 3800 MHz RAM. I play at 4K but cannot turn every setting to ultra; it just isn't possible. I have many but not all settings on ultra and keep my FPS at 120 even on horde nights and downtown. If I max every setting, my FPS drops below 50 and becomes very jittery. The key is adjusting your settings based on your CPU and pretty much ignoring your video card.

 
So, not at my puter right now, but which settings are cpu intensive vs gpu?  I can't recall the description mentioning specifics.

 
I have all my game settings as low as possible, and here are my system specs. Is there anything at all you think I could turn up that might help? :)

Operating System
    Windows 10 Home 64-bit
CPU
    Intel Core 2 Duo E8400 @ 3.00GHz    81 °C
    Wolfdale 45nm Technology
RAM
    4.00GB Dual-Channel DDR3 @ 531MHz (7-7-7-20)
Motherboard
    Hewlett-Packard 3048h (XU1 PROCESSOR)
Graphics
    DELL 1704FPV (1280x1024@60Hz)
    DELL 1704FPV (1280x1024@60Hz)
    2047MB NVIDIA GeForce GT 610 (ZOTAC International)    74 °C
Storage
    465GB Seagate ST3500413AS (SATA)    39 °C
    465GB Seagate ST3500630NS (SATA)    44 °C
Optical Drives
    HL-DT-ST DVDRAM GH22NS50

 
It's a wonder it's running 7D2D at all. Do you really have only 4 GB of RAM? Or did you mean 2x4 GB?

If someone donates some RAM to reach 8 GB, or a better CPU for exactly that socket, take it. But you simply won't get much bang for the buck upgrading any one part of that PC, since every part is close to being a limiting factor. An SSD might improve things somewhat but won't let you turn on more graphics options.

In other words, putting money into that PC is largely wasted (except for an SSD, which could be used with a new PC as well); the money would be better spent on a new or secondhand computer that isn't too old.

 
Yeah... with 4 GB of RAM how are you even generating maps to play on?

 
You seem to have a laptop; that is a different case. Usually the CPU and GPU in a laptop share RAM, which isn't good for RAM performance. Also, an integrated GPU is generally underpowered for games, so you might actually be in the situation where your GPU **is** the bottleneck.

Furthermore, laptops have a very tight thermal budget, and it is possible that your system is being throttled when the heat goes above the limit.

 
So in layman's terms, take a sledgehammer to it XD

 
Fox already covered this in quite a lot of detail, but just wanted to add a little something.

A lot of streamers run CPUs like Threadrippers because they are great at running many separate tasks in parallel, like playing a game while streaming and encoding at the same time. The problem is that these CPUs are comparatively weak when an application leans hard on a single thread, which is something 7 Days does.

And blowing money on a 5950X for something you're just going to play games on is a complete waste. The processor is only going to be fully utilized when you're running multiple programs like Photoshop, the Creative Cloud, video-editing software, and DAWs. Otherwise most of those 16 cores (32 threads) are just going to sit idle, not doing a damn thing. Sitting and bragging that you've got a 5950X, 64 GB of RAM, and a 3090 Ti just to play games only shows that you have more money than common sense. I would strongly advise anyone preaching specs like this to take a few minutes to research how things actually function.

 
You're playing on hardware so weak you can't even tell what is "optimized" or not. It's amazing you can even load the game at all.
I know, I keep wondering that myself XD

Those are grandspartan's PC specs, and I don't know his whole setup either. I was just stating something I noticed at the time and realized later I'd made a mistake :)

 
Right now you can't buy any CPU that doesn't lag in cities. It's a fact; the lack of optimization shows as strongly as never before during development. You can't talk that away. Now, I am not saying fully optimize right now, but denying the problem is just a no-go, tbh.

And it's only cities, so turning options down just for a certain area is annoying af and not a solution; it's not even a workaround.

For me it's either using KingGen and not having the new tile system and cities, or waiting for optimization. As much as I like the new cities, there is no fun in playing right now with those lags.

 
So you have such a CPU? May I ask which one?

If not, are you basing this opinion on watching streamers, where you don't even know what they have running besides the game or what options and mods they use?

Do you have dynamic meshes turned on or off?

If you have it on, then you have turned on a new feature just introduced in A20 which still has an unhealthy number of bugs and unoptimized code; anything else would really be astonishing. I'm sure the devs would totally agree with you that that code isn't optimized.

Cities ARE the most FPS-intensive terrain, besides densely populated forests, because of the high polygon count; there is no avoiding this. If you set your options so that only the best case has playable FPS, then the worst case will be unplayable. There is no way around that.
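The same kind of toy arithmetic as earlier makes the "cities cost CPU" point concrete (the per-draw-call cost and scene sizes below are invented numbers, not 7D2D measurements): the render thread pays a CPU cost for every object it submits, so dense city blocks lower FPS even while the GPU idles.

```python
# Toy model: the render thread pays a fixed CPU cost per draw call,
# so FPS falls with scene complexity regardless of GPU power.
COST_PER_DRAW_CALL_MS = 0.01  # assumed CPU cost to submit one draw call
BASE_FRAME_MS = 5.0           # assumed fixed per-frame CPU overhead

def cpu_frame_ms(draw_calls):
    return BASE_FRAME_MS + draw_calls * COST_PER_DRAW_CALL_MS

# Hypothetical scene sizes, increasing with building density:
for scene, calls in [("forest", 800), ("small town", 2000), ("downtown", 4000)]:
    print(scene, round(1000.0 / cpu_frame_ms(calls), 1), "FPS")
```

Under these made-up numbers, "downtown" has roughly half the FPS of the sparse scene; tune your settings for the dense case or accept that the worst case will dip.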

 
I actually do know someone with an 11900K who knows what he is doing when it comes to PCs and gaming. And the fact that there is no way to avoid this is the problem here. Turning things off just for a certain area is just not it. A feature that you can't use without lag, meh. Well, I'll have to wait and see if this can be solved; the lack of T5 POIs being generated is boring af anyway.

 