Stuttering badly on my PC - i7 7700k + RTX 2080ti

Robeloto

Refugee
I wiped the dust off my old PC and wanted to try using it as a dedicated server only.

i7-2600k
16GB DDR3 1066MHz
GTX 680 2GB

We were 3 players on this server, running without any mods on A20 experimental. I noticed lag immediately, especially when there were more zombies. Just 3 zombies made it lag terribly, and opening and closing containers lagged too. I know this game is very heavy on the CPU, but I remember playing Alpha 16 without any trouble, running the dedicated server and playing on the same machine. I guess it is not enough anymore? Anyone else tried similar setups without lag?

At first I thought something else was wrong, so I tried using my i7-7700k as the dedicated server while playing at the same time. No lag whatsoever.

Oh well, seems my old PC is gonna go back to collecting dust again. Or maybe I will just throw it away lol.
:)


Well, time to change the topic, as it seems my newer PC was the source of the lag and not the old one.

 

 
Maybe the dedicated server files need updating for the new A20 build... but the 2600k should be more than capable of hosting the game. I do hope you're running it off of an SSD though, as a mechanical drive could cause that kind of lag.

 
IMHO it's on the low end, but it should be just fine for the supported player count. An SSD is definitely recommended though.

 
What size world? HDD? What were the server fps from the server log?

I ran test worlds on similar hardware with 5 players without issues.
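
If you're not sure where to find it: the dedi writes a periodic stats line into its log, and you can pull the FPS samples out with a short script. A minimal sketch in Python, assuming the stats lines carry a field like "FPS: 31.2" and that the path below matches your install (adjust both as needed):

[CODE=python]
import re
import statistics

# Placeholder path -- point this at wherever your dedi writes its log.
LOG_PATH = "7DaysToDieServer_Data/output_log_dedi.txt"

# Assumes the periodic stats line contains a field like "FPS: 31.2".
FPS_RE = re.compile(r"FPS:\s*([0-9]+(?:\.[0-9]+)?)")

def read_server_fps(path):
    """Collect every FPS sample found in the server log."""
    samples = []
    with open(path, errors="ignore") as log:
        for line in log:
            match = FPS_RE.search(line)
            if match:
                samples.append(float(match.group(1)))
    return samples

fps = read_server_fps(LOG_PATH)
if fps:
    print(f"{len(fps)} samples, min/avg/max: "
          f"{min(fps):.1f} / {statistics.mean(fps):.1f} / {max(fps):.1f}")
else:
    print("No FPS lines found -- check the log path and format.")
[/CODE]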


World size: 10240

Newest dedicated build, A20 experimental

On a new 500GB Kingston SSD

FPS: 120, but it stutters and drops to 70.

 
Check your drive speed and SMART statistics. Kingstons are already slow, and get abysmally slow as they age. I just recently had to throw out a full array that was only two years old. 
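
If you want a quick sanity check before pulling SMART data, a rough sequential write/read test like this will show whether the drive is anywhere near its rated speed. Just a sketch in Python, not a replacement for a proper benchmark; the test file lands in the current directory, so run it from the drive you want to test:

[CODE=python]
import os
import time

TEST_FILE = "disk_speed_test.bin"   # created in the current directory
CHUNK = 4 * 1024 * 1024             # 4 MiB blocks
TOTAL = 512 * 1024 * 1024           # 512 MiB total, enough to get past small caches

def write_speed():
    data = os.urandom(CHUNK)
    start = time.perf_counter()
    with open(TEST_FILE, "wb") as f:
        for _ in range(TOTAL // CHUNK):
            f.write(data)
        f.flush()
        os.fsync(f.fileno())        # make sure it actually hit the drive
    return TOTAL / (time.perf_counter() - start) / 1e6

def read_speed():
    # Note: the OS page cache can inflate this number right after the write.
    start = time.perf_counter()
    with open(TEST_FILE, "rb") as f:
        while f.read(CHUNK):
            pass
    return TOTAL / (time.perf_counter() - start) / 1e6

print(f"write: {write_speed():.0f} MB/s")
print(f"read:  {read_speed():.0f} MB/s")
os.remove(TEST_FILE)
[/CODE]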

 
I'm gonna check it. I bought it just a few days ago, so it should be working.

I'd switch to 8k

Server FPS, not game FPS, from your server log. It should be roughly 20-35.
My bad, gonna check that and see.

 
I'm also a bit curious as to whether or not your slow RAM could be the issue. If memory serves (pun not intended), DDR3-1066 is pretty much the bottom-feeder RAM you see in pre-built systems, and pre-builts often ship with only a single stick, losing out on the dual-channel benefits. I don't have the slightest clue how demanding hosting a dedicated server is on RAM (other than the amount of GB it uses), but I think it could be a possibility worth looking into, unless someone else knows something I don't.

I know from past experience that dedicated servers (for probably any game) aren't CPU-intensive until you load up with a lot of players... 3 players is pretty much nothing, especially in co-op where the players usually hang out in the same area, which is easier on the server. My retired FX-8320 could handle that (and has in the past), and we all know how bad those CPUs were.

 
I run a dedi for the wife and 2 kids just fine with a similar setup (2700k, 16GB @ 668MHz, GTX 1660), even with mods and CP prefabs.

Might get a little lag here and there but nothing game breaking.

But this is still on 19.5 tho since I haven't converted over to 20 yet either.

 
Thanks for your inputs.

I am getting this error in the server logs.

ERROR: Shader Shader is not supported on this GPU (none of subshaders/fallbacks are suitable)
WARNING: Shader Unsupported: 'PF/Standard' - Setting to default shader.
I tried installing the Microsoft Visual C++ Redistributable 2022 x64. I tried running the server on an 8k map and I get much less lag. However, I am the only player, so I haven't tested it with more players yet.

I am also still getting the error in the server logs. Maybe a reinstall of the dedicated server might fix it. Gonna try at least...

 
I think you can safely ignore this "error". I've been getting it too, in yellow, for years, dating back to A16 I think. To my knowledge it's a Unity issue and not something we can fix on our end.

 
Oh ok, good to know! ^^

Well, I tried playing now with a friend. As soon as there were 12 or more zombies, the FPS started dropping: server FPS dropped from 37 to 29, and my own FPS dropped from 100 to 50. As long as it is only around 7 zombies, it is pretty stable. Gonna check the SSD tonight for errors, and maybe the RAM if I have time.

 
Server FPS and client FPS have no relation or causation at all. You could have 1 FPS client side and the server could still produce 30 FPS, and you could have a thousand FPS client side with only 1 FPS on the server.

So if you're getting FPS dips below what is acceptable, blame the hardware on the client side, not the server. Lower some settings and cap the framerate at 60 to stabilize the FPS and frame times. Unstable frame times cause stutter, which can make you think it's server related.
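
To make the frame-time point concrete, here's a toy illustration (made-up numbers) of how two runs with the same average FPS can feel completely different once you look at the slowest frames:

[CODE=python]
import statistics

def summarize(frame_times_ms, label):
    """Report average FPS plus the worst 1% of frames -- the part you feel as stutter."""
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99):]
    low_fps = 1000 / statistics.mean(worst)
    print(f"{label}: avg {avg_fps:.0f} fps, 1% low {low_fps:.0f} fps")

# Hypothetical traces: both average exactly 100 fps.
smooth = [10.0] * 100              # steady 10 ms frames
spiky = [8.0] * 99 + [208.0]       # mostly fast frames plus one 208 ms hitch

summarize(smooth, "smooth")
summarize(spiky, "spiky ")
[/CODE]

Capping the framerate trades a high average for a flat frame-time line, which is why the 60 fps limiter helps.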

 
Server FPS isn't a framerate. It's not even remotely related. It has to do with CPU clock cycles, and though it has the same acronym, the meaning is completely different. Frags Per Second refers to the number of command fragments per second put into the queues in the instance. Some servers, like CS:GO, want this high, around 120-160; other servers, like 7 Days, are better with it lower. Ideal is 35-40. Above 40 is wasting CPU cycles for the sake of having them, with no real benefit. Anything between 25 and 40 is acceptable, and when it starts dropping below 25 you know the system is loaded: either the CPU can't keep up with the demands, or there is other load somewhere else. (In one recent case it was the number of mods and how they were being loaded into RAM; compressing the mods resolved the issue.)
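
Putting those ranges into a quick check (the thresholds are just the ones described above, nothing official):

[CODE=python]
def classify_server_fps(fps):
    """Bucket a server FPS sample using the rough ranges described above."""
    if fps > 40:
        return "above 40 -- wasted CPU cycles, no real benefit"
    if fps >= 35:
        return "35-40 -- ideal"
    if fps >= 25:
        return "25-35 -- acceptable"
    return "below 25 -- loaded; check CPU, I/O, or mod load"

for sample in (45, 37, 29, 18):
    print(sample, "->", classify_server_fps(sample))
[/CODE]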

 
I see, thanks for the info. Well, the hardware on the client side is an RTX 2080 Ti, an i7-7700k and 32GB of DDR4 3200MHz, so I know it is more than enough to handle this game. Maybe there is something else? Maybe bad drivers or the network. My friends and I get the FPS dips at the same time, which is why we think it is the server.

That's a good server FPS. Maybe the problem is the network.

Or, maybe the problem is not the server, but the client.


Alright. Yeah, the problem may be the network. The server is on a wireless connection; maybe I will try it wired and see if that solves anything. I do not think it is the client, but I would not rule it out completely.

Good to know. I did not know that. ^^

We started a new world on 8k now and there was much less lag. But as soon as the zombie count increased, it became almost unplayable. We all have good PCs, and we do not experience this if I run the server on the same PC I play on, which is a bit frustrating. When I have more time I am gonna test the rest of the server's hardware and see if every driver is up to date. Also gonna see if A19 works better. The server runs without mods atm, until I figure out what the problem is.

 
What's your upload speed like for your internet? I understand that a lot of ppl tend to have good download but terrible upload speeds, especially on copper lines.

You could also lower the view distance on the client side which would relieve stress on the server side as it loads up less of the map all the time for all 3 players. (It'll also help stabilize your own fps too)

I know your hardware is great, but I also know ppl with great hardware tend to think they can run every game on max settings no matter what, which isn't always the case, especially during early access. I myself tend to run the game on medium-high settings even though I know I could go higher... but I prefer a stable fps over fancy graphics. With every major update the devs tend to make the game even more demanding on hardware, which means constantly re-adjusting graphics settings. They did mention plants and zombies are now fully HD.
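
On the upload question, a back-of-the-envelope illustration of why it matters with several clients connected (the per-player rate here is a pure assumption for illustration, not a measured 7 Days figure):

[CODE=python]
# All numbers are assumptions for illustration only.
PER_PLAYER_KBIT = 256    # hypothetical upstream cost per connected client
PLAYERS = 3

needed = PER_PLAYER_KBIT * PLAYERS
print(f"rough upstream needed: {needed} kbit/s ({needed / 1000:.2f} Mbit/s)")

# Compare that against a measured upload; a ~1 Mbit/s copper upstream is
# already uncomfortably close, while a fat fiber upload has huge headroom.
UPLOAD_KBIT = 1000
print("headroom OK" if UPLOAD_KBIT >= needed else "upload may bottleneck")
[/CODE]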

 
Oh sorry, forgot to answer. Happy new 2022 btw ;)

My upload speed is around 250 Mbit/s.

Hey, I am the same as you; I have low/medium on pretty much everything. Now we played on the same PC that I run the dedicated server on, and sometimes, I guess around when the daily horde comes, it stutters like crazy for about 15-20 seconds for both of us. I never had this problem with earlier versions, but I also didn't have this graphics card (2080 Ti); I had a GTX 1080 before.

Here are my Nvidia settings:

[Attachment: nvidiasettings.jpg]

Here are my CPU and GPU temps. The CPU goes to around 79°C max under load and the GPU to around 64°C.

[Attachment: CPU and GPU.jpg]

Here are my video settings in 7 Days to Die:

[Attachment: VIdeosettings.jpg]

 
These are the settings I'm currently using, in case you're interested. I get a solid 60 fps (no dips) with my Ryzen 5 5600X, 32GB of RAM and an old GTX 1060. I know I could easily up the settings, but I'm too lazy to test the limits. These are also the same settings I used back when I ran it on the Ryzen 1600X with the same GPU.

https://i.imgur.com/g8jGTeD.jpg

https://i.imgur.com/7bBmTE3.jpg

 