Optimization of the server core

Tialas555

Refugee
The game has been around for many years, so why do the developers ignore multiplayer? I run 3 servers with a maximum capacity of 30 players each, and the servers still lag on an AMD Ryzen 9 9950X3D and an Intel Core Ultra 9 285K, even though the cores the servers run on are clocked at 5700 MHz on both processors. I don't know what else to do about optimization. Maybe there are some ideas and suggestions?
 
30 people is probably an impossible task; the game is designed for 8. Whether that's because the task is impossible or because the devs are bad/evil, the game is designed to cook a server with 8 people on it. In friendlier phrasing: the game fully utilizes a server in normal 8-player gameplay, with no slack left on the table.

Hope you find something helpful/miraculous, but it's a tough job to "optimize" to 30 from there ;)
 
What I'm saying is: why is there such a problem when there are only 8 players? Do the developers not know how to do multithreading? Then let them hand the game's source code to professionals, who would rewrite the entire game code in a month with the help of AI)
 
Do the developers not know how to do multithreading?
While there have been issues with things like that, that isn't the real issue. The issue is that this type of game contains a mixture of genuinely hard problems; the easy-to-grasp ones are physics simulation and AI, especially in a freely editable 3D environment. Both will happily eat supercomputers and still be relatively poor approximations and/or slow.

Most games that support 30+ people at once are MMOs, set in entirely static environments with very simple AI tasks (bosses just AoE in different patterns, mobs just point at a player and press "run"). This isn't one of those, and even MMOs usually split the population of an area into small separate servers, in the 50-person range.

TFP wants "human-like behavior" for the bandits; even games with static environments struggle with anything of the sort.
TFP wants a fully destructible world; maintaining that with lots of moving people is expensive.
Etc etc ...
 
A voxel game is a different challenge, especially when it has to rely on a standard graphics library like Unity that is geared towards and optimized for static polygon-based worlds. And no matter how incompetent or competent TFP really is, I am sure they will not release as open source a successful game that keeps their company running. THAT would be serious incompetence. :ROFLMAO:
 
So what's the problem with making the server core multithreaded? How was Naiwazi able to do it, with everything more or less working? Yes, the zombie logic is partially broken, but at least 70 people can be on and they won't lag. Who forbids multithreading? "Nobody." It has clearly been done, so why don't the game developers want this?
Even Minecraft, in Java, has been made multithreaded, so what can I say? 7 Days to Die is multithreaded too, just not the server core.
 
Some people just like creating problems out of nothing
 
Why was Naiwazi able to do this and is everything working more or less with it?
No idea what you're advertising here, but their front page is a mix of empty promises, riddled with typos. Not the most typical-looking scam, but it doesn't really raise my confidence when the purchase process is "contact us on discord". Don't lose your credit cards, people.

Yes, the logic of zombies is partially broken
So, the game doesn't work, but it doesn't lag... yay. There's YouTube for that, or Facebook if you want to connect with friends.
 
They only sell this optimizer in China now. But I still haven't received an answer from anyone to the question: why can't the server core be made multithreaded if Unity allows it?
 
I still haven't received an answer from anyone to the question: why can't the server core be made multithreaded if Unity allows it?
I don't know the details, only the dev team does; but "being allowed" isn't even a hurdle.

First we'd have to establish that it isn't already done.

Then, that a suggested change in threading could be utilized to produce an improvement - not all problems are trivially parallelized. Multithreading at the server might actually not produce notable improvements. Despite what some random server vendors claim.

Then, that the change would work on the main target hardware, that is, consumer PCs, with a performance nod given to the weaker consoles. The game is meant to be run self-hosted; renting servers is more of a side effect. Any optimization needs to maintain that, and preferably improve on it.

And then the budget, and the focus shift, and the potential market analysis to cover for the budget. You'd probably have to re-write a whole lot of systems.

At the moment, your suggestion is at the level of "make it faster!", I wouldn't be too cocky about presenting it...
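The "not trivially parallelized" point can be put in numbers with Amdahl's law. A minimal sketch follows; the 70% parallel fraction is purely illustrative, not a measurement of this game:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the work can run in parallel."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# Even if 70% of a server tick could be parallelized, 8 cores give at most:
print(round(amdahl_speedup(0.7, 8), 2))  # 2.58, not 8x
```

In other words, the serial part of the tick loop caps the benefit no matter how many cores the server has.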
 
Yes, the devs do know how to do multi-threading and they are professionals.

This is kind of like a conversation I had with a senior manager once whose background was law. He couldn't understand why buying the fastest CPUs on the market didn't buy his way to the desired performance. He didn't understand the algorithms involved, or that Moore's "Law" was dead.

Not all problems can be solved by multi-threading and games with physics simulations and player interactions often depend on a main game loop that cannot be parallelized. Other parts of 7D2D are multi-threaded.

EDIT: You do understand that adding more players isn't necessarily a linear increase in demand, right? That is, 30 players isn't necessarily 3x 10 players when it comes to the game's algorithms. The number of lines of sight between n players is n × (n − 1) / 2, so:

2 players = 1 LOS
3 players = 3 LOS
8 players = 28 LOS
30 players = 435 LOS

Now add 64 zombies.

LOS can be multi-threaded if that's the only calculation being performed. But all of these things have a shared game state so something has to coordinate it all and the AI has to have many of those results to make a decision.

Multi-threading isn't a panacea and it certainly is no picnic when developing algorithms.
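The quadratic growth above can be sketched in a few lines (an illustration of the counting argument only; the game's actual visibility checks are of course more involved):

```python
def pair_count(n):
    """Unique pairs among n entities: each pair is a potential line of sight."""
    return n * (n - 1) // 2

for players in (2, 3, 8, 30):
    print(f"{players} players = {pair_count(players)} LOS")

# Now add 64 zombies: every entity can potentially see every other entity.
print(f"30 players + 64 zombies = {pair_count(30 + 64)} pairs")  # 4371 pairs
```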
 
The same thing was written about Minecraft, yet servers on 1.7.10 are single-threaded while on 1.16.5+ they are multithreaded, and those servers work many times better than on 1.7.10 precisely because multithreading was added to the server core. So why is everything fine there, but they can't do it here? In addition, Java is much worse than Unity.
Logically. But here's the problem: the servers never load a CPU core to 100%, regardless of the thread's clock frequency, even when the player count is high.
 
In addition, Java is much worse than Unity.

No, for two reasons. First, you might compare Java to C++ or C#, but Unity (written in C#) is a large framework that comes with some overhead.

Second, if you compare Java to C++ or C#, there's more to it than "C++ runs closer to bare metal" while Java and C# have a JIT. That makes for a simple comparison, but it isn't a 100% correct statement. I remember once seeing a study where Java had better context switching than C++ and could outperform C++ depending on the algorithm. Java has had better garbage collection than C#, though I'm not sure if that's still true. Some folks say C# runs better on Windows, but Java can be better on other OSes, like Linux.

Logically. But here's the problem: the servers never load a CPU core to 100%, regardless of the thread's clock frequency, even when the player count is high.

I'm not really sure how to interpret what you're saying there.

I think you're saying the CPU is NOT maxed out at 100%. How's your I/O? How's your network I/O? Are your servers perhaps stuck waiting on all of those clients to respond?

Something else that was unclear to me: Do you have 3 servers with 30 players each, or is that 30 players across 3 servers so around 10 players each?
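On a Linux box, one quick way to probe the I/O question is the per-CPU counters in /proc/stat. A minimal sketch (the file path and field layout are Linux-specific; it parses a sample line here so it runs anywhere, but on a real server you would read /proc/stat itself):

```python
def iowait_fraction(stat_line):
    """Parse a /proc/stat 'cpu' line and return the fraction of time in iowait.
    Field order: user nice system idle iowait irq softirq steal guest guest_nice."""
    fields = [int(x) for x in stat_line.split()[1:]]
    return fields[4] / sum(fields)

# Sample line in /proc/stat format (values are cumulative jiffies since boot).
sample = "cpu0 4705 150 1120 16250 520 30 45 0 0 0"
print(f"iowait: {iowait_fraction(sample):.1%}")
```

A high iowait share combined with low overall CPU load would point at disk or network stalls rather than raw CPU speed.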
 
I have 3 servers, each averaging 20-25 players online. I have allocated them only the high-performance cores of the Core Ultra 9 285K; all run at 5700 MHz and none of them loads to 100%, yet the server still performs badly. The developers should move the zombie logic to a second thread, as Naiwazi (the Chinese core) does, and everything would be fine.
 
Moore's "Law" was dead.
That's not entirely true. It still works more or less for processors, but other problems arise. Just as a chain can't be stronger than its weakest link, a system can't be faster than its slowest part. Even the good old 386 processor spent an average of 60% of its time waiting for RAM to respond. Modern systems spend even more time on this.
 
7D2D is already multithreaded, but not by splitting players across cores. Players all need much of the same data, and need access to the still insufficiently multithreaded Unity (on clients), so that would not create any performance increase and might even slow the game down completely because of memory running out or internal data paths hitting their limits and slowing everything down.

What is different with Minecraft? First of all, Minecraft doesn't use an external library like Unity for its graphics, and is therefore not dependent on an external piece of software it has little influence on and which has a specific interface that shapes the client and server structure.
Secondly, MS is behind Minecraft, and MS easily has the money to (for example) rewrite the server completely with a different architecture more fitting for a server, while TFP probably just reuses the client code, minus graphics, for the server.

Both pieces of software grew for a long time without multithreading being on anyone's mind, and it is also a matter of luck whether it is easy or difficult to rewrite a huge grown piece of code to work better under multithreading. Also, TFP's "target" or supported mode is and always has been 1 to 8 players. There may be easier optimizations for that case that won't help high-population servers.

It is also possible that Naiwazi is doing something that TFP simply hasn't figured out or isn't able to do, like you are suggesting. But there is no chance they can ask the Naiwazi devs, as they (probably illegally) use it to generate money and surely won't give up any secrets. So TFP would have to reverse-engineer it and delay the game even further.
My guess is that the data paths (CPU to RAM, RAM to SSD) are the limits here, not the CPU frequency. Just a guess though. Does one server run much better when you have shut down the others?
 
Yeh, it's an arguable statement, as industry luminaries regularly demonstrate. In its classic sense (transistor counts) it is dead. That's not to say folks aren't clever; there are other forms of innovation, like the rise in the number of cores. But those come with processing limitations, like you describe.

You raise some interesting questions. I also wonder about the networking. Is that communication with 90 clients sharing a network card?

And, while this is an interesting exercise in seeing how far beyond the engineers' specs the system can be made to scale, it also makes sense that TFP wouldn't necessarily invest the time and effort into trying to scale higher, especially when they've got other features/promises to meet.
 
Unfortunately not. It makes no difference whether 1 server or 3 servers are running; they work the same way. The DDR5 runs at 8000 (3x32), and the SSD does 14,000 on reads and 16,000 on writes (or vice versa, I don't remember).
They fix one, break four :ROFLMAO:
 