PC NVIDIA SLI profile (BITS) NOW FOR 19.1 Alpha

Enable exclusive fullscreen in the launcher and press F4 twice for true fullscreen. I'll upload the latest profile in 5 minutes. Forgot to mention that the game always starts in windowed fullscreen.
https://www.mediafire.com/file/19csk3jt55n1tb8/7_Days_to_Die_V2_PROFILE.nip/file

I'm sad to hear this. Try again with the latest profile, and be sure the game is in exclusive fullscreen, or SLI will make things worse. I'm making a video as proof so you can see for yourself.

SLI TEST VIDEO (ENABLED)

WINDOWED FULLSCREEN VS EXCLUSIVE FULLSCREEN WITH SLI

Also, "4K is a bit CPU bottlenecked" is an incorrect statement; it's the other way around. Lowering the resolution puts more load on the CPU, which has to process more frames, while raising the resolution puts more load on the GPU, which delivers fewer frames.

lower res = more load on the CPU

higher res = more load on the GPU
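The rule of thumb above can be sketched with a toy frame-time model (the function and all constants here are made-up assumptions for illustration, not measurements): CPU work per frame is roughly resolution-independent (game logic, draw calls), GPU work scales with pixel count, and the slower of the two caps the frame rate.

```python
# Toy bottleneck model: frame time = max(CPU time, GPU time).
# cpu_ms_per_frame and gpu_ns_per_pixel are invented illustrative constants.

def fps(width, height, cpu_ms_per_frame=8.0, gpu_ns_per_pixel=3.0):
    """Estimated FPS when the slower of CPU and GPU sets the pace."""
    gpu_ms = width * height * gpu_ns_per_pixel / 1e6  # GPU cost grows with pixels
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

print(fps(1920, 1080))  # 125.0 -- GPU finishes early, so the CPU is the cap
print(fps(3840, 2160))  # ~40.2 -- four times the pixels, now GPU-bound
```

Under this sketch, dropping the resolution raises the frame rate only until the CPU ceiling is hit, which matches the "lower res = more load on the CPU" point.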

 
What's your config? Tell me, please. It's working, as you can see.

2560x1440, no VSync, SLI disabled
[screenshot]

2560x1440, no VSync, SLI enabled
[screenshot]
 
This game has never run well at 4K; there's too much CPU overhead. I've noticed higher GPU usage at 4K, and higher CPU usage too, due to the heavy use of post-processing in this game. As far as I know, what you say is true in general, but in this game not so much.

 
Your "SLI Disabled" screenshot is running at a lower resolution than your "SLI Enabled" one. Did you change settings between shots? The single-GPU shot is noticeably blurrier. I noticed in your video you had an issue getting the 4K resolution to stick, which could be why.

My settings

Everything turned on and at the highest settings, except motion blur, which was disabled

System

i7-6700K @ 4.6 GHz

16 GB DDR4 @ 3600 MHz

2x overclocked EVGA GTX 1080 Ti

Win 10 Pro 64-bit

SLI was run at stock GPU clocks, latest drivers (430.86), A17.4

 
You are bottlenecking it with SS reflections... LOL. Disable them and retry.

- - - Updated - - -

It was just implemented, so it's really heavy.

PS: while taking the screenshots I checked that the resolution was correct, because you made me think it doesn't work.

Probably SS reflections don't allow the FPS to go higher, due to low optimization in their current state.

 
Your "SLI Disabled" screenshot is running at a lower resolution than your "Sli Enabled" did you change settings between shots? single gpu shot is noticeably blurrier. I noticed in your vid you had an issue getting 4k res to stick could be why.
I think it's still the same resolution, but using lower-quality textures. You can definitely tell that the textures are different between the two, and that the higher-FPS picture has lower-quality texturing.

So I tried something out.

1920x1080

Full texture resolution.

https://i.gyazo.com/4a1d323a3ceb5906a6c9560dffaadec1.jpg

Quarter

https://i.gyazo.com/6237d811c818f4822453e18734d7d09e.jpg

And it shows something weird. Even with quarter-resolution textures, the stairs look similar. However, in the pictures provided in the post above, the granularity of the second is noticeably different.

So then I took one at 1280x720

https://i.gyazo.com/7b367e8543a597e56db4ae1afb30be04.jpg

My GPU automatically upscaled it to 1920x1080

So I tried actual 720p

https://i.gyazo.com/d276165c52b181f7c39470b437b1698b.jpg

Still looks like crap, but smaller

So I am unable to replicate the odd blurriness in the second shot with the higher framerate.

Also just for ♥♥♥♥s and giggles, I apparently didn't have Exclusive Fullscreen enabled in the first shots I took.

Here it is with it enabled.

https://i.gyazo.com/f189a1d7a876657d05daadcfae0b7447.jpg

I got what, like 5FPS more? Maybe 6?

Also, full disclosure, these are my settings. The only things I changed in any shots were resolution and texture quality.

[settings screenshot: 44dc452aaa28384695edb7277b1990f2.png]


Oh, and also just for reference, here's my specs...

Win 10

i7-3930K overclocked to 4.2 GHz

16 GB DDR3 RAM overclocked to 1600 MHz

One GTX 1060 6 GB with stock OC

OS and game on SSDs, save data on ultra-fast HDDs in RAID 0

(I do have plans on dropping a second 1060 in, but only because I've run out of ports for monitors.)

 
Also "4k is a bit cpu Bottlenecked" is a incorrect statement, it's the other way round, when you lower resolution it puts more load on the cpu processing more frames and raising the resolution puts more load on the gpu delivering less frames .
While in general, you are correct, you are also wrong.

GPUs like the RTX 2080 are CPU bottlenecked because the CPU can't feed the GPU fast enough ;) (this includes the 8700K/9700K)

 
u are bottlenecking it with SS REFLECTIONS... LOL disable them and retry
Changing a quality setting won't make SLI work. So, in summary: I couldn't get any better performance with it, the proof you provided has too many question marks and red flags, and I have yet to see concrete evidence from anyone that increased performance can be achieved with multiple GPUs. Still, it was interesting to look into and see if it was possible. Realistically, even if it did work "perfectly" and the dev supported it, this is a heavily CPU-bound game, and a 20% increase would be optimistic IMO. But who knows, one day maybe.

 
While in general, you are correct, you are also wrong.
GPU's like the RTX 2080 are CPU bottlenecked because the CPU cant feed the GPU fast enough ;) (this includes the 8700/9700K)
It depends on the game, clock speeds, and configuration. A 9900K can be the bottleneck for a GT 730 if you massively nerfed its clock speed.

 
It depends on the game and clock speeds and configuration a 9900k can be the bottleneck for a gt 730 if you massevly nerfed it's clockspeed
Who in their right mind would run that configuration?

But I digress; that's not the point. The CPU can still bottleneck the GPU even at 4K+.

 
I think it's still the same resolution, but using lower quality textures. You can definitely tell that the textures are different between the two, and that the higher FPS picture has lower quality resolution.

How do you think this profile would help you if you have a 1060, and only one? (SLI IS NOT SUPPORTED ON THE 1060, 1050, AND LOWER.)

PEOPLE, I DID NOT CHANGE THE SETTINGS TO GET MORE FPS! It's senseless that you even think that. I just want to clarify that I'm not from NVIDIA or a Unity developer. I've made a lot of profiles for other Unity games; this one is complicated to get good scaling out of.

 
Changing quality setting won't make sli work. So in summary I couldn't get any better performance with it and the proof you provided has too many question marks and red flags and I have yet to see any concrete evidence from anyone that suggests increased performance can be achieved with muti gpus. Still, was interesting to look Into and see if it was possible but realistically even if it did work "perfectly" and the dev supported it, it's a heavily cpu bound game and a 20% increase would be optimistic imo but who knows one day maybe
Campbellh24, sincerely: if you enable ray tracing in Battlefield 1, what happens? You lose 70% of your FPS. What do you think the impact of SS reflections, just implemented, is in a Unity game like this? I can really show you that I get 60-70 more FPS if I turn off SS reflections. You are killing the CPU-GPU loop with that; don't expect more FPS when the game barely handles SS reflections.

 
who in the right mind would do that configuration? but I digress, not the point. CPU can still bottleneck the GPU even @ 4K+
Haha, it was just an example.

- - - Updated - - -

Campbellh24, sincerily if u enable raytracing in battlefield 1 what happens? u loose 70% of ur fps, what u think is the impact of SS reflections just implemented in a unity game like this, because i can really show u that i have 60-70 more fps if i turn off the ss reflections, and u are killing the cpu gpu loop with that, dont except more fps if the game barely handles SS reflections.
Just to be clear, those are the settings I used for the test; I use lower settings when I play normally.

 
how u think this profile would help u if u have a 1060? and only one. (SLI IS NOT SUPPORTED WITH 1060 AND 1050 AND LOWER) PEOPLE I DID NOT CHANGE THE SETTINGS TO HAVE MORE FPS! it's senseless also that you think that, i just want to clarify that im not from nvidia or a developer for unity, i've maded a lot of other profiles for unity games, this one is a bit complicated to achieve a good scaling.
I know I can't run SLI bridged. (I can do it via software, but the results are questionable at best.) I was just trying to figure out why the granularity of your images was so different between your 50 FPS shot, which was sharp, and your 91 FPS SLI shot, which looked blurred, as if it had a lower texture resolution. (Because we all know the easiest way to boost FPS is to lower settings.)

Also, the main purpose of SLI is to get better FPS performance with higher resolution and details through shared processing. If you aren't doing it for that, then what in the world are you doing it for? It's senseless to think that you would be doing it for any other reason.

 
I know I can't run SLI bridged. (I can do it via software, but the results are questionable at best.) I was just attempting to figure out why the granularity of your images was so different between your 50 FPS shot with a sharp image, and your 91 FPS SLI image that was basically blurred looking as if it had a lower texture resolution. (Because we all know that the easiest way to boost FPS is to lower settings.)
Also, the main purpose of SLI is to get better FPS performance with higher resolution and details through shared processing. If you aren't doing it for that, then what in the world are you doing it for? It's senseless to think that you would be doing it for any other reason.
For now, 4K FPS is too low, or unstable I would say; blood moon is unplayable. I'd love to have a stable 60 FPS at 2K or 4K, because my screen is natively 4K. I'm disappointed that everyone who tried it says it doesn't work; I'm asking myself what is wrong or what I'm missing. For sure it would be better to have someone directly from NVIDIA create a custom profile for this game; using pre-existing SLI bits doesn't allow it to work at its best.

 
I know I can't run SLI bridged. (I can do it via software, but the results are questionable at best.) I was just attempting to figure out why the granularity of your images was so different between your 50 FPS shot with a sharp image, and your 91 FPS SLI image that was basically blurred looking as if it had a lower texture resolution. (Because we all know that the easiest way to boost FPS is to lower settings.)
Also, I think that was just a bug. I was showing you FPS at 2K after going into real fullscreen; maybe it was blurry because of the NVIDIA capture software for some reason. Obviously, when just opened, the game was in 4K, and each time F4 was pressed it scaled to 2K, as you can see from the window size. So there was an FPS boost from real fullscreen, and with 2K being the right resolution for the 1070, it showed a bit more FPS and no stutter, since the overhead was cut out.

 
This is a 3D voxel game. The only way SLI really works is if all the textures are rendered in the same frame, because SLI basically splits the work of processing the whole frame between the two GPUs: one GPU does one half, the other does the remaining half, and the two halves are joined for output to your screen.

Even simple features like motion blur can defeat SLI, because the effect is created by rendering textures from the previous frame. I would think that the way the voxels render surface textures on individual blocks could cause similar issues with SLI.
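The split described above can be sketched in a few lines (a hypothetical CPU-side stand-in, not real driver code): in split-frame rendering, each GPU shades part of the same frame and the parts are stitched together, which only works when every pixel depends solely on data available to both GPUs.

```python
# Minimal sketch of split-frame rendering (SFR). render_rows stands in for
# one GPU shading a band of scanlines; real SLI does this inside the driver.

def render_rows(y0, y1, width):
    """Shade rows [y0, y1) with a trivial stand-in shading pattern."""
    return [[(x + y) % 256 for x in range(width)] for y in range(y0, y1)]

def render_sfr(width, height):
    mid = height // 2
    top = render_rows(0, mid, width)          # "GPU 0" takes the top half
    bottom = render_rows(mid, height, width)  # "GPU 1" takes the bottom half
    return top + bottom                       # stitch halves into one frame

# The stitched frame matches a single-GPU render of the whole thing.
assert render_sfr(8, 4) == render_rows(0, 4, 8)
```

An effect like motion blur breaks this assumption, because shading a pixel would also require the previous frame's result, which the other GPU may hold.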

 
So you think I should try split-frame rendering? Not alternate, which can just render things twice sometimes and give no boost?

 
so i should try with Split frame rendering u think? not alternate that can only render things twice sometimes and give no boost
I have tried AFR and AFR2; is there a split-frame rendering option?

 