
Framerate Nitpick

Posted: Mon Jul 17, 2017 6:53 am
by UTNerd24
So I've noticed lately that, without having added any new mods recently, I'm only averaging 55 FPS. 57 if I'm lucky. But when I have NVIDIA ShadowPlay standing by, it's back up to 60. Even though 55 is still rather smooth, I can't help but feel that I may have tampered with something in the past that has affected performance. Is the 60 FPS from ShadowPlay just placebo? If you want me to share my graphics settings, just let me know.

Re: Framerate Nitpick

Posted: Mon Jul 17, 2017 9:07 am
by Carbon
Hard to help without knowing your system specs and OS.

Re: Framerate Nitpick

Posted: Mon Jul 17, 2017 10:10 am
by UTNerd24
My Rig:

Windows 8.1
NVIDIA GeForce GTX 1060 3GB
Intel Core i5-4460 CPU @ 3.20GHz
7.9 GB RAM
1920 x 1080, 60Hz display

Settings:

Code: Select all

[D3D9Drv.D3D9RenderDevice]
UseHardwareClipping=True
ZRangeHack=True
NoAATiles=True
NumAASamples=0
UseAA=False
UseSoftwareVertexProcessing=False
UsePureDevice=False
UseTripleBuffering=False
MaskedTextureHack=True
SmoothMaskedTextures=False
SceneNodeHack=True
FrameRateLimit=60
SwapInterval=0
UseFragmentProgram=True
TexDXT1ToDXT3=False
DynamicTexIdRecycleLevel=100
CacheStaticMaps=False
UseTexPool=False
UseTexIdPool=False
UseSSE2=False
UseSSE=False
SinglePassDetail=False
SinglePassFog=True
ColorizeDetailTextures=False
DetailClipping=True
DetailMax=2
RefreshRate=0
MaxTMUnits=0
NoFiltering=True
MaxAnisotropy=0
Use565Textures=False
Use16BitTextures=False
UseS3TC=True
UseTrilinear=False
UsePrecache=False
UseMultiTexture=True
MaxLogTextureSize=8
MinLogTextureSize=0
OneXBlending=False
GammaCorrectScreenshots=False
GammaOffsetBlue=0.000000
GammaOffsetGreen=0.000000
GammaOffsetRed=0.000000
GammaOffset=0.000000
LODBias=0.000000
DetailTextures=False
DescFlags=0
Description=
HighDetailActors=True
Coronas=True
ShinySurfaces=True
VolumetricLighting=True

PS: I have filtering off to give a DOOM-esque feel to the graphics.

Re: Framerate Nitpick

Posted: Mon Jul 17, 2017 1:08 pm
by Carbon
Well, I'm no guru, but it seems you have vsync off and the refresh rate at 0. Try changing these values and see if it makes a difference.

Set the refresh rate to 60 and/or SwapInterval=-1 (vsync on).
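
Something like this in the [D3D9Drv.D3D9RenderDevice] section of your UnrealTournament.ini should do it (just a sketch of the two keys to change; leave everything else as it is):

Code: Select all

RefreshRate=60
SwapInterval=-1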

Make a backup of your .ini file (a quick way to do that is at the end of this post) and try these settings:

Code: Select all

[D3D9Drv.D3D9RenderDevice]
ZRangeHack=True
NoAATiles=True
NumAASamples=4
UseAA=True
UseSoftwareVertexProcessing=False
UsePureDevice=True
UseTripleBuffering=True
MaskedTextureHack=True
SmoothMaskedTextures=False
SceneNodeHack=True
FrameRateLimit=60
SwapInterval=0
UseFragmentProgram=True
TexDXT1ToDXT3=False
DynamicTexIdRecycleLevel=100
CacheStaticMaps=True
UseTexPool=True
UseTexIdPool=True
UseSSE2=True
UseSSE=True
SinglePassDetail=False
SinglePassFog=True
ColorizeDetailTextures=False
DetailClipping=False
DetailMax=2
RefreshRate=75
MaxTMUnits=0
NoFiltering=False
MaxAnisotropy=16
Use565Textures=False
Use16BitTextures=False
UseS3TC=True
UseTrilinear=True
UsePrecache=True
UseMultiTexture=True
MaxLogTextureSize=8
MinLogTextureSize=0
OneXBlending=False
GammaCorrectScreenshots=False
GammaOffsetBlue=0.000000
GammaOffsetGreen=0.000000
GammaOffsetRed=0.000000
GammaOffset=0.000000
LODBias=-1.000000
DetailTextures=True
DescFlags=0
Description=Your Video Card Description
HighDetailActors=True
Coronas=True
ShinySurfaces=True
VolumetricLighting=True
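
As for the backup, something like this from a command prompt will do (the folder is just an example, adjust it to wherever your UT is installed):

Code: Select all

rem The install path below is just an example; adjust it to your own UT folder
cd /d "C:\UnrealTournament\System"
copy UnrealTournament.ini UnrealTournament.ini.bak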

Re: Framerate Nitpick

Posted: Mon Jul 17, 2017 1:14 pm
by UTNerd24
Cheers! Turns out the refresh rate did the trick.
Now I'm curious if there are other people out there who prefer to play without filtering...

Re: Framerate Nitpick

Posted: Mon Jul 17, 2017 2:23 pm
by sektor2111
I see FrameRateLimit=60; I think your system can do more... but that's your call. I prefer to leave RefreshRate at the OS setting, to avoid the screen flickering on refresh-rate changes and the monitor being forced into extra work whenever the game is started/stopped/crashed.

Re: Framerate Nitpick

Posted: Mon Jul 17, 2017 7:58 pm
by nogardilaref
I think it's also worth noting that whenever you have an Intel + nVidia combo on Windows, it's highly likely that UT will run on the Intel GPU instead of the dedicated nVidia one.

While it generally runs as well there as on the dedicated one (due to how old the game is and how little it actually uses a GPU at all), you might run into trouble with the drivers themselves and get some artifacts when running on Intel.
So my advice is to always make sure the game runs on the dedicated GPU, as, if anything, it has the better and more mature drivers, preventing those issues.

This can be done by forcing it with a right click, by configuring it to run on the dedicated GPU by default in the nVidia control panel, or even by setting the system to always run on the dedicated one no matter what (although I think that's a bad idea, as the switching works well and there's no need for the extra power drain for the same experience).

Re: Framerate Nitpick

Posted: Tue Jul 18, 2017 1:41 am
by Carbon
I'm on NV/Intel and have never had this happen, and no, it isn't likely to run as well as on the discrete GPU at all. Misinformation, I think.

Re: Framerate Nitpick

Posted: Tue Jul 18, 2017 10:48 am
by nogardilaref
Carbon wrote: I'm on NV/Intel and have never had this happen, and no, it isn't likely to run as well as on the discrete GPU at all. Misinformation, I think.
Well, I know that at least on laptops this happens often. I've owned a few myself from different brands, with different CPUs (all Intel), different dedicated GPUs (all nVidia) and different tiers (mid-range, high-end, very high-end), and every single time I ran UT on them it would default to the Intel GPU.
The only way to run it on the dedicated one was always to add UT to the list of applications set to run on nVidia in its control panel.

I don't mean that it happens in every single case, every system is different, but in my own sampling of different machines so far, UT was never considered something to run on a nVidia card by default.

That said, if you own a desktop instead, I have no idea how those systems are set up by default. It wouldn't surprise me if the behavior was the exact opposite on desktops, since they aren't exactly built to actively spare energy consumption, unlike laptops. So if what you have is a desktop, yeah, the opposite behavior is probably the most likely one.

Furthermore, UT is not the only game where this is a problem. Unity had a similar problem in the past, as did another game engine/physics simulator which would choose the Intel GPU by default, and there my sampling was way bigger, since I certainly wasn't the only one with this problem.
It was severe enough that I reported the problem to the latter and gave them a guide from nVidia on how to fix it, and they did end up fixing it, so it started to run on the dedicated GPU by default.

If this was already a problem in newer software, it's not unthinkable for an old game such as this one to get this kind of behavior most of the time, on laptops at least.


PS: My current setup is an MSI gaming laptop with the same graphics card as UTNerd24 (the 6GB version instead), but with all the other specs superior (Windows 10, Intel Core i7, 16GB RAM, etc.), and I had to set UT to run on the dedicated card there as well.

Re: Framerate Nitpick

Posted: Tue Jul 18, 2017 12:02 pm
by Dr.Flay
As far as I remember, the preferred GPU is a driver option, and the default is to use the Intel one because it uses less power.
Most nerds will have found and changed this within minutes of first using the laptop.
You may also have the option to completely ignore the Intel GPU, but this will increase power use during normal 2D stuff.

You can test whether Windows is preferring the Intel chip by trying an OpenGL screensaver. If it runs like a stuttering bag of sludge, then Intel is the default.

Re: Framerate Nitpick

Posted: Tue Jul 18, 2017 9:23 pm
by nogardilaref
Dr.Flay wrote: Most nerds will have found and changed this within minutes of first using the laptop.
You may also have the option to completely ignore the Intel GPU, but this will increase power use during normal 2D stuff.
I don't advise either one, though; as it is, it works very well actually, especially on a laptop.
These mechanisms are only designed with newer software in mind, so older software is bound to end up running on the "wrong" GPU.
Dr.Flay wrote: You can test whether Windows is preferring the Intel chip by trying an OpenGL screensaver. If it runs like a stuttering bag of sludge, then Intel is the default.
To be fair, Intel has been stepping up their game in that regard for a couple of years already.
Old Intel GPUs do have severe performance problems (generally not with UT though, just artifacts on menus and the like), but the newest ones are pretty much on par with mid-range dedicated nVidia cards from a few years ago, which is saying a lot.

They won't run any new games (that's what the dedicated one is for), but for anything from around 2012 it should hold its ground at the very least. Probably even UT3 runs reasonably well on Intel nowadays (I don't know, never tried, but it would be an interesting experiment).