The CVar "r.Shaders.Optimize" cannot be set to false from DefaultEngine.ini in the section "[ShaderCompiler]" as its help text describes.
Also tested on UE5-Main, CL: 34200103
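For reference, a minimal version of the configuration described in the repro steps below (the exact contents of the attached test project are assumed to match the steps; "[Startup]" is the usual section for ConsoleVariables.ini):

```ini
; ConsoleVariables.ini
[Startup]
r.Shaders.AllowCompilingThroughWorkers=0

; DefaultEngine.ini (the section named by the CVar's help text)
[ShaderCompiler]
r.Shaders.Optimize=0
```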
1. Download the attached test project
2. Set r.Shaders.AllowCompilingThroughWorkers=0 in ConsoleVariables.ini
3. Observe that "r.Shaders.Optimize" is set to 0 in DefaultEngine.ini, under the "[ShaderCompiler]" section. Even though "r.Shaders.Optimize" is marked ECVF_Cheat, its help specifically says it can be set under this section in Engine.ini files.
4. Set a breakpoint on the first line of FShaderSettingHelper::GetBoolForPlatform(...) in ShaderCore.cpp (the line is "bool bEnabled = false;").
5. Open the test project in Debug.
6. If the project opens without hitting the breakpoint, run "RecompileShaders all"
7. Continue debugging until "SettingCVar" matches "r.Shaders.Optimize" - one way to identify it is by matching the help text.
8. Observe that the function skips loading the value from DefaultEngine.ini because the CVar's default value is already "1" (a small verification sketch follows this list).
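A quick way to confirm the mismatch observed in step 8 is to compare what GConfig reads from the "[ShaderCompiler]" section against the value the console variable itself holds at runtime. This is only an illustrative sketch, not engine code: the log category and wrapper function are placeholders, and per step 8 the expectation with the repro project is that the ini reports 0 while the CVar still reports its default of 1.

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"
#include "Misc/ConfigCacheIni.h"

// Placeholder log category / helper for illustration only.
DEFINE_LOG_CATEGORY_STATIC(LogShaderOptimizeRepro, Log, All);

static void DumpShaderOptimizeState()
{
	// What DefaultEngine.ini says under [ShaderCompiler].
	bool bIniValue = true;
	const bool bFoundInIni = GConfig->GetBool(
		TEXT("ShaderCompiler"), TEXT("r.Shaders.Optimize"), bIniValue, GEngineIni);

	// What the console variable itself currently reports.
	int32 CVarValue = -1;
	if (IConsoleVariable* CVar =
			IConsoleManager::Get().FindConsoleVariable(TEXT("r.Shaders.Optimize")))
	{
		CVarValue = CVar->GetInt();
	}

	// Per step 8 above, the repro project is expected to log:
	// FoundInIni=1 IniValue=0 CVarValue=1 (the CVar keeps its default).
	UE_LOG(LogShaderOptimizeRepro, Log,
		TEXT("r.Shaders.Optimize: FoundInIni=%d IniValue=%d CVarValue=%d"),
		bFoundInIni ? 1 : 0, bIniValue ? 1 : 0, CVarValue);
}
```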
There is no existing public thread on this issue, so head over to Questions & Answers and mention UE-217499 in the post.
Component | UE - Rendering Architecture - Shaders
---|---
Affects Versions | 5.4, 5.4.1, 5.4.2, 5.5
Created | Jun 7, 2024
Updated | Jun 24, 2024