Performance regression
I'm not sure what has caused the performance regression. Cyberpunk is suddenly losing 5 FPS or a little more on both the average and the minimums when benchmarking. 9070 XT and 7800X3D.
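If it helps to pin down where the loss is, one option (a sketch, assuming MangoHud is installed; the folder path is just an example) is to log frame times during the benchmark and compare runs from before and after whatever updated:

MANGOHUD_CONFIG="output_folder=/tmp/mangohud-logs,log_duration=60,autostart_log=5" mangohud game-performance %command%

That writes a per-frame CSV into the output folder, which makes it easier to see whether the averages, the minimums, or both actually moved.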
OK, I've run into an issue and I'm not sure why FSR4 is not showing up in Cyberpunk. It's always worked just fine in the past. I'm using a 9070 XT on CachyOS, with proton-cachyos.
WINEDLLOVERRIDES="d3dcompiler_47=n;dxgi=n,b" PROTON_FSR4_ENABLE=1 PROTON_ENABLE_WAYLAND=1 PROTON_ENABLE_HDR=1 LD_PRELOAD="" game-performance %command%
Is anyone else experiencing this? Or can anyone see anything wrong in what I'm doing? Haven't played the game in a while. I really don't want to play in FSR3. Could RenoDX be interfering somehow?
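One way to check whether Proton is actually picking those variables up (a rough sketch; the grep pattern is just a guess at what the relevant log lines contain): prepend PROTON_LOG=1 to the launch options above, start the game once, then search the log Proton writes to your home directory:

grep -i fsr "$HOME"/steam-*.log

If nothing FSR-related shows up at all, the overrides may not be reaching the game in the first place (e.g. a different Proton version got selected for that title), which would point away from RenoDX.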
So I'm back to tinkering some more. Not on the gaming side, as KDE has been fixed to respect the HDR source from games now. What I'm struggling with is HDR on the desktop and YouTube. In calibration we have peak and max SDR. I want my screen to be just as bright as SDR on the desktop when in HDR mode. The gamma works, the colors all work, it looks great. However, that max SDR slider, which now respects gaming content, seems to largely affect HDR video playback and the general brightness of the screen on the desktop.

My screen defaults to a peak of 604 cd/m2 and the second value to 277 cd/m2 (100% max APL). I'm OK with this being a little higher than the standard 203 or even 140, as I'm in a lit room. But at 277, HDR video playback seems dimmer, and it affects the peaks as well. What confuses me is that bringing this slider all the way up to its cap of 600 cd/m2 finally matches SDR brightness if I toggle back and forth; HDR video playback also looks beautifully bright now, and the colors all still look accurate and great.

Is this slider maybe mislabeled? At the end it says that the value just set is also the paper white. To me it's acting like the Windows 11 SDR/HDR slider, which I would also have maxed out for full brightness on the desktop (although I wouldn't in Windows because of the gamma mismatch). Am I correct in this thinking and does it maybe need a wording change? Or is this buggy behavior?
I'm on the most recent KDE with CachyOS, so the kernel is very up to date as well. 9070 XT.
Monitor is an LG 27GS95QE WOLED connected via DP 1.4.
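As a rough sanity check on the numbers (just arithmetic, not a claim about how KWin maps things internally): the HDR headroom left above paper white is roughly the peak divided by the max SDR value. At 277 cd/m2 that is 604 / 277 ≈ 2.2x of room for highlights; at the 600 cd/m2 cap it is 604 / 600 ≈ 1.0x, i.e. SDR white is pushed nearly to the panel's peak, which would line up with the desktop only matching SDR mode once the slider is near the top.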
I'm just trying to understand what is correct and what is recommended with this. I have an LG 27GS95QE. It's a WOLED that calibrates to a peak of 604 nits and has a 100% APL of 277 nits. In Crimson Desert, for example, it recommends a game brightness value of 141 nits. That is just too dim and ruins the picture. Changing it to 277 balances it out and looks much more correct. I had a user on another post say he was chatting with people in the Discord, and they even recommended setting this value to 70% of peak? Doing that does in fact make the overall picture as bright as SDR would look, and it looks great. In other games I play this might change, and something like 141 or 203 looks just fine. Is this normal HDR behavior?
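For reference, 70% of this panel's calibrated peak works out to roughly 0.7 × 604 ≈ 423 nits of paper white, versus the 277 nits of the 100% APL figure and the 141 or 203 nits the games suggest, so that Discord advice is asking for a noticeably higher reference white than either.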
What are your hot spot temps? I want to make sure I'm not worrying too much here. My Nitro+ 9070 XT at default (330 W) will see a hot spot of about 80 °C right now, plus or minus a few degrees. Memory is much lower. I also run an undervolt of -80 mV with memory at 2700. If I go to +10% power limit (375 W), my hot spot will go to 90-95 °C. According to AMD this is within spec? Seems like RDNA4 just runs a bit hotter when it comes to junction temps? This card is about a year old now and apparently uses PTM7950 from the factory. It's in a Cooler Master TD500 Mesh: 3 front intakes, a radiator on top exhausting the CPU, and one exhaust on the rear. I'm monitoring with LACT on Linux.
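If you want a second reading outside LACT (a sketch, assuming the amdgpu driver and lm-sensors; the card/hwmon numbers can differ per system), the kernel exposes the same sensors directly:

sensors | grep -A 6 amdgpu
cat /sys/class/drm/card1/device/hwmon/hwmon*/temp2_input

amdgpu labels temp1 as edge, temp2 as junction (the hot spot) and temp3 as mem, and the sysfs values are in millidegrees C, so temp2_input / 1000 should match what LACT reports.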