r/nvidia

How many of you use undervolting?
🔥 Hot ▲ 215 r/nvidia

How many of you use undervolting?

On a 3070 the improvements in temperature and power consumption are really good. It uses 80W less and runs 13°C cooler, with only a 1–2 FPS loss.
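To put that saving in efficiency terms, here is a quick back-of-the-envelope calculation. The ~220 W stock board power (the reference 3070 TDP) and the 100 FPS baseline are assumptions for illustration; only the 80 W and ~2 FPS deltas come from the post.

```python
# Rough perf-per-watt math for the undervolt described above.
# ASSUMED: ~220 W stock board power (reference RTX 3070 TDP) and a
# hypothetical 100 FPS baseline; the 80 W / 2 FPS deltas are from the post.
stock_watts, uv_watts = 220, 220 - 80
stock_fps, uv_fps = 100, 100 - 2

stock_eff = stock_fps / stock_watts  # FPS per watt at stock
uv_eff = uv_fps / uv_watts           # FPS per watt undervolted
print(f"{stock_eff:.2f} -> {uv_eff:.2f} FPS/W "
      f"(+{100 * (uv_eff / stock_eff - 1):.0f}% efficiency)")
```

Even with these rough assumptions, trading 1-2 FPS for an 80 W cut works out to roughly a 50% efficiency gain, which is why undervolting is so popular on this card.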

u/TACOGT — 6 hours ago
Linux DXVK vs Linux Native Benchmark CIVILIZATION VI
▲ 17 r/nvidia+2 crossposts

Linux DXVK vs Linux Native Benchmark CIVILIZATION VI

CIVILIZATION VI has been tested on Linux (Pop!_OS 24.04, COSMIC/Wayland) on my dual-boot machine:

  • RTX 5070 Ti
  • Ryzen 9 5900X
  • 32 GB RAM
  • Each operating system on its own identical 1 TB SSD

The game was run at 1080p using the Ultra profile, VSYNC OFF, FRAME LIMITER OFF.

While Proton translates DirectX to Vulkan using DXVK/VKD3D, the native Linux version primarily uses OpenGL. The difference is huge: Proton consistently delivers better CPU utilization, smoother frame pacing, and higher performance. In some scenarios the gap reaches several times the FPS, demonstrating how inefficient OpenGL is compared to contemporary Vulkan translation layers.

Given that it reflects the actual gaming experience for modern Linux users, this test clearly demonstrates why many Linux benchmarks now focus on Proton performance.

I'm looking forward to the next NVIDIA 595 driver update for Pop!_OS, which should boost Linux DX12 game performance.

***********************************************

Disclaimer: Why I Test with Pop!_OS + NVIDIA

***********************************************

  1. Windows gamers are the target audience.

The whole point of these benchmarks is to show that Linux gaming exists, works well, and isn’t nearly as complicated as many Windows users think. I’m basically trying to show a realistic migration path from Windows to Linux, not build a perfect Linux-only lab.

  2. NVIDIA dominates the gaming GPU market.

According to the Steam Hardware Survey, NVIDIA usually sits around ~75–80% of GPUs in gaming PCs. If I test on NVIDIA, I’m covering what most gamers actually use.

  3. Pop!_OS is one of the easiest distros for NVIDIA users.

It ships with dedicated NVIDIA ISOs, drivers are integrated, and updates are straightforward. I run tests on official Pop!_OS drivers, so the setup reflects something an average user could realistically install.

  4. If Linux gaming works on NVIDIA, it works for most gamers.

Yes, AMD often performs better on Linux. I’m aware of that. But testing only on AMD would shrink the scope from ~80% of the market to a much smaller slice. My goal is broader relevance, not best-case scenarios.

youtu.be
u/RoniSteam — 2 hours ago
DLSS 4.5 Denoiser vs Ray Reconstruction
🔥 Hot ▲ 78 r/nvidia

DLSS 4.5 Denoiser vs Ray Reconstruction

Alex Battaglia from DF recently found that DLSS 4.5 has a built-in denoiser that produces a much more stable and detailed picture than the default UE denoiser.

I thought it would be interesting to compare the DLSS 4.5 denoiser to Ray Reconstruction. I made a brief comparison in Silent Hill 2 between the default UE denoiser, DLSS 4 Ray Reconstruction, and DLSS 4.5 with the UE denoiser disabled.

Here are a few observations of DLSS 4.5 compared to Ray Reconstruction:

  • Slightly better anti-aliasing in reflections.
  • Less grain.
  • A similar level of detail.
  • A similar amount of ghosting.
  • More flickering.
  • Seems to lack temporal stability, resulting in much blurrier reflections when the camera or character is in motion.

Overall, I think the denoiser in DLSS 4.5 provides picture quality that is almost on a par with Ray Reconstruction. This is good news, as we can now essentially achieve a Ray Reconstruction level of denoising in any game that allows us to disable the native denoiser and has DLSS support.

What are your thoughts on this?

youtube.com
u/soft-tack — 7 hours ago
Image 1 — NR200p mod to fit bigger GPU
Image 2 — NR200p mod to fit bigger GPU
Image 3 — NR200p mod to fit bigger GPU
▲ 6 r/nvidia+1 crossposts

NR200p mod to fit bigger GPU

So I bought a Gigabyte 5080 Gaming, 340 mm in length, and tried to fit it in my case. No go, so I decided it was time to cut her. Not the cleanest job; I had to use a Sawzall reciprocating saw and a sander to get it done, but hey, it works.

u/electricmigz — 1 hour ago
I made dynamic MFG work flawlessly in Cyberpunk with vsync+gsync+Reflex cap, this is how I did it
▲ 27 r/nvidia

I made dynamic MFG work flawlessly in Cyberpunk with vsync+gsync+Reflex cap, this is how I did it

I finally figured out how to make dynamic FG multiplier switching behave exactly how I want in Cyberpunk. I have a Ryzen 7600X + 5070 Ti, the latest 596.02 driver, and the latest Cyberpunk version. In the NVIDIA Control Panel I have G-Sync enabled, V-Sync set to on (off in Cyberpunk), and max frame rate set to 158 FPS because I have a 165 Hz 1440p monitor. Cyberpunk is set to max details with path tracing and DLSS Quality.

The first thing I did was replace all the DLSS libraries with the 310.6 version using DLSS Swapper. I didn't install the Streamline libraries even though a lot of people say they are mandatory (maybe they are in other games, but not in CP2077). Then I installed the latest NVIDIA Profile Inspector Revamped and set it up like this for Cyberpunk:

Image — NVIDIA Profile Inspector MFG settings for Cyberpunk

My goal is to prioritize base FPS: I never want to hit my 158 FPS Reflex cap and start dropping GPU utilization and base FPS. In other words, I want to use the highest FG multiplier that still keeps total FPS below 158. For testing purposes I didn't limit how high the multiplier can go, because I wanted to see if it can really go from 1x to 6x and back automatically; obviously I don't want to use a 5x or 6x multiplier on a 165 Hz monitor, and 4x is the highest I would ever go.
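The selection rule described above (use the highest multiplier that keeps total FPS under the cap) can be sketched in a few lines. This is not the driver's actual heuristic, which isn't public; it is a minimal illustration of the idea, with the post's 158 FPS cap as the default:

```python
def pick_fg_multiplier(base_fps: float, cap_fps: float = 158, max_mult: int = 6) -> int:
    """Highest frame-generation multiplier whose total FPS stays below the cap."""
    for mult in range(max_mult, 0, -1):
        if base_fps * mult < cap_fps:
            return mult
    return 1  # base FPS already at or above the cap: no room for generated frames

print(pick_fg_multiplier(50))   # 3  (3 * 50 = 150 < 158)
print(pick_fg_multiplier(25))   # 6  (6 * 25 = 150 < 158)
print(pick_fg_multiplier(120))  # 1  (2 * 120 would already blow past the cap)
```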

The main trick is to set your dynamic target FPS properly. If I set it to 140 or higher, the multiplier refuses to go back down: it can go from 4x to 5x to 6x, but when it is time to drop back to 4x, it just stays stuck at 6x at my 158 FPS cap with 70-80% GPU utilization. You need to set the target low enough that there is a 30 FPS gap below your monitor's refresh rate (it is possible it should be percentage-based instead of a flat 30, so on a 240 Hz monitor the gap may need to be closer to 40-45 FPS; I'm not sure).

I noticed that the multiplier drops whenever my FPS reaches the 150-155 range and increases when I drop into the 115-120 range, which is basically exactly what I want. Some people might prefer to keep their FPS at the cap with GPU utilization around 85-90%, which is typically done to minimize stutters and improve 1% lows. That doesn't work with dynamic MFG, however: once you allow your FPS to get too high, the multiplier just refuses to go back down. So you need to stay below your Reflex FPS cap in order for dynamic FG to work (Reflex cap = Hz - (Hz * Hz / 4096)).
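The cap formula quoted at the end checks out against the numbers in the post (using the 4096 divisor exactly as written there):

```python
def reflex_cap(hz: float) -> float:
    """Reflex frame-rate cap as a function of refresh rate (formula as quoted above)."""
    return hz - hz * hz / 4096

print(round(reflex_cap(165)))  # 158, matching the 158 FPS cap on a 165 Hz monitor
print(round(reflex_cap(240)))  # 226 on a 240 Hz monitor
```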

reddit.com
u/Sad-Victory-8319 — 5 hours ago
▲ 10 r/nvidia

Streaming stuttering in all games

9800X3D

5090 OC

32GB DDR5 6000CL30

SSD NVME PCIe 5.0

Nvidia driver: 595.97

System and components fully up to date

Hello,

I am reporting a persistent and systematic streaming stutter issue across 100% of my gaming library (notably in Assassin's Creed Shadows, Cyberpunk 2077, Still Wakes the Deep, The Casting of Frank Stone, and many others) that started exactly with the October 2025 Windows 11 cumulative update.

Hardware Validation (ruling out a hardware issue):

To rule out hardware failure, I have tested two different flagship platforms, and the stuttering remains identical:

* Oct 2025 – Jan 2026: System running Ryzen 9 7950X3D + RTX 5070Ti OC. (Stuttering started immediately after the Oct update).

* Feb 2026 – Present: Complete upgrade to Ryzen 7 9800X3D + RTX 5090 OC

The fact that the exact same streaming stutter persists across two generations of X3D CPUs and a flagship 50-series GPU strongly suggests this is a Windows kernel/scheduler regression, not a hardware bottleneck.

Technical Symptoms:

* Engine/UI Streaming Stutter: Frame time spikes occur every time the engine calls for an asset, spawns an NPC, or even displays a simple UI message/notification on screen.

* Kernel Prioritization Failure: The Windows Scheduler is failing to prioritize game-critical I/O requests, even with "Ultimate Performance" power plans and all overlays (Steam, Ubisoft, Epic) disabled.

* Environment: Tested on a clean Windows 11 install with PCIe 5.0 NVMe storage.

System Specs:

* CPU: AMD Ryzen 7 9800X3D

* GPU: NVIDIA RTX 5090 OC (900W Peak)

* Storage: Gen5 NVMe SSD

* Power: Stabilized via Sine-Wave UPS.

Conclusion:

This is a critical regression in how Windows 11 handles DPC latency and thread scheduling for high-end hardware. We need a kernel-level fix to restore smooth asset streaming for the RTX 50-series and Ryzen X3D architectures.

Thank you for your help,

reddit.com
u/Square-Dog-7348 — 3 hours ago
Cyberpunk 2077 @ 7680×2160 (32:9) RT Overdrive | RTX 5090 | DLSS Q/P/UP + 6× MFG
🔥 Hot ▲ 76 r/nvidia

Cyberpunk 2077 @ 7680×2160 (32:9) RT Overdrive | RTX 5090 | DLSS Q/P/UP + 6× MFG

https://preview.redd.it/lkjvjljbbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=c710396c0b67e40d19b23374f69cb30310027df8

https://preview.redd.it/pn4do3icbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=3761e986ab0f36a4fa434e607f6fb31290675ec3

https://preview.redd.it/cju288mdbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=762f5ca585750f773801e87ef04ea57c7b2da74b

https://preview.redd.it/6ixik0aebzsg1.jpg?width=3840&format=pjpg&auto=webp&s=b28586c394b61f7970015b4dafc62d34de7b91a2

https://preview.redd.it/pktiuy6fbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=dcecc1673f25802677b6aa4bd9005051207e176e

https://preview.redd.it/dg2ssqsfbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=8c1a3b7c06c2dec504807136d0ac920c404858a8

https://preview.redd.it/mmz8okxgbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=7f20fcbfd42d1c8403f29ed5111a02d7460fb64a

https://preview.redd.it/9v0coohibzsg1.jpg?width=3840&format=pjpg&auto=webp&s=2ff2c25c4926f830e5a4188d3735de23ad5bc50b

https://preview.redd.it/wdo9ay0jbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=8619771e96792f24d89f8a72019269655de2ec7d

https://preview.redd.it/wxrqj72kbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=b5615e0377b1230c4ebdbd7f93eec113624ae636

https://preview.redd.it/ftg3qy0lbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=d205ccad9d76732bfc617076568f372b4de8a248

https://preview.redd.it/azyw82kmbzsg1.jpg?width=3840&format=pjpg&auto=webp&s=4f57f35de00875bae7d5d6a86d8f190b3c2235e7

DLSS Ultra Performance Benchmark

DLSS Performance Benchmark

DLSS Quality Benchmark

  • System Specs:
    • CPU: Ryzen 7 9800X3D
    • RAM: 64GB DDR5 6000 CL30
    • GPU: ZOTAC RTX 5090 AMP Extreme Infinity
    • Display: Samsung 57" Odyssey Neo G9 (G95NC) Quantum Mini-LED
    • Resolution: 7680×2160 (32:9 Dual UHD / 8K2K) @ 240Hz
  • Settings:
    • Preset: RT Overdrive, HDR10+ Gaming
    • DLSS: Quality / Performance / Ultra Performance
    • DLSS Multi Frame Generation: 6× via NVIDIA App override
  • Screenshots:
    • All gameplay screenshots captured using DLSS Performance + 6× MFG (Fixed) at 7680×2160

Average FPS:

DLSS Mode         | 4× MFG Avg FPS | 6× MFG Avg FPS | FPS Increase | % Increase
Ultra Performance | 243.39         | 311.63         | +68.24       | +28.0%
Performance       | 172.91         | 230.90         | +57.99       | +33.5%
Quality           | 120.09         | 164.82         | +44.73       | +37.2%

Minimum FPS:

DLSS Mode         | 4× MFG Min FPS | 6× MFG Min FPS | FPS Increase | % Increase
Ultra Performance | 225.64         | 291.45         | +65.81       | +29.2%
Performance       | 160.89         | 213.16         | +52.27       | +32.5%
Quality           | 111.79         | 152.22         | +40.43       | +36.2%
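As a sanity check, the increase columns in the average-FPS table above can be recomputed directly from the raw figures:

```python
# Average FPS figures from the table above: (4x MFG, 6x MFG)
avg_fps = {
    "Ultra Performance": (243.39, 311.63),
    "Performance": (172.91, 230.90),
    "Quality": (120.09, 164.82),
}

for mode, (four_x, six_x) in avg_fps.items():
    gain = six_x - four_x
    pct = 100 * gain / four_x
    print(f"{mode}: +{gain:.2f} FPS (+{pct:.1f}%)")
```

This reproduces the +28.0% / +33.5% / +37.2% figures, so the table's arithmetic is internally consistent.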
reddit.com
u/JohnGalactusX — 13 hours ago
PSA: NVPI Revamped (Nvidia Profile Inspector fork) is now available via WinGet, Chocolatey, and Scoop
▲ 18 r/nvidia

PSA: NVPI Revamped (Nvidia Profile Inspector fork) is now available via WinGet, Chocolatey, and Scoop

NVPI Revamped by xHybred is a maintained fork of Nvidia Profile Inspector with UI improvements, dark mode, and additional features for managing driver profiles.

I set up automated packaging so it stays up to date across all three Windows package managers:

winget install xHybred.NVPIRevamped

choco install nvpi-r

scoop bucket add nvpi-r https://github.com/AmirulAndalib/nvpi-r-auto && scoop install nvpi-r

New upstream releases are picked up automatically every 6 hours. Settings persist across upgrades. Chocolatey and Scoop create desktop shortcuts; WinGet adds nvpi-r to PATH.

Packaging repo: https://github.com/AmirulAndalib/nvpi-r-auto

u/AmirulAndalib — 5 hours ago
▲ 2 r/nvidia

RTX 5070 Ti ROG Strix OC worth?

Holy title, the name of that card is way too long.

I am currently on an RX 6900 XT and want both to upgrade and to go back to Nvidia, and I'm considering the aforementioned card (16 GB version) for around 1.3k.

I was/am also considering just biting the bullet and grabbing an RX 9070 XT Sapphire (or something similar), since the performance should be about the same, yet the price is way lower.

I just don't really know that much about the actual performance differences. The base 5070 Ti seems about on par with the 9070 XT, but I can't really find any benchmarks or direct comparisons for the Strix OC variant, hence my question.

Would greatly appreciate any and all input :)

reddit.com
u/_Raziel1337 — 29 minutes ago
▲ 1 r/nvidia

Latency in games - specifically CP2077, 5070 vs 4080?

Is it possible that real-world latency is better with the 5070? I got a deal on a 4080, but even with its 2x FG the latency is higher at the same settings than on my previous 5070 with 3x FG. How is that possible? FPS after FG is roughly the same (meaning the 4080 has a higher base FPS, obviously). I had around 50 ms with my 5070 at 100 FPS, and now it's sometimes over 60 ms, which I can start to feel; aiming feels more "floaty" than with the 5070.
Is this right? And why is that? Thank you.

reddit.com
u/Just-Contribution344 — 5 hours ago
Workstation setup 🧑‍💻
▲ 1 r/nvidia

Workstation setup 🧑‍💻

Yes, I'm obsessed with always having the same wallpaper on all my devices.

u/Ferdi_kng — 9 hours ago
▲ 0 r/nvidia

RTX PRO 4500 Blackwell owners?

I'm about to receive an RTX PRO 4500 Blackwell. Does anyone own this card? I play around a lot with AI (training LoRAs, etc.) and play video games too. There aren't any reviews or real benchmarks (nothing on YouTube, Reddit, etc.), but it seems to have the TDP of a 5060 Ti, raw power just below a 5070 Ti I think, the specs (CUDA cores, Tensor cores, etc.) of a 5080, and the VRAM of a 5090. In short, a Frankenstein card!

I paid €1,000 less than for a 5090, but €1,000 more than for a 5080. I used to have a 5090, but my PC is in my bedroom, and in the summer it was a real pain when training LoRAs for hours, even at 70% of the TDP. It was just impossible. I currently have a 5080, and I love doing LoRA training, but its 16 GB of VRAM isn't enough for me (for LTX 2.3 LoRA training, for example). I could have bought a used 4090, but all my AI apps use the drivers and PyTorch builds for the RTX 50 series, and there's no way I'm reinstalling everything; besides, it's still a 450 W TDP card. Plus, I'm very interested in testing the upcoming DLSS 5 (I assume it will be compatible, since even the new DLSS 4.5 is). The Blackwell 4500 seems like a good compromise, even if it's still far too expensive for what it offers. But that's the current market price, and it will be until 2027 or 2028.

The main "problem" is the TDP, I think. 200 W is super conservative; just 50 W more would unleash the 10K CUDA cores and higher boost clocks, and would surely place it at 5070 Ti level in raw power.

So, any opinions from RTX PRO 4500 Blackwell owners?

reddit.com
u/cc_aa_tt_zz — 3 hours ago
▲ 0 r/nvidia

Nvidia RTX 3050 GTA V

Can I play GTA V with this graphics card? It's a laptop.

I already played Delta Force on it with no problems, but I don't know about GTA V; I haven't tried it yet.

For reference, Delta Force is a very new game: it came out in 2025, and only powerful laptops can run it.

One thing I know about GTA V is that it came out in 2013.

reddit.com
u/Muhammet-BM20062026 — 8 hours ago
▲ 0 r/nvidia

Found a possible fix for DLSS 4.5 shimmering on grass and hair-like structures.

For example, I used the DLSS mod in Elden Ring and saw very bad shimmering with ray tracing maxed out and DLSS preset M on the Quality option. After trying many settings, I found that motion blur was the issue; after turning it off, the shimmering was 98 percent gone. I then checked the same thing in Black Myth: Wukong, where there are hair-like structures everywhere, and found the main culprits were sharpening, anti-aliasing, and shadow quality. I used preset M with Quality (66%). What I changed was turning shadow quality from High to Very High (I tried Cinematic, but it still had that continuous shimmering), then turning anti-aliasing to Low (to be honest, this barely changed anything), and the game looked better than before after changing sharpening from 4 to 2.
I was using a 1080p monitor, so the shimmering would be far worse than at 4K.
Conclusion: for any game with ray tracing on, turn off motion blur and set shadow quality to maximum, though I should say: check for yourself which shadow setting fixes it in your game.

reddit.com
u/Lock_Extreme — 3 hours ago
▲ 0 r/nvidia+1 crossposts

Proxmox PCIe passthrough of an NVIDIA L4 GPU on my Dell PowerEdge R810

I'm stuck here. I've reached the end of AI help; after 4 days of debugging, ChatGPT is now telling me to get a new server. Hoping a real person who has maybe done this might give me a small bit of hope.

reddit.com
u/Mphmanx — 23 hours ago
▲ 0 r/nvidia

DLSS or DLAA?

I own a 1080p monitor and don't really know what to use.

I need help on when and what to use, and on when I shouldn't use DLSS or DLAA at all.

EDIT: I forgot to add, are the DLSS 4.5 models worth using on 1080p?
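For anyone weighing DLSS modes at 1080p, the internal render resolution per mode can be estimated from the commonly published per-axis scale factors. Note these are the default factors; individual games can override them, so treat this as an approximation:

```python
# Commonly published DLSS per-axis input-scale factors; games may override them.
SCALES = {
    "DLAA": 1.0,          # anti-aliasing only: renders at native resolution
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution DLSS upscales from, for a given output resolution."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALES:
    print(mode, render_resolution(1920, 1080, mode))
```

At 1080p output, DLSS Quality renders internally at only 1280x720, which is why the lower modes get noticeably soft on a 1080p monitor, while DLAA stays at native 1920x1080.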

reddit.com
u/Holiday-Chemical1037 — 43 minutes ago
Week