r/losslessscaling
Can I combine my desktop gpu with my laptop one for frame gen?
Apologies if it sounds stupid, but I was hoping someone smarter than me could help me with this question.
I'm planning on picking up a PC that's rocking a 6950 XT, but I also currently have a laptop with a 4060. Is there a way to add DLSS to the PC/display I'm using, with my laptop?
Getting flickering from strong light/white sources with HDR on SteamOS in all games
Not too annoying but I want to know if anyone has found a solution. I'm on SteamOS and it happens in all games.
Is my mobo the problem?
Mobo: Gigabyte X670 Aorus Extreme AX
7800X3D
9070xt w/ 6600xt
Psu: C1000 NZXT
Monitor: 4k OLED PG32UCDM
I play Helldivers a ton, and it's the first game I'm testing Lossless Scaling with two GPUs on.
---
When in the ship lobby with LS inactive, the game's fps is high, but with stutters every other second and latency fluctuating like an irregular heartbeat; even the in-game latency graph was geeking out like it was high on caffeine.
With LS active, latency goes above 40 ms, the graph skyrockets, and fps drops to between 10 and 20.
The game went choppy, slow-motion.
-
I turned LS off and googled on my phone what the problem could be… When I glanced up at my screen, the latency graph was showing stable :0
I hadn't turned LS back on, and Task Manager was showing my 9070 XT at 98% utilization at that moment. Then, after 20 seconds, the graph went back to stuttering, and the 9070 XT's utilization dropped to 55-66%.
---
Beforehand, I'd set the 9070 XT as the preferred GPU in Windows settings; drivers are up to date; in LS settings the preferred GPU is the 6600 XT; the main monitor is plugged into the 6600 XT, but the secondary monitor is plugged into the 9070 XT.
Second game I’ve tested is Warframe. Same issues.
What have I done wrong?
Is there a way to change the Draw FPS size or transparency?
Even the ability to show only the total fps would be great, but it would be even better to hotkey the fps counter and customize how it appears on screen.
Vsync and capping fps below refresh rate
Hey all.
So, what is the recommendation regarding fps cap and vsync, for monitors that don't have any vrr/gsync?
I use a dual-GPU setup where the iGPU is an Intel one, so I can't use Nvidia Fast Sync. Therefore, I have been using Lossless Scaling's vsync option instead of the default option, since camera pans with the default seem to stutter a little from time to time. My laptop screen is a 2.8K 90 Hz OLED.
Does capping fps below the refresh rate, rather than exactly at it, really decrease latency when vsync is on (to avoid tearing)? Right now I'm capping games at 44 fps, then doubling to 88, so 2 fps below refresh. Is that really the best approach? Has anyone measured different approaches to find which one has the lowest end-to-end latency in this kind of setup?
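The arithmetic in the setup above can be sketched as a tiny helper: given a refresh rate, a frame-gen multiplier, and a safety margin (2 fps here, which is the poster's own choice, not an official recommendation), it returns the base-fps cap to use.

```python
def base_fps_cap(refresh_hz: int, multiplier: int, margin_fps: int = 2) -> int:
    """Base-fps cap so the generated output lands just under refresh.

    Keeping the final output a couple of fps below refresh is a common
    rule of thumb to avoid vsync back-pressure; the exact margin is a
    judgment call, not a documented requirement.
    """
    target_output = refresh_hz - margin_fps   # e.g. 90 - 2 = 88
    return target_output // multiplier        # e.g. 88 // 2 = 44

print(base_fps_cap(90, 2))   # 44 -> x2 gives 88 fps, matching the post
print(base_fps_cap(90, 3))   # 29 -> x3 gives 87 fps
```

This just formalizes the 44 → 88 math from the post; whether a 2 fps margin is actually optimal for latency is exactly the open question being asked.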
Can i use double upscaling with dlss and LS1 ?
For example, I play Crimson Desert on a 4K TV with an RTX 4060. Of course my GPU can't handle 4K even with DLSS Performance.
So my plan is to play at 1080p with DLSS Quality to get 60 fps, then upscale it with Lossless Scaling to 4K. Is that better, or am I just wasting GPU power?
I don't really care about frame gen. As long as I'm at 60 native, I'm happy.
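To put numbers on the chain described above: DLSS Quality renders at roughly 66.7% of the output resolution per axis (a widely documented figure, though exact behavior can vary per game), so the sketch below estimates how many displayed 4K pixels each rendered pixel ends up covering after the Lossless Scaling pass.

```python
# Rough pixel-count math for the 1080p DLSS Quality -> 4K LS chain.
# The 2/3 scale factor is the commonly cited DLSS Quality ratio, not
# something measured in this specific game.

def internal_res(out_w: int, out_h: int, scale: float = 2 / 3):
    """Internal render resolution for a given DLSS output target."""
    return round(out_w * scale), round(out_h * scale)

w, h = internal_res(1920, 1080)            # DLSS Quality at a 1080p target
pixels_rendered = w * h                    # 1280 x 720
pixels_displayed = 3840 * 2160             # after Lossless Scaling to 4K

print(w, h)                                # 1280 720
print(pixels_displayed / pixels_rendered)  # ~9 displayed pixels per rendered pixel
```

A 9:1 ratio is aggressive, which is why the result will look noticeably softer than DLSS Performance upscaling straight to 4K (which reconstructs to 4K with temporal data, whereas LS only spatially scales the finished 1080p image).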
Help with losless scaling on crimson desert
I really can't explain why this is happening. Basically, I'm running Crimson Desert at DLSS Quality on a 1440p monitor, capped at 68 fps through Nvidia Control Panel, with Low Latency Mode Ultra and both vsync and G-Sync on. For some reason the game runs perfectly smooth at 68 fps, doubled with frame gen, then after a random amount of time it suddenly starts dropping: to 45 fps, up to 57, back to 45, completely randomly, after running perfectly smooth for ages. Temps are fine, so it can't be that, and the weird thing is my base fps isn't changing; it's completely locked at 68 fps, as I have the Steam fps counter on. Please help me with this, as the in-game frame gen is horrible and introduces a ton of screen tearing.
God of War Ascension: native 4K - stable 60 FPS with LSFG 3.1 (RTX 5060 Ti + i5 12400F)
Hi everyone,
If you've tried playing God of War Ascension on RPCS3, you know it's a headache because of the FPS drops and the annoying black squares in the shadows.
After several tests with my 16GB RTX 5060 Ti and an i5 12400F, I found the configuration sweet spot: setting the Resolution Scale Threshold to 412 eliminated the visual artifacts, and with the new Lossless Scaling (LSFG 3.1) I went from an unstable 30-35 FPS to a constant 60 FPS experience at native 4K.
Input lag with Frame Generation 3.1 is practically nil if it's configured correctly.
Here you can see the benchmark and my exact RPCS3 configuration: https://www.youtube.com/watch?v=AdkD_Ovyfe0&list=PLxlbC6ibgyMyFPJoyQF86S-fwQQDjD5Pb
I hope this helps someone with this processor or graphics card, since the i5 tends to struggle quite a bit with this title (Cell bottleneck). If you have any questions about the settings, ask me in the comments or check the configuration in the video!
LSFG-Android
Hi everyone, I want to show you what I've managed to do: an Android app for using Lossless Scaling on Android smartphones. So far I've only tested it on my REDMAGIC 11 Pro with a Snapdragon 8 Elite Gen 5, and it works great. Here's one of the tests; it obviously also works with other content, like games and videos, but in this test I'm showing a GIF where I'm doing an 8x frame rate.
Obviously the app needs improvement, and from what I've tested, it apparently doesn't work on other SoCs yet.
Before you ask: yes, the app is vibe-coded, but if it works, that shouldn't matter; the only thing that matters is that Lossless Scaling is working for the first time on an Android smartphone.
I hope I don't get any offense or negative comments for this.
AI should be helpful, and for me, it is.
Dual GPU
Alright, so I know there are a lot of dual-GPU questions already. I know the basics, like which GPU to plug my monitor into and how to set which one runs the game and which one runs the frame gen. My question is which of my GPUs I should use for the game and which for FG; I have an RTX 3060 and a 7800 XT.
Trying to play The Quarry and Elden Ring
Hi, I own the Legion Go 2 Z2E. I bought Lossless Scaling on Steam, installed the plugin on Bazzite, and even put it in the launch command line, but The Quarry says a D3D11-compatible GPU is needed, and I don't know what that means, because without the command it runs fine-ish. And when I try to launch Elden Ring, it brings up an error. Can somebody help, please? I don't know what I'm doing wrong.
Best Lossless Scaling settings for integrated graphics
Hello, I just want to know how I can get less input delay while using Lossless Scaling; the game can feel like it's in slow motion. Here are my specs: Ryzen 5 4650U with Vega 7, 16 GB RAM. The games I play are mostly Minecraft, Schedule 1, and some racing games, and with most of them my base fps is just below 30. I'm not searching for a very big increase, but 10-20 fps while the game still feels natural would help.
Am I tripping, or is Lossless Scaling sometimes better than FSR FG and XeSS FG?
I'm trying that new game SAMSON, and I tried the native FSR FG there... everything felt weird and sluggish. Then I tried XeSS FG through OptiScaler and... same thing. Then I tried Lossless Scaling's FG, and it was perfect. I found it weird, since everywhere people say that native FSR FG, and even XeSS FG in OptiScaler, is way better, but that wasn't true for me...
Dual GPU - AMD for rendering and Nvidia for lossless
I'm planning a new build with a 9060 XT running over PCIe 5.0 x16.
I also have an eGPU setup with an RTX 3060 over OCuLink that I could plug into a spare M.2 slot (PCIe 4.0 x4).
Does anybody have experience with using an AMD card for rendering games and Nvidia for upscaling/frame gen?
I would probably render at 1440p 60hz and upscale at 4k 120hz with x2 frame gen.
Thanks in advance!
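A quick back-of-the-envelope check on the plan above: the usual worry with a secondary GPU on a narrow link is whether copying the base frames across it eats the bandwidth. The sketch below is a rough model only — it assumes one uncompressed RGBA8 copy per rendered 1440p frame and ignores protocol overhead beyond line encoding, so real traffic may differ.

```python
# Can a PCIe 4.0 x4 link keep up with shipping 1440p base frames
# to the frame-gen GPU? PCIe 4.0 runs 16 GT/s per lane with
# 128b/130b encoding, i.e. ~1.97 GB/s of payload per lane.

LANE_GBPS_PCIE4 = 16 * (128 / 130) / 8   # GB/s per PCIe 4.0 lane
link_gbps = 4 * LANE_GBPS_PCIE4          # x4 link: ~7.9 GB/s

frame_bytes = 2560 * 1440 * 4            # 1440p frame, 4 bytes/pixel (RGBA8)
traffic_gbps = frame_bytes * 60 / 1e9    # 60 rendered fps cross the link

print(f"link ~{link_gbps:.1f} GB/s, traffic ~{traffic_gbps:.2f} GB/s")
```

Under these assumptions the base-frame traffic (~0.9 GB/s) is well within the x4 link's ~7.9 GB/s, so raw bandwidth shouldn't be the bottleneck; OCuLink adapter latency and the adapter's real sustained throughput are the things worth testing in practice.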
30 fps vs. 60 fps difference?
Am I the only rare gamer who can't tell the difference between stable 30 fps and stable 60 fps?
Is there an app or a video game that makes the difference really obvious, so I can appreciate it?
Is there a way to make certain profiles load automatically for games?
Let me explain: I'm using Lossless Scaling on Steam Deck, currently for Switch games, and I see most games run better with vsync on and some with it off. It gets annoying switching vsync on and off depending on what game I play, so I created one profile with vsync on and another with it off. Is there a command I'm supposed to use to make a certain profile load automatically with a specific game?
Can I limit fps to 120 via the Nvidia App, then Lossless Scale to 240?
Is this possible?
Massive Input lag with arc raiders lsfg dual gpu
As the title says, I'm using LSFG, and it seems no matter what settings I use, the input lag is unbearable; it feels like the screen moves almost a full second behind my mouse.
I play at 4K, all settings maxed, no frame gen, on an RTX 3090 Kingpin. My fps usually sits in the high 50s but can go as low as 47 and as high as 70+, depending on the map etc.
It plays great on my single 3090 Kingpin with no FSR or LSFG, but as soon as I try using LSFG with an RTX 2060 Super, it's completely unplayable. I've tried almost every setting I can think of in LSFG, but the input lag remains very high.
Specs:
Ryzen 9 5950x
3090 kingpin (2060 super secondary)
64gb ddr 3600 (4x 16GB)
Gigabyte aorus ultra x570 revision 1.0
Windows 10 pro
P.S. I have a GTX 1080 Kingpin; would this be better than the 2060 Super? Or would I lose newer technologies since it isn't an RTX card? If so, what would I lose?
Note: I also tried LSFG with my 3090 and a 3060 Ti (which I've since sold); the input lag with the 3060 Ti as a second card was also massive.
Lossless Scaling caps my fps at 30.
For context, I have a laptop RTX 3060. I tried Lossless Scaling in two different games. I get around 70-80 fps in both. I capped their fps to 60 and tried x2 to go for 120, and it was 30 / 30. I then capped my fps to 15 and used x2 to go for 30 fps to see if it was broken, and this time it was 15 / 15.
I asked AI and tried many things, like changing Lossless Scaling settings, forcing it to use my 3060, and disabling all the overlays. Does anyone know how to fix this, or does it just not work on my machine? Thanks in advance.