r/webgpu

▲ 363 r/webgpu +1 crosspost

Would you believe I got the file size for Scale Space down to 1 MB? (spoilers: I did)

This video is 600x larger in file size than the application it's recording.

Cross platform update is coming!

u/solidwhetstone — 2 days ago
▲ 32 r/webgpu

Supersonic flow simulation

Hi everyone,

I made this WebGPU implementation of the AUSM+-up/SLAU/SLAU2 finite-volume methods with a body-fitted O-grid generator as a spring-break project, and I've been working on it occasionally since then. Here are some of the features:

  • Poisson equation solver to smooth the body-fitted grid from a linear-interpolation initial guess
  • A few object presets and an airfoil loader
  • MUSCL reconstruction of interface states for sharp shock capturing
  • TVD RK3 time integration with automatic CFL dt calculation using a one-pass reduction
  • Adaptive timestepping based on performance
  • Various visualization modes (numerical schlieren, density, pressure, temperature, Mach, etc.) with fragment-shader-based contour rendering
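The time-integration step above can be sketched in a few lines. This is a minimal 1D CPU sketch, not the actual shader code: linear advection with a first-order upwind flux stands in for the AUSM+-up/SLAU fluxes, and a global max-wave-speed reduction plays the role of the one-pass CFL reduction.

```python
import numpy as np

def cfl_dt(u, dx, cfl=0.5):
    # Global reduction over the max wave speed -- the CPU analogue of
    # the one-pass GPU reduction mentioned above.
    return cfl * dx / np.max(np.abs(u))

def rhs(u, dx, a=1.0):
    # First-order upwind flux for linear advection du/dt + a du/dx = 0
    # (a stand-in for the AUSM+-up/SLAU flux; periodic boundaries).
    return -a * (u - np.roll(u, 1)) / dx

def tvd_rk3_step(u, dt, dx):
    # Shu-Osher three-stage TVD RK3: each stage is a convex combination
    # of forward-Euler steps, which preserves the TVD bounds of the
    # underlying Euler update.
    u1 = u + dt * rhs(u, dx)
    u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1, dx))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2, dx))
```

Because each RK3 stage is a convex combination of stable Euler steps, the scheme keeps the solution within its initial bounds as long as the CFL condition holds.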

The simulation runs at ~5k steps/sec at 60 fps on an RTX 4070 mobile GPU with a 512×384 grid; grid generation runs 10k Jacobi iterations in ~75-100 ms.
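The Jacobi grid smoothing can be illustrated on the CPU. A minimal sketch, assuming plain Laplacian averaging of interior nodes with fixed boundaries — the actual Poisson grid system adds metric coefficients and control functions that aren't specified in the post:

```python
import numpy as np

def jacobi_smooth(x, y, iters=100):
    # Each Jacobi sweep replaces every interior node coordinate by the
    # average of its four neighbours in computational space, while the
    # boundary nodes stay pinned. Starting from a linear-interpolation
    # initial guess, this relaxes the grid toward a smooth (harmonic)
    # distribution.
    for _ in range(iters):
        xn, yn = x.copy(), y.copy()
        xn[1:-1, 1:-1] = 0.25 * (x[:-2, 1:-1] + x[2:, 1:-1]
                                 + x[1:-1, :-2] + x[1:-1, 2:])
        yn[1:-1, 1:-1] = 0.25 * (y[:-2, 1:-1] + y[2:, 1:-1]
                                 + y[1:-1, :-2] + y[1:-1, 2:])
        x, y = xn, yn
    return x, y
```

Jacobi converges slowly (its error contracts by roughly cos(πh) per sweep), which is why thousands of iterations are needed even on a modest grid — but every node updates independently, so it maps perfectly onto a compute shader.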

All of the above simulations were run using the SLAU2 method, and the videos are roughly real time.

Live demo

u/hai31415 — 3 days ago
▲ 1 r/webgpu+1 crossposts

First-time contribution: BiRefNet in the browser

Hi everyone, Alex here, a frontend developer finally finding some time to dip my toes into running ML models in the browser. I'm building a proof-of-concept segmentation / background-removal app in the browser with onnxruntime-web / transformers.js.

I hope this is the right place to post this. If not, please direct me to the right subreddit :)

I am able to run SAM3 in the browser on WebGPU with no problem, and Bria's RMBG-1.4 (great model!) also runs fine. However, RMBG is not MIT-licensed, and I wanted to build a fully free stack, so I ended up with BiRefNet.

Unfortunately, I could not get the 1024×1024 BiRefNet lite model to run on either WebGPU (not enough storage buffers) or WASM (out-of-memory error), so I figured out how to resize the model to 512×512. It took a lot of trial and error, since BiRefNet uses deform_conv2d, which is no longer available in a modern Python stack; I had to run it through Docker (ouch!) to get the right export.

But with this new export it works in onnxruntime-web, which makes me very happy! It is unfortunately a little low on resolution, but it runs reliably on my MacBook Pro M1. I'm curious whether this is at all useful to anyone, and whether the model card is in a clear and useful format. Also, if anyone has an idea of how to get the resolution higher without crashing the ONNX runtime, that would be amazing.
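For anyone wanting to feed the model, the input preprocessing for segmentation models of this family typically looks like the sketch below. The 512×512 size matches the export above; the ImageNet mean/std normalization and NCHW layout are assumptions on my part, so check the model card before relying on them.

```python
import numpy as np

def preprocess(img, size=512):
    # img: HxWx3 uint8 RGB array. Resize with nearest-neighbour
    # sampling (to keep this dependency-free), scale to [0, 1],
    # normalize with ImageNet statistics (assumed -- verify against
    # the model card), and reshape to a (1, 3, size, size) NCHW
    # float32 tensor as ONNX runtimes expect.
    h, w, _ = img.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = img[rows][:, cols].astype(np.float32) / 255.0
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    return ((resized - mean) / std).transpose(2, 0, 1)[None]
```

The same steps apply in the browser (e.g. from canvas ImageData before creating an onnxruntime-web tensor); only the array plumbing differs.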

Here is the link: https://huggingface.co/studioludens/birefnet-lite-512

Any feedback is more than welcome!

u/Affectionate-Peak975 — 5 days ago