OpenClaw runs very slowly locally, and the web version isn't feasible for me.
Hey, I tried running OpenClaw locally.
I chose the Ollama route since I'm just a student, and paying for an API or running OpenClaw in the cloud would cost money.
I tried Ollama's DeepSeek 1.5B model, which is small and fast and can run on my laptop.
I have an RTX 4050 with 6 GB of VRAM.
Running the model with Ollama alone is fast, at speeds I can easily work with, but when I point OpenClaw at the same model, queries through OpenClaw are very slow. Other parts of OpenClaw feel slow too.
Even a simple "hello" gets a reply very late (not a problem when I run the model by itself). The UI is sluggish as well: switching between tabs like Skills, Channels, Instances, Sessions, and Cron Jobs feels like it takes a minute to load.
I want a way to run OpenClaw faster locally (I can't pay for the web version, since I just want to experiment).
I heard there was a lighter version of it, but I didn't understand what that means, so please recommend it if you know (I have no idea what it is).
My machine can run models in Ollama (even 7B or 9B models, not just small ones), and I can link the LM Studio API, so running the model itself isn't hard; OpenClaw itself is what's slow.
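One way to narrow this down: both Ollama and LM Studio expose OpenAI-compatible HTTP endpoints (Ollama on port 11434, LM Studio on 1234 by default), so you can time a raw completion outside OpenClaw and see whether the slowdown comes from the model or from OpenClaw. A minimal sketch, assuming a local Ollama server and a pulled `deepseek-r1:1.5b` model (both names are assumptions; substitute your own):

```python
import json
import time
import urllib.request

# Ollama's documented OpenAI-compatible endpoint; LM Studio's default
# would be http://localhost:1234/v1/chat/completions instead.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def time_completion(model: str = "deepseek-r1:1.5b", prompt: str = "hello") -> None:
    """Send one prompt straight to the local server and print the wall time."""
    body = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            reply = json.load(resp)["choices"][0]["message"]["content"]
        print(f"{time.perf_counter() - start:.1f}s: {reply[:80]}")
    except OSError as e:
        # Server not running or model not pulled -- nothing to time.
        print(f"could not reach the local server: {e}")

if __name__ == "__main__":
    time_completion()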
Everyone teaching OpenClaw is either on good hardware (a Mac or a high-end Windows or Linux machine) or using web hosting so their PC won't break. I just want to experiment on a small scale, and I'll be careful.
Can anyone help me get it running much better locally?