u/Far-Temporary6630

▲ 299 · r/rabbitswithjobs (+1 crosspost)

I replaced my project planning process with a … rabbit

Not the usual post… This month I thought I’d try something new.

I’ve always got too many ideas and I can never pick between them, so I’ve decided I’m not going to pick anymore.

I’ll let something else handle it. More advanced than any AI system.

Deploy rabbit → Observe selection → Commit.

He already controls most of my personal life anyway, so it was only a matter of time before he took over my professional life too.

u/Far-Temporary6630 — 4 days ago

BullsAI turns any dartboard into an AR coach and gaming platform. A phone watches the board through computer vision and detects every dart in real time. Spectacles overlay coaching prompts, target highlights, and games onto the actual physical board.

The phone is the input device. Mount it on a tripod, run the webapp, and OpenCV handles detection: HSV colour masking finds the board, ellipse fitting locates the centre, and a 4-dart calibration locks in a perspective transform from any camera angle. The player places darts at Double 20, 6, 3, and 11; four known points on a circle yield a matrix that maps the camera view to ideal board coordinates. That simple approach beat every clever auto-detection method I tried. Detection runs every 200ms with position-based deduplication, so the same dart never sends twice: only when a dart actually moves more than 4% of the board does it trigger a new event. Pull the dart out, throw another, and the next one fires immediately.
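The 4% movement threshold reduces to a small pure check. A minimal sketch in TypeScript (the names and coordinate convention here are my own, not taken from the repo):

```typescript
// Position-based deduplication: a detection only fires a new dart event when
// it moves more than 4% of the board from the last reported position.
// All names are illustrative; coordinates are normalised board units (0..1).

interface Point { x: number; y: number }

const MOVE_THRESHOLD = 0.04; // 4% of the board

function isNewDartEvent(detected: Point, lastReported: Point | null): boolean {
  if (lastReported === null) return true;      // first dart always fires
  const dx = detected.x - lastReported.x;
  const dy = detected.y - lastReported.y;
  return Math.hypot(dx, dy) > MOVE_THRESHOLD;  // fire only on real movement
}
```

Run this on every 200ms detection pass: when it returns true, send the throw and remember the position; detection jitter below the threshold never produces a second event.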

The Spectacles app is built around a single input system: it talks to Supabase and listens for dart events. Every game is a self-contained TypeScript file that subscribes to a callback and reacts. Adding a new game is one new script plus a button in the lobby.
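The one-script-per-game pattern can be sketched as a shared event bus plus independent subscribers. This is my guess at the shape, not the repo's actual API:

```typescript
// Sketch of the plugin pattern: each game subscribes to a shared dart-event
// stream and keeps its own state. Interface and names are illustrative.

interface DartEvent { x: number; y: number; score: number }
type DartListener = (e: DartEvent) => void;

class DartEventBus {
  private listeners: DartListener[] = [];
  subscribe(fn: DartListener): void { this.listeners.push(fn); }
  emit(e: DartEvent): void { this.listeners.forEach(fn => fn(e)); }
}

// A new game mode is just one more subscriber.
class FreeThrowGame {
  hits: DartEvent[] = [];
  constructor(bus: DartEventBus) {
    bus.subscribe(e => this.hits.push(e)); // e.g. spawn a marker per hit
  }
}
```

The lobby then only needs a button that constructs the chosen game with the shared bus, which is why adding a mode stays a one-file change.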

I've made five game modes so far. Dart Assist is the coaching mode: structured lessons on board layout, throw technique, and 12 progressive drills with smart feedback after every miss. The feedback isn't generic: it analyses the actual drift vector between target and dart, then gives technique advice. "Pulled hard right, relax your grip and throw straighter." "Right number, pulled inside. Trust the throw, aim slightly outward." "Drifting right, tighten your wrist on release."
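The drift-vector analysis boils down to classifying the offset between aim point and landing point. A minimal sketch, with thresholds and messages of my own invention rather than the shipped ones:

```typescript
// Classify the drift between the aim point and the actual dart, then pick
// technique advice. Normalised board coordinates; thresholds are my guesses.

interface Vec { x: number; y: number }

function driftAdvice(target: Vec, dart: Vec): string {
  const dx = dart.x - target.x; // +x means drifted right
  const dy = dart.y - target.y; // +y means drifted low
  if (Math.hypot(dx, dy) < 0.03) return "On target, keep that rhythm.";
  if (Math.abs(dx) >= Math.abs(dy)) {
    return dx > 0
      ? "Pulled right, relax your grip and throw straighter."
      : "Pulled left, square your shoulders to the board.";
  }
  return dy > 0
    ? "Dropping low, follow through toward the target."
    : "Sailing high, release a touch later.";
}
```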

Bubble Pop is Puzzle Bobble but on the dartboard. 12-20 bubbles spawn in clusters around the rim in three colours. Your dart is assigned a random colour each throw, only matching bubbles pop. Hit a cluster and a flood fill finds every connected same-colour bubble, then they all burst outward with explosive physics, gravity, and spin. Score is bubbles cleared per dart taken.
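The cluster burst is a standard flood fill over same-colour neighbours. A self-contained sketch, with the bubble graph structure assumed rather than taken from the game code:

```typescript
// Flood fill over a bubble adjacency graph: starting from the hit bubble,
// collect every connected bubble of the same colour. Shapes are my sketch.

interface Bubble { id: number; colour: string; neighbours: number[] }

function connectedSameColour(bubbles: Map<number, Bubble>, startId: number): number[] {
  const start = bubbles.get(startId);
  if (!start) return [];
  const seen = new Set<number>([startId]);
  const stack = [startId];
  while (stack.length > 0) {
    const current = bubbles.get(stack.pop()!)!;
    for (const n of current.neighbours) {
      const nb = bubbles.get(n);
      if (nb && nb.colour === start.colour && !seen.has(n)) {
        seen.add(n);
        stack.push(n);
      }
    }
  }
  return [...seen]; // every bubble in this cluster bursts together
}
```

Everything returned here gets handed to the physics layer to burst outward with gravity and spin.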

Apple on Head is William Tell: a Bitmoji holds an apple; hit the apple to launch it sideways with physics, hit the face and the Bitmoji shakes.

Tic Tac Toe runs on a 3x3 grid mapped to the board, with cell takeover (land on your opponent's X and it becomes your O), and auto-resets after wins.
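Cell takeover reduces to two steps: map the dart to one of nine cells, then claim or flip that cell. A sketch under assumed names, with a simple square grid mapping standing in for however the game actually carves up the board:

```typescript
// Tic Tac Toe cell takeover: a dart landing on an opponent-owned cell flips
// it to the thrower's mark. Grid layout and names are my own sketch.

type Mark = "X" | "O" | null;

function cellIndex(x: number, y: number): number {
  // Map normalised 0..1 board coordinates onto a 3x3 grid (row-major).
  const col = Math.min(2, Math.floor(x * 3));
  const row = Math.min(2, Math.floor(y * 3));
  return row * 3 + col;
}

function applyThrow(board: Mark[], x: number, y: number, mark: Mark): Mark[] {
  const next = [...board];
  next[cellIndex(x, y)] = mark; // empty cell is claimed, opponent cell flips
  return next;
}
```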

Free Throw is a heat map: every hit position spawns a marker so you can see your shot pattern build up.

Multiple Spectacles can join the same game code and see darts in sync. Multiplayer is "shared view" right now: both wearers see every dart and both react, but each runs its own game state. Proper turn-locking is on the roadmap.

Supabase powers the entire backend. Two tables: dart_games for sessions and dart_throws for individual hits. The phone writes detections; all Spectacles read. No custom server, no socket setup, just polling at 200ms.
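The no-server model reduces to: every 200ms, fetch the session's rows and forward only the ones newer than what you've already processed. A minimal pure sketch of that drain step; the row shape is an assumption based on the table names above, not the real schema:

```typescript
// Polling dedup for dart_throws: each poll takes the rows the backend
// returned and keeps only throws with an id above the last one processed.
// Row shape and field names are assumptions, not the actual schema.

interface ThrowRow { id: number; game_id: string; x: number; y: number }

function drainNewThrows(
  rows: ThrowRow[],
  lastSeenId: number
): { fresh: ThrowRow[]; lastSeenId: number } {
  const fresh = rows.filter(r => r.id > lastSeenId).sort((a, b) => a.id - b.id);
  const newLast = fresh.length > 0 ? fresh[fresh.length - 1].id : lastSeenId;
  return { fresh, lastSeenId: newLast };
}
```

Because the filter is id-based, any number of clients can poll the same rows and each emits every throw exactly once, which is what lets several Spectacles read the same session without coordination.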

A reactive slime-face character watches every throw and reacts with surprise on bullseyes, sad on misses, and idle blinking the rest of the time.

Optimisation: detection on the phone, rendering on Spectacles. The Lens itself runs lean because the heavy CV work is offloaded entirely. Every game uses object pooling for spawned markers, the slime face is unlit shaded, and the dartboard reuses a single anchor disc that all game prefabs parent to.
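The marker pooling amounts to a small generic pool. An illustrative sketch, not the repo's implementation:

```typescript
// Generic object pool: reuse spawned markers instead of allocating a fresh
// object per hit. Illustrative sketch, not code from the BullsAI repo.

class ObjectPool<T> {
  private free: T[] = [];

  constructor(
    private create: () => T,          // factory for a brand-new item
    private reset: (item: T) => void  // clears state before reuse
  ) {}

  acquire(): T {
    const item = this.free.pop();
    return item !== undefined ? item : this.create();
  }

  release(item: T): void {
    this.reset(item); // scrub state so the next acquire starts clean
    this.free.push(item);
  }
}
```

Each game acquires markers on hits and releases them on reset, so steady-state play allocates nothing, which matters when the Lens has to stay lean.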

I know having to set up a phone as the camera is awkward, that part isn't where this is meant to live long term. The thinking is BullsAI works best at venues. Some pubs and bars already have cameras around their dart boards, and Spectacles can potentially pair with that existing setup so the player just walks up, puts the glasses on, and it works. The phone is the dev kit. The venue install is the product.

I've actually built something in this direction before. A few years back I worked with a company in the UK called Axed making smart lanes for axe throwing. Object tracking was rough at the time so we shipped projectors and tablets instead of AR, but the principle was the same: take a traditional physical sport and layer a digital counterpart on top of it. Scoring, games, leaderboards, all driven by what's actually happening on the target. AR finally makes that experience personal instead of shared on a screen, and Spectacles are the right form factor for it.

Looking ahead, voice coaching with ElevenLabs, form analysis using the Spectacles forward camera to grade throw motion, LLM-powered personalised coaching that learns your weak zones over time, more games (Battleships, Around the World, Zombies), tournament mode with proper turn-locking, and safety warnings using depth sensing if someone walks in front of the board.

Was a tonne of fun building and showing this off at the Spectacles bootcamp.

I've also submitted it to the XRCC hackathon, "Kill The Manual".

Repo: https://github.com/ohistudio/BullsAI

Try the phone detector live: https://ohistudio.github.io/BullsAI/web/darts.html

Lens Link: https://www.spectacles.com/lens/e2cccc74d86949fc84ea5f084188a5df?type=SNAPCODE&metadata=01

Trailer: https://www.youtube.com/watch?v=wr63v62yR7k
