I vibe coded a photo album and facial recognition app
Hi all,
I have been building PicUr (picur.my), an AI face-recognition photo-sharing service for events like weddings, corporate functions, and conferences. The premise is simple: a photographer uploads photos to an event, guests open one link, scan their face, and get back every photo they appear in. No app install, no tagging.
A few people have asked me to break down the architecture and the security model. Here is the cleaned-up version.
What the user sees
Host signs up, creates an event, drag-and-drops photos.
Host shares a single URL (and QR code) with guests.
Each guest opens the link in their phone browser, takes a quick selfie, and gets back the photos they appear in.
They can download individual photos, share single shots, or grab the whole set as a zip.
Architecture
- Frontend: a server-rendered React app behind a global CDN.
- Backend: a Python web service.
- Database: a relational database (Postgres-family).
- Object storage: a self-hosted S3-compatible store. Photos are served via short-lived signed URLs, never as public objects.
- Background workers: a queue-and-worker pair for face indexing, thumbnail generation, retention sweeps, and transactional emails.
- Face engine: a self-hosted open-source face recognition engine, running on the same infrastructure. The AI never calls out to a third-party model vendor. No AWS Rekognition, no Google Vision, no commercial vision API in the loop.
- Edge: a CDN handles TLS, DDoS protection, and edge caching for public marketing pages. Origin servers are not directly internet-exposed for application traffic.
- Payments: Stripe for subscriptions and one-time event packages.
- Deploy: zero-downtime rolling deploys behind a load balancer.
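The queue-and-worker half of this can be sketched with nothing but the standard library. This is an illustrative toy, not PicUr's production code: the job names, payload shape, and in-process queue are all stand-ins (the real system uses a proper broker):

```python
import queue
import threading

# Toy in-process job queue standing in for the real broker.
jobs = queue.Queue()
results = []

def handle(job):
    # Dispatch on job type: face indexing, thumbnails, retention, email.
    kind, payload = job
    return f"{kind} done for event {payload['event_id']}"

def worker():
    while True:
        job = jobs.get()
        if job is None:          # sentinel: shut the worker down
            jobs.task_done()
            break
        results.append(handle(job))
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()
jobs.put(("index_faces", {"event_id": 42}))
jobs.put(("make_thumbnails", {"event_id": 42}))
jobs.put(None)
jobs.join()
t.join()
print(results)
```

The point of the split is that photo upload returns immediately while indexing and thumbnailing happen off the request path.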
Security
Encryption
- TLS in transit on every request.
- Encrypted server storage at rest.
- Passwords stored as bcrypt hashes. Never stored or logged in plain text.
- Photo download URLs are signed and short-lived. They expire with the event.
Photo isolation
- EXIF metadata (GPS coordinates, camera serial, capture timestamps) is stripped on upload. Whatever the file knew about where or how it was taken, the server forgets.
- Photos are private to one event. No global gallery, no cross-event discovery, no public listing.
- Each event has a retention window (30 days on Free, 6 months on Starter, 1 year on Pro, custom for one-time packages). When the window expires, the photos, thumbnails, face embeddings, and audit rows tied to that event are deleted from every system that holds them.
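The retention sweep reduces to a date comparison. A minimal sketch, assuming windows approximated in days (the real sweep may use calendar months, and these field names are illustrative):

```python
from datetime import date, timedelta

# Retention windows per plan, approximated in days for this sketch.
RETENTION_DAYS = {
    "free": 30,
    "starter": 182,   # ~6 months
    "pro": 365,       # 1 year
}

def deletion_date(created: date, plan: str) -> date:
    """Day the sweep becomes eligible to purge the event."""
    return created + timedelta(days=RETENTION_DAYS[plan])

def sweep(events: list[dict], today: date) -> list[int]:
    """Return ids of events whose retention window has expired."""
    return [
        e["id"] for e in events
        if deletion_date(e["created"], e["plan"]) <= today
    ]

events = [
    {"id": 1, "plan": "free", "created": date(2024, 1, 1)},
    {"id": 2, "plan": "pro",  "created": date(2024, 1, 1)},
]
print(sweep(events, today=date(2024, 3, 1)))   # only the free event has expired
```

The worker runs this on a schedule and cascades the delete across object storage, the database, and the face index.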
Face data
- The AI stores numerical embeddings, not face crops. An embedding on its own cannot be trivially inverted back into a usable image.
- Guest selfies during a scan are converted into a temporary embedding, compared against the event's embeddings, then discarded. The selfie itself is never written to disk.
- Face data dies with the event. No global face database across events.
- I do not train any AI on customer photos.
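The scan-and-match step boils down to a nearest-neighbour comparison between embeddings. A minimal sketch with toy 3-dimensional vectors and a made-up threshold (real face engines use hundreds of dimensions and tune the cutoff per model):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

THRESHOLD = 0.8   # hypothetical cutoff; tuned per model in practice

def match_photos(selfie_embedding, event_embeddings):
    """Return photo ids whose stored face embedding is close enough
    to the selfie's. The selfie embedding lives only in memory here,
    mirroring the discard-after-compare flow described above."""
    return [
        photo_id
        for photo_id, emb in event_embeddings.items()
        if cosine(selfie_embedding, emb) >= THRESHOLD
    ]

event = {
    "p1.jpg": [0.9, 0.1, 0.0],
    "p2.jpg": [0.0, 1.0, 0.0],
}
print(match_photos([1.0, 0.0, 0.0], event))   # only p1.jpg is close enough
```

Nothing about the selfie needs to persist: the temporary embedding is compared, the matching photo ids are returned, and the vector is garbage.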
Operator access (the part most products do not talk about)
- The in-app admin console exposes only event metadata to authorised superadmins. There is no photo viewer for staff anywhere in the app.
- A small operator team has server-level access to the underlying infrastructure, same as anyone running their own cloud service. That access is used only for support, abuse investigation, valid legal process, or maintenance. It is recorded in an audit log.
- Anyone can email and ask for an attestation that no operator has touched their event.
Abuse prevention
- Login, password-reset, and per-event guest authentication are rate limited.
- All public endpoints sit behind WAF and bot protections at the CDN.
Pricing
Free for a single 25-photo event, $9/mo Starter (5 events, 250 photos each), $29/mo Pro (20 events, 500 photos each), custom for anything larger.
We are in beta and offering free tailor-made event packages in exchange for honest feedback.
Links
- Product: https://picur.my
- Security explainer: https://picur.my/security
- Privacy policy: https://picur.my/privacy
Happy to answer questions on the user-facing flow, the tradeoffs of self-hosting the AI versus going with a managed vendor, or the pricing logic. Genuinely want feedback on whether the operator-access disclosure reads as reassuring or unusual.