
Long-time lurker, first time posting a project here.
GitHub: https://github.com/colonelblacc/Dynamic-Braille
**DynaBraille** is a reading desk for blind students. The embedded side is what I want to share — the Raspberry Pi handles all the AI, and the Arduino is purely a real-time servo driver receiving character packets over UART.
---
**The Braille Cell**
6 SG90 micro servos in a 2×3 grid inside a custom 3D-printed PLA housing. Each servo arm has a pin attached — rotating it raises or lowers the pin through the top plate to form a Braille dot. Any Grade-1 Braille character = a specific 6-bit servo configuration.
The Pi sends packets over serial UART at 9600 baud:
```
BRAILLE:100100\n
        ^^^^^^
dots 1–6 (1 = raised, 0 = lowered)
Dot layout:
1 4
2 5
3 6
```
The Arduino sketch parses the packet, maps each bit to a servo, and positions them. A 400 ms settle time between characters lets the servo arms fully actuate before the user touches the dots.
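For anyone curious, here's a minimal sketch of how that parsing could look on the Arduino side. The servo pins, the 0°/90° arm angles, and the blocking `readStringUntil` read are my own assumptions, not taken from the repo; only the `BRAILLE:` + six-bit format, the 9600 baud link, and the 400 ms settle come from the post.

```
#include <Servo.h>

// Assumed wiring: six SG90s on PWM-capable pins, in dot order 1..6.
const uint8_t SERVO_PINS[6] = {3, 5, 6, 9, 10, 11};
const int ANGLE_LOWERED = 0;          // assumed arm angle for a flat dot
const int ANGLE_RAISED  = 90;         // assumed arm angle for a raised dot
const unsigned long SETTLE_MS = 400;  // settle time from the post

Servo servos[6];

void setup() {
  Serial.begin(9600);                 // matches the 9600 baud UART link
  for (uint8_t i = 0; i < 6; i++) {
    servos[i].attach(SERVO_PINS[i]);
    servos[i].write(ANGLE_LOWERED);   // start with every dot lowered
  }
}

void loop() {
  if (!Serial.available()) return;

  // Read one newline-terminated packet, e.g. "BRAILLE:100100"
  String line = Serial.readStringUntil('\n');
  line.trim();
  if (!line.startsWith("BRAILLE:")) return;

  String bits = line.substring(8);    // the six dot bits after "BRAILLE:"
  if (bits.length() != 6) return;

  // Map each bit to its servo: '1' = raised, '0' = lowered.
  for (uint8_t i = 0; i < 6; i++) {
    servos[i].write(bits[i] == '1' ? ANGLE_RAISED : ANGLE_LOWERED);
  }

  delay(SETTLE_MS);                   // let the arms finish moving
}
```

Blocking on `readStringUntil` is fine at this rate since characters arrive no faster than every 400 ms; a non-blocking, char-by-char parser would be the next step if the loop ever needs to do more at once.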
---
**Touch Sensor for Word Advance**
A capacitive touch sensor is mounted on the **side** of the housing. The student's fingers rest on the Braille dots; they tap the side sensor with their thumb to advance to the next word. The Arduino detects the tap and sends a `NEXT` signal back to the Pi over the same UART line. Hands never leave the device during reading.
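Roughly how the word-advance path could look, again with hedges: I'm assuming a TTP223-style module with a digital output on pin 2 and a ~150 ms debounce window; only the `NEXT` message over the shared UART line is from the post.

```
const uint8_t TOUCH_PIN = 2;            // assumed: digital-output touch module (e.g. TTP223)
const unsigned long DEBOUNCE_MS = 150;  // assumed debounce window

bool lastState = LOW;
unsigned long lastChange = 0;

void setup() {
  Serial.begin(9600);
  pinMode(TOUCH_PIN, INPUT);
}

void loop() {
  bool state = digitalRead(TOUCH_PIN);

  // Act only on a state change that survives the debounce window.
  if (state != lastState && millis() - lastChange > DEBOUNCE_MS) {
    lastChange = millis();
    if (state == HIGH) {
      Serial.println("NEXT");           // tell the Pi to advance to the next word
    }
  }
  lastState = state;
}
```

In the real firmware this would sit in the same loop as the servo handling; it's shown as its own sketch only to keep the debounce path readable.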
---
**The Pi Side (brief)**
- Pi Camera 3 → perspective warp → CLAHE enhance
- PaddleOCR (English) with confidence fallback to Tesseract (Malayalam)
- Gemma 2B via Ollama for OCR cleanup — runs fully on-device
- pyttsx3 TTS + Vosk offline ASR
- Gemini 1.5 Flash optional (explain/summarize/describe diagrams)
Full offline mode: `python main.py --no-gemini`
Happy to discuss the servo timing, UART protocol design, or the touch sensor debounce logic.