r/hardwarehacking

Image 1 — I don’t understand how this mini arcade works
Image 2 — I don’t understand how this mini arcade works

I don’t understand how this mini arcade works

There are no chips, nothing! I don’t get it. Where does the logic for the games live? This is an 8-in-1; surely the games can’t all be burnt into a bit of silicon under the epoxy?

What can I do here? What is there to learn from this toy? Is it possible to slurp out the logic or practice something with this? I was looking at this writeup (https://hackaday.com/2025/07/21/reverse-engineering-a-tony-6502-based-mini-arcade-machine/) for a different kit and wonder if I’m better off pivoting to something like that to practice with?

u/oldschooldaw — 2 hours ago
Unable to proceed after U-Boot(?) / ZHAL shell on Realtek router


Hello all,

I have an old router (AOT-5221ZY, RTL9607DQ SoC) and managed to access its UART interface at 115200 baud using an ESP32. I was able to interrupt the normal boot process and reach a prompt that shows:

ZHAL>

However, I seem to be stuck at this point. The shell accepts input (including newlines), but it produces no output in response to any commands.

I have tried commands like:

  • help
  • ?
  • boot
  • env

but none of them return any output; the prompt just reappears.
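
One thing I haven’t ruled out yet is the line ending, since some serial consoles only respond to a bare carriage return. A rough sketch of the probe I have in mind, assuming the ESP32 is just bridging the router’s UART through to the PC as a serial port (the port name is a placeholder, using pyserial):

```python
# Try "help" with each common line ending and print whatever comes back.
import time
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # placeholder: whatever the ESP32 bridge enumerates as

with serial.Serial(PORT, 115200, timeout=0.5) as ser:
    for name, ending in (("CR", b"\r"), ("LF", b"\n"), ("CRLF", b"\r\n")):
        ser.reset_input_buffer()
        ser.write(b"help" + ending)
        time.sleep(0.5)
        print(f"{name}: {ser.read(4096)!r}")
```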

Is there a way to proceed further from here, or at least get a basic command like help to execute?

Thanks.

The board

J1: the UART contacts I connected to

The U-Boot/ZHAL shell

Full normal boot log: https://gist.github.com/ShravanAYG/a7a13eeb904fcad54d53690a0c08b8d9

u/Kinnirasna — 7 hours ago
▲ 1 r/UnteachableCourses+1 crossposts

Quantum computing in 2026 is where classical computing was in the early 1950s — room-sized machines solving academic problems, with a transformative future visible in theory and invisible in daily life. The difference is the 1950s scientists didn't have quarterly earnings calls.

Google's Willow chip completed a benchmark calculation in five minutes that would take a classical supercomputer 10^25 years — a number that exceeds the age of the universe by 15 orders of magnitude. IBM promised quantum advantage by end of 2026. Microsoft debuted the first topological qubit processor in February 2025. D-Wave's stock is up 200% in a year. The headlines suggest the revolution has arrived.

The practical reality: quantum computers are not commercially useful at scale. Most real-world applications remain experimental. They are expected to outperform classical computers in specific, commercially meaningful tasks sometime after 2030, not before.

Here's where things actually stand in April 2026, stripped of the press releases.

The field sits in the NISQ era — Noisy Intermediate-Scale Quantum computing. Current processors operate with dozens to a few hundred physical qubits, and those qubits are fragile. They're sensitive to temperature (superconducting quantum computers operate near absolute zero, about 15 millikelvins), electromagnetic interference, vibration, and any interaction with their environment. These interactions cause errors — qubits lose their quantum state through decoherence — and current error rates are high enough that computations longer than a few thousand operations become unreliable.

IBM's Nighthawk processor, delivered late 2025, achieves roughly 5,000 reliable gate operations. IBM expects 7,500 by late 2026, 10,000 by 2027. Those are genuine improvements. They're also roughly five to six orders of magnitude below what's needed for the applications that justify the investment.

The path from "interesting but impractical" to "commercially useful" runs through quantum error correction — using multiple physical qubits to encode a single logical qubit protected against errors. Google's Willow demonstrated "below threshold" error correction where adding more qubits decreased errors rather than increasing them. That's foundational. But the demonstration was limited to quantum memory, not gate operations, and logical error rates are still orders of magnitude from practical.
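
To get a feel for what “below threshold” buys, here is a back-of-the-envelope sketch using the textbook surface-code scaling (the constants are illustrative, not Willow’s measured numbers): once the physical error rate sits under the threshold, each step up in code distance multiplies the qubit count but divides the logical error rate.

```python
# Textbook surface-code scaling with illustrative constants:
# p_logical ~= a * (p_phys / p_threshold) ** ((d + 1) / 2),
# and a distance-d patch uses roughly 2*d*d - 1 physical qubits.
def logical_error_rate(p_phys, p_threshold, d, a=0.1):
    return a * (p_phys / p_threshold) ** ((d + 1) / 2)

def physical_qubits(d):
    return 2 * d * d - 1

for d in (3, 5, 7, 9, 11):
    p_l = logical_error_rate(p_phys=1e-3, p_threshold=1e-2, d=d)
    print(f"d={d:2d}  qubits ~{physical_qubits(d):4d}  logical error ~{p_l:.1e}")
```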

One telling detail about where the field stands: there's no consensus on what a qubit should even be made of. In classical computing, the transistor won decades ago. In quantum computing, at least five competing technologies are under active development with billions behind each — superconducting qubits (IBM, Google), trapped ions (IonQ, Quantinuum), neutral atoms (QuEra, Atom Computing, Pasqal), photonic approaches (PsiQuantum, Xanadu), and Microsoft's largely unproven topological qubits.

A few things have happened since the Willow announcement that are worth tracking:

In January 2026, a multi-university paper in Science (UChicago, Stanford, MIT, Innsbruck, Delft) explicitly compared the current state of quantum technology to the pre-transistor era of classical computing — foundational physics established, functional systems exist, but scaling to utility requires engineering breakthroughs that could take years or decades. They called it a "transistor moment," which sounds optimistic until you remember how long it took from the first transistor to the first useful computer.

In February, Fermilab and MIT Lincoln Lab demonstrated trapped ions controlled by in-vacuum cryoelectronics — a key step toward scalable ion-trap quantum computing, because current systems rely on impractical wiring between room-temperature electronics and cryogenic traps that breaks down as you add qubits.

In March, IBM released the first published quantum-centric supercomputing reference architecture — a blueprint for integrating quantum processors alongside GPUs and CPUs in hybrid systems. This is significant because it acknowledges what the field has quietly accepted: quantum computers aren't going to replace classical computers. They're going to work alongside them, handling specific subtasks where quantum offers advantage. The hybrid model is the realistic path, and IBM formalizing an architecture for it matters.
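
In code terms, the hybrid pattern looks roughly like the sketch below: a schematic of the general variational loop, not IBM’s actual architecture. The quantum call is a stand-in here (a toy classical cost function), but the shape is the point: the classical optimizer drives the loop and only a narrow subtask would go to the QPU.

```python
# Schematic hybrid quantum-classical loop. `evaluate_on_qpu` is a placeholder,
# not a real vendor API; a toy cost function stands in for the quantum subtask
# (e.g. estimating a molecular energy for a given set of circuit parameters).
import random

def evaluate_on_qpu(params):
    return (params[0] - 0.3) ** 2 + (params[1] + 0.7) ** 2  # toy stand-in

def classical_optimizer(initial, steps=200, lr=0.1, eps=1e-3):
    params = list(initial)
    for _ in range(steps):
        # Finite-difference gradient; every probe is one more call to the QPU.
        grad = [(evaluate_on_qpu(params[:i] + [params[i] + eps] + params[i + 1:])
                 - evaluate_on_qpu(params)) / eps
                for i in range(len(params))]
        params = [p - lr * g for p, g in zip(params, grad)]
    return params, evaluate_on_qpu(params)

best, cost = classical_optimizer([random.uniform(-1, 1) for _ in range(2)])
print("params:", best, "cost:", cost)
```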

On the neutral atom front, Microsoft and Atom Computing plan to deliver an error-corrected quantum computer to Denmark's Novo Nordisk Foundation in 2026. QuEra delivered a machine ready for error correction to Japan's AIST and plans global availability this year. Both teams expect to put 100,000 atoms into a single vacuum chamber within a few years — a scalability advantage that superconducting approaches can't easily match.

D-Wave claimed an industry-first in scalable on-chip cryogenic control for gate-model qubits in January, addressing the wiring bottleneck. Their stock reflects the hype cycle more than the technical reality, but the underlying engineering is genuine.

What quantum computers actually can do today: simulate molecular behavior (the most natural application — using a quantum system to simulate a quantum system), certain optimization problems, and cryptography research. What they cannot do: run AI models, replace cloud computing, speed up databases, or accomplish any general-purpose task more efficiently than a classical machine. NIST finalized post-quantum cryptography standards in 2024 because the threat to current encryption is real — it just requires millions of error-corrected qubits that don't exist yet.

IBM's roadmap targets fault-tolerant quantum computing — their Quantum Starling machine, ~200 logical qubits across ~10,000 physical qubits — by 2029. IBM has been hitting interim milestones consistently, which matters because roadmap credibility is rare in this field. Their 2025 Loon processor demonstrated the key hardware components, and they achieved real-time error decoding in under 480 nanoseconds, a year ahead of schedule.

The pattern is familiar if you've followed fusion or autonomous vehicles: genuine technical progress, consistent milestone achievement, and a commercial timeline that keeps resolving into "a few more years." The most honest framing isn't that quantum computing doesn't work — the physics absolutely works. It's that the gap between where we are and where we need to be is measured in orders of magnitude, and orders of magnitude don't close on schedule.

Longer analysis covering the error correction problem, the qubit technology competition, IBM/Google/Microsoft roadmaps, and what "quantum advantage" actually means versus how it's marketed:

https://unteachablecourses.com/quantum-computing-2026/

Genuine question for the technical people here: does the neutral atom approach (QuEra, Atom Computing) end up winning the qubit race specifically because of the scalability advantage — 100,000 atoms in a single chamber vs. the wiring nightmare of scaling superconducting systems — or is the gate speed disadvantage too steep for it to matter?

u/unteachablecourses — 5 hours ago
GB-BKi3HA-7100 BIOS recovery — CH341A + MX25L8073F (1.8V chip)


The BIOS chip is a Macronix MX25L8073F. The CH341A detected the chip without issue, but when I added the generic 1.8V adapter, it stopped recognizing it. The solution was to ignore the adapter and connect it directly.

To slightly lower the voltage, I used two USB extension cables between the PC and the CH341A. Software: the modified official CH341A programmer, available at https://www.instructables.com/CH341A-Programmer/ , with the chip selected manually (SPI 25 Series / Macronix / MX25L8073F), the NUC completely disconnected, and the SOIC8 clip in place.

First I read and saved a dump as a backup, then I opened the official firmware for the GB-BKi3HA-7100 Rev 1.0 from the Gigabyte website and flashed it with the Program function. It took about 6 minutes. When it finished, I removed the clip and the NUC booted with an image on screen.
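
If anyone repeats this, a quick way to sanity-check the dumps before flashing (a small sketch, filenames are placeholders): read the chip twice, confirm the two reads hash identically and aren’t mostly blank, and keep the hashes with the backup.

```python
# Sanity check for SPI dumps / firmware images (filenames are placeholders).
import hashlib
from pathlib import Path

def summarize(path):
    data = Path(path).read_bytes()
    blank = data.count(0xFF) / len(data)
    digest = hashlib.sha256(data).hexdigest()[:16]
    print(f"{path}: {len(data)} bytes, {blank:.0%} 0xFF, sha256={digest}...")

summarize("backup_dump_1.bin")      # first read from the chip
summarize("backup_dump_2.bin")      # second read; should hash identically
summarize("gigabyte_official.bin")  # image downloaded from Gigabyte
```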

Post-recovery it shows an RTC error, but nothing serious. Posting this in case it helps someone else: I had tried everything, and since I had put it up for sale and someone asked about it, I wanted to make one last attempt, haha.

u/Ok-Constant7269 — 5 hours ago
▲ 6 r/Hacking_Tutorials+1 crossposts

Released a fully open source M5Stack hardware hacking lab for learning and pentesting

I’ve been meaning to share this for a while and finally got it ready.

I built a hardware hacking lab using M5Stack that focuses on practical, real world pentesting scenarios instead of just CTF style challenges:

https://github.com/gromhacks/vuln-m5stack/tree/main

This project is a way for me to give back. A friend helped me get started in hardware hacking and I wanted to create something that makes it easier for others to get hands on experience.

Everything is fully open source and always will be.

There are already some great platforms out there like RHme by Riscure/Keysight (https://github.com/Keysight/Rhme-2016), but I wanted to build something that feels like a real device you might encounter during an assessment while still being affordable and easy to reproduce.

If you’re into hardware security or embedded stuff and want something practical to learn on, feel free to check it out.

Happy to hear feedback or ideas for improvements.

u/GromHacks — 19 hours ago

vSOL v2801q SPI dump

Does anyone have a known-good SPI dump of the VSOL V2801Q ONU with the default admin password? The current one is locked out of admin access, and the reset button doesn’t work.

u/BorofMonster — 12 hours ago

How to get started

Hello, I have a passion for hardware in general and got interested in hardware hacking; the idea that you can use a device for purposes it wasn't made for fascinates me.

That's why I was wondering how to get started in this field. Are there any resources or beginner-level projects you would suggest? What was your first project?

u/PlaneInevitable8700 — 5 hours ago
Image 1 — Is it possible to change the default MAC and WiFi key on a Netgear WAX214v2?
Image 2 — Is it possible to change the default MAC and WiFi key on a Netgear WAX214v2?
▲ 0 r/HomeNetworking+1 crossposts

Is it possible to change the default MAC and WiFi key on a Netgear WAX214v2?

Because I have the wrong backplate for the PCB.

u/Adept-Bug-7227 — 12 hours ago