
u/TomMooreJD

STATEMENT: CAP Praises Hawaii Passage of Bill To Undo Effects of Citizens United, Urges Governor To Sign It Into Law
americanprogress.org: Hawaii legislature passes bill to undo Citizens United
Extremely good news out of Hawaii today. The legislature has passed SB 2471, which would end the state's grant of political-spending power to corporations. It now heads to the governor's desk for his signature.
Hawaii becomes the first state to embrace this approach, which is also headed toward Montana's November ballot as a voter initiative. Legislators in 13 other states introduced similar bills this year, but Hawaii's effort was the only one to make it all the way through the process.
More details on the underlying legal approach here: https://www.americanprogress.org/article/the-corporate-power-reset-that-makes-citizens-united-irrelevant/
Update on Hawai‘i's bold move to make Citizens United irrelevant: AG kill switch is out; final votes on SB 2471 are Friday. This is really close!
Big news! SB 2471, the bill that no longer grants political-spending power to corporations and other artificial entities in Hawaiʻi, received identical floor amendments in both chambers yesterday. The bill is now resting for the constitutionally required 48 hours and is scheduled for final votes in both houses tomorrow, Friday, May 8 — the last day of session. If it passes both chambers, it heads to Governor Josh Green's desk.
What the bill does. SB 2471 takes a structural approach to corporate political spending that no other state has enacted. Rather than regulating speech (the path foreclosed by Citizens United in 2010), it operates upstream of that decision by defining the powers Hawaiʻi grants when it charters a corporation, LLC, or other artificial entity, and the powers Hawaiʻi requires foreign entities to respect when doing business here. Political spending is not among the powers granted. The reform treats artificial persons as creatures of state law whose powers the state defines — which is black-letter corporate law going back two centuries — and applies that principle to election and ballot-issue activity.
If signed, the bill takes effect July 1, 2027.
Why this would be historic. Hawaiʻi would be the first state in the country to enact this kind of reform. A parallel ballot-initiative effort is going gangbusters in Montana, but voters there can't speak on this until November. If SB 2471 clears both floors Friday and is signed, Hawaiʻi will set the template for every other state whose citizens want to reclaim their politics from dark and corporate money.
This is the furthest any state has gotten. It has been a remarkable session of work by the chairs, the conferees, the staff, and Hawai‘i's fired-up people, who want to make this change happen.
Senate Bill 2471, which would make Citizens United irrelevant in Hawaii and get all dark and corporate money out of the state's politics, is achingly close to passage. It cleared its conference committee late last week and the final round of amendments will be voted on tomorrow (Wednesday, May 6).
The House and Senate caucuses meet today (Tuesday) to discuss what the floor amendments will be. There is one floor amendment that has been agreed upon: They're going to push the effective date from Jan. 1, 2027, to July 1, 2027. (That seems fine.)
The amendments are then introduced and voted on tomorrow. Both houses have to agree to the same amendments or the bill dies (this is possible, but not expected). The bill then sits for two days and there's a final vote on it in each chamber, as amended, on Friday.
Then the bill is off to Gov. Green for his signature.
There's action to be taken now, if you like: At the final moment, the conference committee added an unfortunate and badly crafted amendment to the bill that gives the attorney general unilateral power to erase the law from the books altogether if she believes a part of it is unenforceable. This should come out.
The person who will decide whether it stays or goes is Rep. Scot Matayoshi. His email is repmatayoshi@capitol.hawaii.gov; his office number is 808-586-8470. It's an easy ask: "Please kill the Attorney General kill switch in SB 2471."
Let's be polite about it—Rep. Matayoshi had that provision added out of a sincere concern about how courts might treat out-of-state corporations differently from Hawaii corporations. But the language that was added doesn't do the job, and it weakens the stability (though not the effect) of the bill.
Even if the kill switch stays in, SB 2471 is well worth supporting. The kill switch makes a great bill a little less steady, but it's still a great bill.
My robot pal and I came up with this. I barely understand the technical details, but it does seem to work. Paste this into some app like Claude Code or ChatGPT Codex, and they'll take it from there.
The end result I'm working on is: Toss Antigravity or Insta360 native video or photo files into a folder, and end up with a splat a few hours later.
=====
If you own an Antigravity A1 360° drone and want to feed its footage into a Gaussian Splat trainer (or any other downstream tool that expects equirectangular video/images), the entire pipeline is now makesplat <folder_name> — drop your .insv and .insp files into a folder, run one command, get a .ply splat out the other end.
The hard part wasn't the splat training — that's well-trodden ground (COLMAP for SfM, Brush for the actual training, runs natively on Apple Silicon). The hard part was the first step: the public Insta360 MediaSDK refuses to stitch A1 files because the A1's lens isn't in its dispatcher. Without the SDK, no batch processing. Without batch processing, no automated pipeline. Antigravity Studio's GUI was the only path, one file at a time.
This post is a recipe for the unlock + a brief tour of the rest of the pipeline.
The full pipeline at a glance
- A1 .insv/.insp files
- → Insta360 MediaSDK in Docker (the byte patch unlocks this for A1 footage)
- → Equirectangular .mp4/.jpg
- → ffmpeg cubemap split (6 perspective faces per frame, 90° FOV)
- → COLMAP automatic_reconstructor (SfM, runs CPU-only on Mac)
- → Brush splat training (Apple Silicon native, WGPU/Metal)
- → .ply Gaussian Splat
All open source. All Mac-native (Brush + COLMAP + ffmpeg) or Mac-via-Docker (MediaSDK runs as x86_64 Linux under Rosetta — no GPU needed). No CUDA, no Linux box, no cloud.
Before you start: get the Insta360 SDK
The MediaSDK isn't a free public download — you have to apply for access through Insta360's developer portal. The process is light: visit insta360.com/sdk, fill out the application form (it'll ask what platform you want, what you're building, and basic contact info — a personal/research project description is fine), and wait. Approval took me about 12 hours, but it could be longer. They email you a link to download the SDK package — the Mac/Linux flavor is the one you want for this pipeline (specifically, libMediaSDK-dev-X.Y.Z-amd64.deb for Linux x86_64, which is what runs inside our Docker container).
You don't need to mention the A1 in your application — applying as an Insta360 SDK developer is sufficient, and the byte patch in this post handles the A1-specific part.
What was blocking step 1: A1 isn't in the public Insta360 SDK
Drop an A1 .insv into the public MediaSDK (libMediaSDK-dev-3.1.1.0-amd64.deb, latest as of November 2025) and you get:
CameraName is empty. CameraLensType is Unknown.
Origin Offset: ..._10496_5248_155_...
no implemention!
ErrorCode:1; ErrorDescr: offset is not support
A1 reports lens type 155. The SDK supports lens types 41, 71, 113, 283, etc. (X3, X4, X5, ONE-RS variants, etc.). 155 isn't in the dispatcher — Antigravity is a partner-OEM camera and its lens profile only lives inside Antigravity-branded software (the Studio app, the Reframe Premiere plugin, the Android app — confirmed by tearing all three apart, more on that below).
The unlock: a 16-byte byte-patch per file
A1 optics are nearly identical to the X4 (8K dual-fisheye, square sensor per lens, ~2.4% sensor-size delta). If you tell the SDK "this is X4 footage", it applies X4's geometry math to A1's actual per-unit calibration values (which are stored in the same offset string and remain correct), and the result is a clean stitch with no perceptible distortion at the seam.
For .insv (video):
Find: "_10496_5248_155_"
Replace: "_10496_5248_113_" # X4 video lens type
Count: 4 occurrences in the file's trailer
Length: same — preserves all MP4 box offsets
Then a 180° rotation post-process (ffmpeg -vf "vflip,hflip") to fix orientation — the A1 is drone-mounted upside-down vs the X4's handheld-upright assumption.
For .insp (photo): the SDK's image-stitcher path enforces stricter trailer integrity than the video path, and the same byte patch alone gets rejected. The trick: rename .insp → .insv so the file routes through the video stitcher. The SDK demuxes the file as a 1-frame "video", produces a 1-frame equirect MP4, and you extract the JPG with ffmpeg. Same 155→113 patch, same vflip+hflip.
That's the entire unlock. Both A1 file formats handled by one byte-replacement + one orientation correction.
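As a minimal sketch of the unlock (assuming the calibration string appears verbatim in the trailer as shown above; the function names are mine, not from any SDK):

```python
# Sketch of the 155 -> 113 lens-type byte patch for A1 .insv/.insp files.
# Assumption: the string "_10496_5248_155_" appears verbatim in the file
# trailer. Old and new markers are the same length, so no MP4 box offsets
# shift. Helper names are illustrative, not part of the Insta360 SDK.

OLD = b"_10496_5248_155_"   # A1 video lens type
NEW = b"_10496_5248_113_"   # X4 video lens type, same byte length

def patch_lens_type(data: bytes) -> bytes:
    """Replace every A1 lens-type marker with the X4 one."""
    assert len(OLD) == len(NEW), "patch must preserve file length"
    return data.replace(OLD, NEW)

def patch_file(src: str, dst: str) -> int:
    """Patch src into dst; return the number of markers replaced."""
    with open(src, "rb") as f:
        data = f.read()
    with open(dst, "wb") as f:
        f.write(patch_lens_type(data))
    return data.count(OLD)
```

Because the replacement is length-preserving, the patched file can be handed to the SDK without touching any other trailer bookkeeping.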
The rest of the pipeline (briefly)
Once stitching works, the rest is conventional:
- Cubemap split — ffmpeg's v360=e:flat:h_fov=90:v_fov=90:yaw=N:pitch=N:w=1536:h=1536 filter splits each equirect frame into 6 perspective faces. Cubemap faces are easier for SfM than equirect because COLMAP doesn't natively support equirectangular cameras — it wants pinhole-style perspective views.
- COLMAP SfM — automatic_reconstructor with SIMPLE_PINHOLE intrinsics (focal=cx=cy=768 for 1536² faces at 90° FOV). On Apple Silicon CPU, ~30 minutes for ~600 cubemap images.
- Brush splat training — point it at the COLMAP workspace; default 30K steps trains in ~30 minutes on Apple Silicon GPU. Outputs .ply consumable by any standard splat viewer.
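The six face orientations and the matching pinhole focal length can be sketched like this (the yaw/pitch values are the standard cubemap directions; the face names are my own labels, not from any tool):

```python
import math

# Standard cubemap directions as (yaw, pitch) in degrees.
# Face names are my own convention.
FACES = {
    "front": (0, 0), "right": (90, 0), "back": (180, 0),
    "left": (270, 0), "up": (0, 90), "down": (0, -90),
}

def v360_filter(yaw: float, pitch: float, size: int = 1536) -> str:
    """Build the ffmpeg v360 filter string for one 90-degree face."""
    return (f"v360=e:flat:h_fov=90:v_fov=90:"
            f"yaw={yaw}:pitch={pitch}:w={size}:h={size}")

def pinhole_focal(size: int = 1536, fov_deg: float = 90.0) -> float:
    """Focal length in pixels for a square image at the given FOV:
    f = (w/2) / tan(fov/2)."""
    return (size / 2) / math.tan(math.radians(fov_deg) / 2)

# At 90 degrees FOV and 1536 px, f = 768 / tan(45 deg) = 768 -- which is
# why SIMPLE_PINHOLE with focal = cx = cy = 768 works for COLMAP here.
```

This is also why the 90° FOV choice is convenient: tan(45°) = 1, so focal and principal point coincide at half the image size.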
What makes a good A1 splat
A 360° camera captures every direction in every frame, so what matters is camera-body trajectory, not orientation. For splats:
- Drone hovering in one spot — ❌ no parallax
- Linear flyover — ❌ each scene point seen from a narrow angle range
- Orbit around a target — ✅ ideal — convergent multi-view
- Multiple-altitude orbits + figure-8 — ✅ best for outdoor scenes / buildings
I learned this by training a splat from a handheld walk-through-multiple-rooms test capture. It was a needle-storm. Don't walk; orbit.
Nitty-gritty deep dive (for humans and LLMs)
File format
A1 .insv and .insp files are MP4 containers with a proprietary Insta360 trailer. The trailer ends with:
[ ... trailer body (length = N bytes) ... ]
[ <length:u32 LE> ][ 0x03 0x00 0x00 0x00 ][ "8db42d69...026bf" (32 ASCII hex chars) ]
^---- file end
The magic UUID 8db42d694ccc418790edff439fe026bf is the same across all Insta360-format files (X3, X4, X5, A1).
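A quick sanity check of that 40-byte tail can be sketched in a few lines (my reading of the layout above; the length-field interpretation is an assumption, and the function name is mine):

```python
import struct

# Same magic across all Insta360-format files (X3, X4, X5, A1).
MAGIC = b"8db42d694ccc418790edff439fe026bf"

def parse_trailer_tail(data: bytes) -> int:
    """Validate the 40-byte file tail per the layout described above:
    [<length:u32 LE>][03 00 00 00][32-char ASCII-hex magic], and return
    the trailer length field. Interpretation is a sketch, not SDK code."""
    tail = data[-40:]
    if tail[8:] != MAGIC:
        raise ValueError("not an Insta360-format file")
    if tail[4:8] != b"\x03\x00\x00\x00":
        raise ValueError("unexpected flag bytes before magic")
    (length,) = struct.unpack("<I", tail[:4])
    return length
```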
Inside the trailer, calibration is stored in TLV-like records. Three header types observed:
- ba 03 ?? ?? — 13-float entry (image stitcher format)
- b2 03 ?? ?? — 16-float entry (video stitcher format, primary)
- c2 03 ?? ?? — 16-float entry (variant/duplicate)
Each entry's payload is an ASCII offset string of the form:
2_<float>_<float>_..._W_H_<lens_type>_<float>_..._W_H_<lens_type>_<entry_trailer>
Where 2_ indicates dual-lens calibration, the floats are per-lens calibration values (focal, principal point, rotation, optionally translation + distortion coefficients), W × H is the sensor resolution, and <lens_type> is the integer the SDK dispatches on.
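Given that layout, pulling the lens-type integers out of an offset string is a small regex exercise (a sketch under the assumption that the W_H_lens triplet appears literally as described; function names are mine):

```python
import re

def lens_types(offset_string: str, width: int, height: int) -> list[int]:
    """Return every lens-type integer that follows the given W_H sensor
    dimensions in an Insta360 offset string (one per lens for dual-lens
    calibration). Based on the string layout described above."""
    return [int(m) for m in
            re.findall(rf"_{width}_{height}_(\d+)_", offset_string)]

def swap_lens_type(offset_string: str, width: int, height: int,
                   old: int, new: int) -> str:
    """Replace old lens type with new after each W_H marker. Old and new
    must have the same digit count so string length is preserved."""
    assert len(str(old)) == len(str(new)), "must preserve length"
    return offset_string.replace(f"_{width}_{height}_{old}_",
                                 f"_{width}_{height}_{new}_")
```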
Camera ↔ lens-type mapping (observed in real files)
- Insta360 X3 — sensor 6080×3040, image lens type 41
- Insta360 X4 — sensor 11904×5952, image lens type 71, video lens type 113
- Insta360 X5 — sensor 11904×5952, no .insp (uses .dng + camera-stitched .jpg)
- Antigravity A1 — sensor 10496×5248, image lens type 112, video lens type 155
X5 dispenses with the .insp format entirely — saves DNG (raw) + camera-stitched JPG instead. So the X5 photo workflow is "use the camera's JPG directly, no SDK involvement needed."
What was tried that didn't work (to save others time)
- Brute-force lens-type integers 100–600 against .insp — all rejected. The image stitcher dispatcher uses a different lens-type table than the video one.
- Patch image-format lens 112 → known supported values (41 X3 image, 71 X4 image) — rejected with ErrorCode:11.
- Corrupt the b2 03 TLV header bytes to make the SDK skip the entry — rejected. The SDK validates trailer structural integrity.
- Delete the entire 16-float TLV entries and update the top-level trailer length pointer — rejected. There are nested length/offset references in the binary metadata block at the trailer end that also need updating.
Why the .insp rename trick works
The MediaSDK CLI dispatches by file extension (visible in the public SDK's example main.cc):
if (suffix == "insp" || suffix == "jpg") {
auto image_stitcher = std::make_shared<ImageStitcher>();
// ... image stitcher path with strict trailer validation
} else if (suffix == "insv") {
auto video_stitcher = std::make_shared<VideoStitcher>();
// ... video stitcher path with looser validation
}
Renaming forces the looser-validation path. The single JPEG inside the .insp gets demuxed as a 1-frame video, stitched with X4 geometry (because we patched 155→113), and emitted as a 1-frame MP4. Extract the JPG. Done.