
A fully automated UI pipeline between UI designers and Unreal Engine 5.
Hey folks.
I'm working on a UE editor plugin called UIWidgetBuilderEditor.
The goal is to build a fully automated UI pipeline between designers and Unreal Engine. Instead of rebuilding everything manually in UMG, the system imports PSD/JSON layouts and automatically generates:
UMG widget hierarchies
Blueprint logic
panel/screen workflows
modal systems
DPI/platform setup
layout wrappers (SafeZone / ScaleBox)
visibility & interaction helpers
The important part:
A UI designer can test and iterate on layouts directly inside Unreal Engine with very little engine knowledge.
The workflow is based on a structured naming convention from the design side.
Example naming roles:
GRP_ → UI Groups / Screens
PNL_ → Panels
BTN_ → Buttons
TXT_ → Text widgets
IMG_ → Images
ModalBG_ → Modal overlays
The importer reads those naming rules and automatically builds the proper runtime hierarchy and Blueprint logic.
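To make the prefix-to-widget mapping concrete, here's a minimal sketch of how the importer could resolve a design-layer name into a widget role. This is an illustrative Python mock-up, not the plugin's actual code: the `PREFIX_ROLES` table mirrors the convention above, but the role names and the `resolve_role` helper are assumptions standing in for the real UMG class selection.

```python
# Hypothetical prefix-to-role table mirroring the naming convention above.
# Role names are illustrative stand-ins for the UMG types the plugin creates.
PREFIX_ROLES = {
    "GRP_": "UIGroup/Screen",
    "PNL_": "Panel",
    "BTN_": "Button",
    "TXT_": "TextBlock",
    "IMG_": "Image",
    "ModalBG_": "ModalOverlay",
}

def resolve_role(layer_name: str) -> str:
    """Return the widget role for a design-layer name, or 'Unknown'."""
    for prefix, role in PREFIX_ROLES.items():
        if layer_name.startswith(prefix):
            return role
    return "Unknown"

print(resolve_role("BTN_Apply"))     # Button
print(resolve_role("ModalBG_Quit"))  # ModalOverlay
```

In practice the real importer would map each role to a concrete widget class and container rules, but the lookup itself can stay this simple.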
Example:
You can build a full Settings UI in Photoshop/Figma, export JSON, and automatically generate:
settings panels
navigation logic
modal backgrounds
visibility switching
input/cursor handling
runtime widget hierarchy
without manually rebuilding the entire UI in UMG.
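For a rough idea of what "JSON hierarchy reconstruction" means here, the sketch below walks a nested layout export and produces a flat build plan. The JSON schema (`name`/`children` fields) and the Settings layout are hypothetical examples of mine, not the plugin's actual export format.

```python
import json

# Hypothetical JSON export for a minimal Settings screen. The schema
# ("name", "children") is an assumption, not the plugin's real format.
LAYOUT = json.loads("""
{
  "name": "GRP_Settings",
  "children": [
    {"name": "PNL_Audio", "children": [
      {"name": "TXT_MasterVolume", "children": []},
      {"name": "BTN_Apply", "children": []}
    ]},
    {"name": "ModalBG_ConfirmReset", "children": []}
  ]
}
""")

def build_plan(node, depth=0, plan=None):
    """Walk the exported hierarchy and record one entry per widget to create."""
    if plan is None:
        plan = []
    plan.append(("  " * depth) + node["name"])
    for child in node["children"]:
        build_plan(child, depth + 1, plan)
    return plan

for line in build_plan(LAYOUT):
    print(line)
```

The real plugin would construct UMG widgets at each step instead of strings, but the traversal order is the same: parents first, so containers exist before their children are attached.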
The current pipeline already supports:
✅ JSON hierarchy reconstruction
✅ automatic UMG generation
✅ generated Blueprint logic
✅ per-panel usage settings
✅ modal background generation
✅ screen-type detection from layout groups
✅ visibility logic generation
✅ platform/DPI setup
✅ SafeZone + ScaleBox wrapper generation
The goal is to turn this:
PSD / Figma / JSON
→ automated Unreal UI generation
→ production-ready UI scaffolding
instead of spending hours rebuilding layouts manually.
Currently exploring:
runtime screen managers
UI state machines
animation generation
automatic navigation
responsive layouts
design-to-runtime workflows
CommonUI integration
I'm curious what other UE UI developers and designers would want from a tool like this.
What features would save the most production time in your UI workflow?