Automate What You Can See

If the camera can see it, you can automate it. No 3D scanners, no laser profilers. A camera, a vision model, your rules — running on one device at the machine.

On-premises · Real-time · Multi-robot · No cloud

Rocket Welder Automation Platform — camera preview with vision overlay, robot status, PLC tags, and program workspaces on a single screen

The Camera Finds the Geometry.
Your Program Computes the Path.
The Robot Follows.

The camera is the sensor. The vision model extracts geometry from every frame. Your program turns that geometry into robot waypoints or real-time corrections. The robot executes. The camera sees the result. The loop repeats.

1

Camera

Camera frame capturing the welding process

Captures the process. Every frame.

Welding: weld joint
Inspection: part surface
Measurement: component dimensions
2

Vision Model

Camera frame with keypoints and polygons overlaid by vision model

Extracts geometry of recognized objects — keypoints, edges, dimensions, polygons — as numbers your program can use.

Welding: seam edges, gap 2.3 mm
Inspection: defect at 3 o'clock
Measurement: width 45.2 mm ±0.1
3

Your Program

IDE screenshot with program code computing robot waypoints

Keypoints become 3D points in robot frame. You compute waypoints — or push real-time corrections.

Welding: follow the real seam
Inspection: trigger reject on defect
Measurement: flag out-of-tolerance
4

Robot / PLC Executes

Robot and PLC executing computed path with live dashboard

Robot follows the computed path — or PLC triggers based on what the vision model measured.

Welding: robot corrects 1.7 mm
Inspection: PLC diverts to reject
Measurement: log result, stop the line if out of tolerance
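The four-step loop above can be sketched in code. This is a minimal illustration, assuming hypothetical class and function names; the platform's actual programs are written in C#, and this sketch uses Python only for brevity.

```python
# Illustrative sketch of the capture -> extract -> compute -> execute loop.
# Every name here is hypothetical; this is not the platform's SDK.

class Camera:
    def capture(self):
        return "frame"  # stand-in for a raw image frame

class VisionModel:
    def extract(self, frame):
        # geometry as numbers the program can use: seam keypoints and gap
        return {"seam": [(0.0, 1.0), (0.5, 1.2)], "gap_mm": 2.3}

class Robot:
    def execute(self, waypoints):
        return f"moving through {len(waypoints)} waypoints"

def compute_waypoints(geometry):
    # turn 2D keypoints into robot waypoints (placeholder transform)
    return [(x, y, 0.0) for x, y in geometry["seam"]]

camera, model, robot = Camera(), VisionModel(), Robot()
frame = camera.capture()                 # 1. camera captures the process
geometry = model.extract(frame)          # 2. vision model extracts geometry
waypoints = compute_waypoints(geometry)  # 3. your program computes the path
status = robot.execute(waypoints)        # 4. robot executes; the loop repeats
```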

Two Ways to Close the Loop

Path Planning

Scan → Measure → Execute

Camera scans the part. Vision model finds the geometry. Your program computes the full path with corrections. Robot executes. Done in seconds.

Use cases: seam finding before welding, pick-and-place offset correction, part localization.

Real-Time Steering

Real-Time Correction

Robot is already moving. Camera and vision model track the process continuously. Your program pushes corrections to the robot 25 times per second. The robot adapts mid-motion.

Use cases: seam tracking during welding, adaptive torch correction, real-time quality-based parameter adjustment.
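The steering cycle described above is, at heart, a rate-limited loop: measure on each frame, push the correction, then sleep away whatever is left of the 40 ms period. A minimal Python sketch, assuming invented function names and a fixed measured offset:

```python
import time

CONTROL_HZ = 25              # corrections pushed 25 times per second
PERIOD_S = 1.0 / CONTROL_HZ  # 40 ms budget per cycle

def run_tracking(get_frame, measure_offset_mm, push_correction, cycles):
    # The robot is already moving; each cycle measures and corrects mid-motion.
    for _ in range(cycles):
        start = time.monotonic()
        offset = measure_offset_mm(get_frame())
        push_correction(offset)  # e.g. lateral torch correction in mm
        # sleep away what remains of the period to hold the 25 Hz cadence
        time.sleep(max(0.0, PERIOD_S - (time.monotonic() - start)))

# toy run with stub functions standing in for camera, vision model, and robot
sent = []
run_tracking(lambda: "frame", lambda f: 1.7, sent.append, cycles=3)
```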

Don't Take Our Word for It

Real welding, real solutions. We can't name every customer, but we can show how it could work on your materials. Send us your parts and we'll schedule a demo, on-site or remote.

Runs on Neuron — at the Machine

Compact. DIN-rail mountable. Air-gapped capable.

Neuron device, DIN-rail mounted, cables connected

One box. Plug in the camera, connect the robot, wire the PLC. Everything runs here — the vision processing, your program, the robot commands. No cloud, no server room, no IT department.

Camera, robot, PLC — plug in via Ethernet. GigE Vision, ABB EGM, Fairino, Modbus TCP.

Vision model on-device — GPU always included. NVIDIA for high-performance inference at 25+ FPS. No cloud round-trip.

Works air-gapped — no internet required for operation. Connect when you need it — remote support same day, over-the-air updates.

Open a browser — access the full platform from any device on the network. Program, monitor, record.

Works With

ABB IRB robot arm
ABB: EGM, 250 Hz feedback (available now)

Fairino FR5 cobot
Fairino: direct TCP control (available now)

Universal Robots cobot
Universal Robots: URScript integration (coming soon)

KUKA robot
KUKA: RSI interface (coming soon)

Any PLC · Any GigE Vision / USB camera · Rocket Retina welding camera

Frequently Asked Questions

Technical answers for automation engineers.

Is the platform only for welding?

No. Welding is our first and deepest vertical. But the platform is general-purpose. Any workflow that involves a camera, a vision model, and industrial hardware (robot, PLC, or both) can run on Rocket Welder. Quality inspection, measurement, assembly verification — the same platform, same tools.

What is Neuron?

A compact edge computer with a GPU that runs the Rocket Welder platform. Mounts on DIN rail at the machine. Industrial-grade, air-gapped capable. NVIDIA GPU available for high-performance vision inference at 25+ FPS. Camera input, vision processing, your program, robot and PLC control — all run on this one device.

Which robots are supported?

ABB IRB series via EGM (250 Hz feedback) and Fairino cobots (direct TCP control). Both are accessed through a unified IRobot interface — swap robot brands without rewriting your program. Universal Robots and KUKA support is planned.

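The unified-interface idea can be sketched as follows. The real IRobot interface is C# and its members aren't documented here, so the Python classes and method names below are invented purely to illustrate swapping brands without rewriting the program.

```python
from abc import ABC, abstractmethod

class RobotInterface(ABC):
    # stand-in for the C# IRobot abstraction; method names are invented
    @abstractmethod
    def move_to(self, waypoint): ...

class AbbEgmRobot(RobotInterface):
    def move_to(self, waypoint):
        return f"EGM move to {waypoint}"   # ABB: EGM channel, 250 Hz feedback

class FairinoRobot(RobotInterface):
    def move_to(self, waypoint):
        return f"TCP move to {waypoint}"   # Fairino: direct TCP control

def weld_program(robot: RobotInterface, waypoints):
    # the same program drives either brand through the shared interface
    return [robot.move_to(wp) for wp in waypoints]

path = [(100.0, 50.0, 10.0), (120.0, 50.0, 10.0)]
```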
Do I need to know how to program?

You need to know your process — welding rules, inspection criteria, measurement tolerances. The AI assistant helps you turn those rules into a working program. Under the hood it's C#, but you describe what should happen in plain language. The assistant writes the code. You review and adjust.

How does vision data get to the robot?

The camera captures frames and the vision model detects keypoints and geometry. With a laser range detector, our SDK returns keypoints directly in the robot coordinate system — no manual calibration math needed. Your program reads 3D points it can send straight to the robot. Two modes: path planning (scan the part, get keypoints in robot coordinates, compute the full path, execute) and real-time correction (vision model and program run every frame while the robot moves, pushing corrections continuously). For ABB robots via EGM, corrections run at 250 Hz.

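As a toy illustration of what "keypoints directly in the robot coordinate system" means: a camera-frame point passes through a rigid transform into the robot base frame. The SDK does this internally from laser range data; the transform matrix and numbers below are made up for the example.

```python
import numpy as np

# Made-up camera-to-robot transform: axes aligned, camera mounted at
# (200, -50, 300) mm in the robot base frame.
T_CAM_TO_ROBOT = np.array([
    [1.0, 0.0, 0.0, 200.0],
    [0.0, 1.0, 0.0, -50.0],
    [0.0, 0.0, 1.0, 300.0],
    [0.0, 0.0, 0.0,   1.0],
])

def to_robot_frame(point_cam_mm):
    # homogeneous transform: append 1, multiply, drop the last component
    p = np.append(point_cam_mm, 1.0)
    return (T_CAM_TO_ROBOT @ p)[:3]

seam_point = to_robot_frame(np.array([10.0, 20.0, 0.0]))
# seam_point is now a 3D point in robot coordinates, ready to send on
```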
Can it run without an internet connection?

Yes. Everything runs on the local Neuron device. No cloud connection required. Vision processing, recording, robot control — all on-premises. Updates can be pushed via USB or local network.