Automate What You Can See
If the camera can see it, you can automate it. No 3D scanners, no laser profilers. A camera, a vision model, your rules — running on one device at the machine.
On-premises · Real-time · Multi-robot · No cloud

The Camera Finds the Geometry.
Your Program Computes the Path.
The Robot Follows.
The camera is the sensor. The vision model extracts geometry from every frame. Your program turns that geometry into robot waypoints or real-time corrections. The robot executes. The camera sees the result. Loop repeats.
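The loop above can be sketched in a few lines. Everything here — the function names, the 0.05 mm/px scale, the simple proportional rule — is an illustrative assumption, not the platform's actual API:

```python
# One pass through the loop: sense -> extract -> decide -> act.

def extract_geometry(frame):
    """Stand-in vision model: return the seam centreline position (px)."""
    return frame["seam_x_px"]

def plan_correction(seam_x_px, target_x_px=320, mm_per_px=0.05):
    """Your program: pixel error -> lateral correction in millimetres."""
    return (target_x_px - seam_x_px) * mm_per_px

def step(frame):
    """One camera frame in, one robot correction out."""
    return plan_correction(extract_geometry(frame))

# A seam detected 34 px right of target yields a ~1.7 mm correction,
# applied before the next frame arrives.
```

In production this runs once per frame; one cycle is enough to show the data flow.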

Camera
Captures the process. Every frame.
| Welding | weld joint |
| Inspection | part surface |
| Measurement | component dimensions |

Vision Model
Extracts geometry of recognized objects — keypoints, edges, dimensions, polygons — as numbers your program can use.
| Welding | seam edges, gap 2.3mm |
| Inspection | defect at 3 o'clock |
| Measurement | width 45.2mm +/-0.1 |

Your Program
Keypoints become 3D points in robot frame. You compute waypoints — or push real-time corrections.
| Welding | follow the real seam |
| Inspection | trigger reject on defect |
| Measurement | flag out-of-tolerance |

Robot / PLC Executes
Robot follows the computed path — or PLC triggers based on what the vision model measured.
| Welding | robot corrects 1.7mm |
| Inspection | PLC diverts to reject |
| Measurement | log result, stop line if OOT |
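One way the "keypoints become 3D points in robot frame" step can work with a single calibrated camera: if the part sits on a known work plane, a one-time calibration yields a 3x3 homography that maps pixels straight to robot-frame coordinates on that plane. A minimal sketch, with hypothetical calibration values:

```python
def pixel_to_robot(u, v, H, z_plane=0.0):
    """Map an image keypoint (u, v) to robot-frame (x, y, z) on the work plane."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w, z_plane)   # perspective divide

# Hypothetical calibration result: 0.05 mm/px scale plus a fixed offset.
H_CAL = [[0.05, 0.0, 100.0],
         [0.0, 0.05, 200.0],
         [0.0, 0.0, 1.0]]
```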
Two Ways to Close the Loop
Path Planning
Scan -> Measure -> Execute
Camera scans the part. Vision model finds the geometry. Your program computes the full path with corrections. Robot executes. Done in seconds.
Use case: seam finding before welding, pick-and-place offset correction, part localization.
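The "compute the full path" step can be sketched as resampling: seam keypoints (already in robot frame) become evenly spaced waypoints by linear interpolation. The 5 mm spacing is an illustrative choice, not a platform default:

```python
import math

def seam_to_waypoints(keypoints, spacing_mm=5.0):
    """Resample a polyline of (x, y, z) points at roughly spacing_mm steps."""
    waypoints = [keypoints[0]]
    for a, b in zip(keypoints, keypoints[1:]):
        steps = max(1, round(math.dist(a, b) / spacing_mm))
        for i in range(1, steps + 1):
            t = i / steps
            waypoints.append(tuple(a[j] + t * (b[j] - a[j]) for j in range(3)))
    return waypoints
```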
Real-Time Steering
Track -> Correct -> Adapt
Robot is already moving. Camera and vision model track the process continuously. Your program pushes corrections to the robot 25 times per second. The robot adapts mid-motion.
Use case: seam tracking during welding, adaptive torch correction, real-time quality-based parameter adjustment.
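The steering loop can be sketched as a fixed-rate cycle: read the tracked seam offset, clamp a proportional correction, push it to the robot, sleep out the remainder of the 40 ms period. The gain, the 0.5 mm per-cycle clamp, and the tracker/robot interfaces are illustrative assumptions:

```python
import time

def steer(tracker, robot, hz=25, gain=0.8, max_step_mm=0.5, cycles=None):
    """Push one path correction per cycle until stopped (or for `cycles`)."""
    period = 1.0 / hz
    n = 0
    while cycles is None or n < cycles:
        error_mm = tracker.lateral_error_mm()              # vision: seam offset
        step = max(-max_step_mm, min(max_step_mm, gain * error_mm))
        robot.offset_path(step)                            # robot adapts mid-motion
        time.sleep(period)
        n += 1
```

The per-cycle clamp keeps any single correction small, so one bad frame cannot jerk the torch off the seam.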
The Platform
How you go from hardware on the shop floor to a system you trust.

01
Setup
Industrial grade. We set it up. After that — plug and play.
First installation is on us. We connect the camera, the robot, the PLC. We calibrate. After that, if something changes — swap a camera, add a device — you can do it yourself if you want to. The AI assistant is always there to help.
"I don't need to worry about the hardware."

02
Program
You describe the rules. AI writes the code.
You know your process: "when the gap is wider, oscillate more." You describe the rule. The AI assistant turns it into a working program. You review the code — it's readable. Under the hood it's C#, but you don't need to be a programmer. You need to know your process.
"I don't need to be a programmer."
See how ->
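A rule like "when the gap is wider, oscillate more" ends up as code you can read. The platform generates C#; this Python sketch, with made-up thresholds and amplitudes, only shows the shape such a rule takes:

```python
def weave_amplitude_mm(gap_mm):
    """Wider gap -> wider torch oscillation, within limits."""
    if gap_mm < 1.0:
        return 1.5                        # tight joint: narrow weave
    if gap_mm < 3.0:
        return 1.5 + (gap_mm - 1.0)       # widen 1 mm of weave per mm of gap
    return 3.5                            # cap the weave on large gaps
```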
03
Operate
See what the robot sees. Trust the machine.
The operator looks at the HMI. The vision overlay shows keypoints on the seam, pool boundary, correction arrows. The operator sees that the machine understands the process. It's not a black box. When something looks wrong, it's visible on screen before the weld goes bad.
"I can see the machine understands."
See how ->
Don't Take Our Word for It
Real welding, real solutions. We can't name every customer, but we can show how it could work on your materials. Send us your parts — we can schedule a demo on-site or remote.
Runs on Neuron — at the Machine
Compact. DIN-rail mountable. Air-gapped capable.

One box. Plug in the camera, connect the robot, wire the PLC. Everything runs here — the vision processing, your program, the robot commands. No cloud, no server room, no IT department.
Camera, robot, PLC — plug in via Ethernet. GigE Vision, ABB EGM, Fairino, Modbus TCP.
Vision model on-device — GPU always included. NVIDIA for high-performance inference at 25+ FPS. No cloud round-trip.
Works air-gapped — no internet required for operation. Connect when you need it — remote support same day, over-the-air updates.
Open a browser — access the full platform from any device on the network. Program, monitor, record.
Works With

| ABB | EGM, 250 Hz feedback | Now |
| Fairino | Direct TCP control | Now |
| Universal Robots | URScript integration | Coming soon |
| KUKA | RSI interface | Coming soon |
Modbus TCP for any PLC · any GigE Vision / USB camera · Rocket Retina welding camera
Frequently Asked Questions
Technical answers for automation engineers.