Rocket Platform

Integrated environment for AI-powered welding automation and inspection.

Rocket Retina

Industrial welding camera sets with AI compute and HDR monitor.

  • 4 configurations from Core to Max
  • HDR dual-exposure — see pool and surroundings
  • GigE Vision + PoE, 100 m cable run
Rocket Neuron

AI compute units for real-time torch correction at 5–50 Hz.

  • 3 tiers: Mini (5 Hz), Full (25 Hz), Max (50 Hz)
  • DIN-rail or 19" rack mount
  • Cobot + robot controller integration
Accessories

Lenses, protective glass, enclosures, and industrial CAT7 cables.

  • C-Mount lenses: 16 mm, 25 mm, 35 mm
  • Protective glass refill (10-pack)
  • CAT7 cable up to 50 m

Rocket Cortex

Edge Software for Adaptive Welding Control

Rocket Cortex is the edge software that turns your camera-equipped welding station into an adaptive welding brain. It sees the seam, tracks the weld pool, and sends real-time corrections to your PLC or robot — for MIG/MAG, TIG, and plasma processes. Eliminate defects at the source instead of catching them downstream. Every weld can be logged with full parameter traceability.

Subscription

Continuous AI improvements delivered to your shop floor. Your engineers write the rules — the AI writes the code.

Monthly

$330/mo

Annual

$295/mo

billed annually — $3,540/y

AI Detection Through the Arc

Real-time object detection at 25 Hz identifies edge, seam, weld pool, torch, and electrode/wire — right through the welding arc. These are the inputs for path calculation and live correction.

Path Calculation & Real-Time Correction

Two modes: pre-weld path calculation (find start, stop, and turn points from a camera scan) and real-time correction during welding (compensate for heat distortion, fit-up variation, and gap deviations). Or both combined.

Your IWE Writes the Rules

Describe what you need in natural language — 'when the seam is wider than 2 mm, oscillate the torch' — and the AI assistant generates the adaptive logic. A built-in IDE with autocomplete lets advanced users fine-tune every detail. No software team required.
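
As an illustration of the kind of adaptive logic the assistant produces, here is a minimal sketch in Python. This is not the real SDK — the platform generates C# under the hood, and `TorchCommand`, `adapt_torch`, and the field names are hypothetical:

```python
# Illustrative sketch only — names and API are hypothetical, not the Rocket SDK.
from dataclasses import dataclass

@dataclass
class TorchCommand:
    oscillate: bool = False            # weave the torch side to side
    oscillation_width_mm: float = 0.0  # weave amplitude

def adapt_torch(seam_width_mm: float) -> TorchCommand:
    """'When the seam is wider than 2 mm, oscillate the torch.'"""
    if seam_width_mm > 2.0:
        # widen the weave with the gap, capped at an assumed safe maximum
        return TorchCommand(oscillate=True,
                            oscillation_width_mm=min(seam_width_mm, 6.0))
    return TorchCommand()
```

The natural-language rule maps one-to-one onto the conditional; the built-in IDE is where an advanced user would then tune thresholds and caps like these.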

Protocols & Robots

Connects via ADS, Modbus TCP/RTU, OPC UA, MQTT, and NATS. Currently supports Fairino cobots and ABB robots — if yours has an API, we integrate it.

Contact Us

Rocket Trainer

AI Model Training Platform

Rocket Trainer is the core of your automation workflow — train, simulate, and deploy AI models that talk directly to your machines.

Smart Dataset Management

Track exactly what data was used for each model version.

AI-Assisted Labeling

Use previous models to auto-label new footage — just review and correct.

Simulation Before Deployment

Test how your model performs on historical recordings before going live.

One-Click Deployment

Deploy models directly to your camera + edge compute unit.

Contact Us

Frequently Asked Questions

Answers covering both the camera and the automation platform. For the full picture, visit the dedicated pages.

Rocket Retina — welding camera

More on the welding-camera page →

Which welding processes are supported?
MIG/MAG, TIG, and plasma. Arc-rated optics handle all common processes, from low-current TIG at 80 A to high-current MIG spray transfer at 350 A+. The camera includes the Neuron Mini processing unit, which merges two exposures into one image in real time.

What does HDR dual-exposure give me?
Neuron Mini captures two frames at different exposures and merges them in real time. You see both the bright arc and the dark surroundings simultaneously — no detail lost in either. The processing runs in software, so it improves with updates.

Does it work with my power source?
Yes. Rocket Retina is power-source-agnostic. It works with Fronius, Lincoln, Miller, ESAB, Kemppi, OTC Daihen, or any other brand. No lock-in to any specific equipment manufacturer.

Can recordings serve as quality documentation?
Yes. Every recording includes full WPS-style parameters (process, material, current, voltage, wire, gas, travel speed). This makes each recording a traceable quality record supporting ISO 3834, EN 1090, and EN 15085 documentation requirements.

What's included in the camera set?
Rocket Retina camera + Neuron Mini processing unit + enclosure (standard or mirrored) + protective window kit + installation support (Poland). The Neuron Mini is required — it processes the video feed and merges the two exposures into one image.

How long can the cable run be?
Up to 100 m with a CAT7 Ethernet cable. A single cable carries both video and power (PoE) — no separate power supply needed at the camera.

How do I replace the protective window?
Tool-free. Slide out the old window, slide in the new one. Spare windows are available as consumables — standard maintenance for any welding environment.

How does the two-exposure capture work?
The camera takes two exposures per frame — a short one that captures the pool border without blowout, and a long one that captures the surroundings. Both are merged in real time, so you see the full picture in one image. The processing runs in software, so it improves with updates.
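
A minimal sketch of the idea in Python with NumPy — this is not the Neuron Mini's actual algorithm, and the saturation threshold and log tone map are assumptions. Where the long exposure blows out (the arc), substitute the short exposure scaled by the exposure ratio, then compress the result for 8-bit display:

```python
# Hedged sketch of dual-exposure fusion — illustrative, not the shipped algorithm.
import numpy as np

def merge_exposures(short: np.ndarray, long: np.ndarray,
                    ratio: float, sat: int = 250) -> np.ndarray:
    """short/long: uint8 frames; ratio = long exposure time / short."""
    saturated = long >= sat                       # blown-out arc region
    fused = long.astype(np.float32)
    # recover arc detail from the short exposure, rescaled to the long one
    fused[saturated] = short.astype(np.float32)[saturated] * ratio
    # simple log tone map back to 8-bit for display
    fused = np.log1p(fused) / np.log1p(fused.max()) * 255.0
    return fused.astype(np.uint8)
```

The real pipeline additionally involves modified optics and integrated illumination, but the core trade is the same: two well-exposed captures beat one over-stretched sensor.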

Is it really better than a traditional welding camera?
Absolutely. We redesigned how a welding camera is built — no special cooling, no pumps, proper enclosure included out of the box. Plus you get all the digital capabilities for free: recording, viewing from other devices on the network, and lossless transmission. The entire signal path — camera to Neuron to monitor — is fully digital. No analog degradation, no signal loss.

Can it survive continuous production use next to the arc?
Yes — Rocket Retina is built for continuous exposure to a MIG/MAG arc, not just for a short demo. IP65 on both Standard and Mirrored enclosures, a compact 35 × 35 × 122 mm body that fits torch-mounted, fixture-mounted, or robot-end-effector setups, and an interchangeable protective window that takes spatter hits as a consumable (tool-free swap).

On cooling: traditional welding cameras try to capture the full arc-to-surroundings dynamic range in a single frame. To do that they rely on older, specialised silicon sensors with very high dynamic range — those sensors run hot and have to be liquid-cooled to stay within spec near the arc. Rocket Retina takes a different path: an industry-grade CMOS sensor captures two sequential exposures, and modified optics, integrated illumination, and software running on the Neuron Mini merge them into one HDR image. The heavy lifting is done off-camera, so the camera body stays thermally lean — no fans, no pumps, no cooling liquids.

The practical payoff: fewer failure modes (no pump to fail, no coolant to leak or go stagnant), nothing hydraulic to route into the cell, and maintenance reduces to swapping a protective window.

Automation Platform — Rocket Neuron

More on the welding-automation page →

Which Neuron tier do I need — Mini or full?
It depends on which mode you're running. In pre-scanning mode, the path is calculated before the main welding pass: a single scan of the part identifies start / mid / end keypoints, and the full trajectory is computed from them. Neuron Mini (3 Hz, ~450 ms end-to-end) is the right fit, and that covers around 90% of the jobs we see. In real-time correction mode, the torch is continuously corrected in the plane perpendicular to the motion so it keeps following the seam; for that you need the full Neuron (25 Hz, 150 ms, 100+ TOPS). Typical real-time cases: material bending under heat as the weld progresses, or a long gantry arm that deflects and has to track the seam continuously — 3 Hz can't close the correction loop fast enough for either. Rule of thumb: Mini for scan-then-weld, full Neuron for correct-while-welding.
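
The rule of thumb can be made concrete with a back-of-envelope check. The numbers below are illustrative assumptions, not vendor specs: pick a travel speed, a lateral drift rate, and a tolerance, and see which update rate can hold the seam:

```python
# Illustrative loop-rate check — all numbers are assumptions for the example.
def max_drift_mm(travel_speed_mm_s: float, drift_rate: float,
                 update_hz: float) -> float:
    """Worst-case lateral drift between two corrections, assuming the seam
    drifts sideways `drift_rate` mm per mm of travel."""
    travel_per_update_mm = travel_speed_mm_s / update_hz
    return travel_per_update_mm * drift_rate

# Example: 20 mm/s travel, seam drifting 0.1 mm sideways per mm welded,
# lateral tolerance 0.2 mm.
drift_3hz = max_drift_mm(20.0, 0.1, 3.0)    # ~0.67 mm — outside tolerance
drift_25hz = max_drift_mm(20.0, 0.1, 25.0)  # 0.08 mm — within tolerance
```

Under these assumed conditions a 3 Hz loop lets the torch wander well past a 0.2 mm tolerance before the next correction arrives, while 25 Hz stays comfortably inside it — which is the whole case for the faster tier in correct-while-welding jobs.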

Is the platform only for welding?
No. Welding is our first and deepest vertical, but the platform is general-purpose. Any workflow that involves a camera, a vision model, and industrial hardware (robot, PLC, or both) can run on Rocket Welder. Quality inspection, measurement, assembly verification — the same platform, same tools.

What exactly is the Neuron hardware?
A compact edge computer with a GPU that runs the Rocket Welder platform. Mounts on a DIN rail at the machine. Industrial-grade and capable of air-gapped operation. An NVIDIA GPU is available for high-performance vision inference at 25+ FPS. Camera input, vision processing, your program, robot and PLC control — all run on this one device.

Which robots are supported?
ABB IRB series via EGM (250 Hz feedback) and Fairino cobots (direct TCP control). Both are accessed through a unified IRobot interface — swap robot brands without rewriting your program. Support for Universal Robots and KUKA is planned.

What skills does my team need?
You need to know your process — welding rules, inspection criteria, measurement tolerances. The AI assistant helps you turn those rules into a working program. Under the hood it's C#, but you describe what should happen in plain language. The assistant writes the code; you review and adjust.

How does the camera-to-robot pipeline work?
The camera captures frames and the vision model detects keypoints and geometry; with a laser range detector, our SDK returns keypoints directly in the robot's coordinate system — no manual calibration math needed. Your program reads 3D points it can send straight to the robot. Two modes: path planning (scan the part, get keypoints in robot coordinates, compute the full path, execute) and real-time correction (the vision model and your program run every frame while the robot moves, pushing corrections continuously). For ABB robots via EGM, corrections run at 250 Hz.
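
The scan-then-weld mode can be sketched as follows, assuming the keypoints are already transformed into the robot frame. `plan_path` and the point format are hypothetical illustrations, not the actual SDK:

```python
# Hypothetical sketch of path planning from scanned keypoints — not the real SDK.
from typing import List, Tuple

Point = Tuple[float, float, float]  # x, y, z in the robot frame, mm

def plan_path(keypoints: List[Point], step_mm: float = 1.0) -> List[Point]:
    """Densify start/mid/end keypoints into a linear path the robot can execute."""
    path: List[Point] = [keypoints[0]]
    for a, b in zip(keypoints, keypoints[1:]):
        length = sum((bi - ai) ** 2 for ai, bi in zip(a, b)) ** 0.5
        steps = max(1, int(length / step_mm))
        for i in range(1, steps + 1):
            t = i / steps
            # linear interpolation between consecutive keypoints
            path.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return path
```

A real planner would also carry torch orientation and blend corners, but the shape of the workflow is the same: keypoints in, executable trajectory out.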

Can it run fully offline?
Yes — at runtime. Everything runs on the local Neuron edge device: vision processing, recording, robot and PLC control, all on-premises, no cloud required. The only piece that currently needs an internet connection is programming — the AI-assisted code generation in the IDE. Once your program is deployed, the shop-floor stack runs fully offline. Updates can be pushed via USB or local network.