FABTECH 2025

Val Kamenski

I just got back from Chicago, where FABTECH 2025 wrapped up at McCormick Place. Nearly 50,000 people and 1,700 exhibitors were there. Every hall was packed with massive robots, heavy machines, and plenty of talk about where manufacturing is heading.

A major focus of the expo was production automation — aimed especially at easing workforce challenges and improving efficiency.

What I Observed on the Floor

  • Welding robots and cobots were everywhere.
  • The shortage of skilled welders in the U.S. keeps getting worse.
  • Sanding and painting robots were also widely showcased.
  • Universal Robots and FANUC robotic manipulators were used in most demos.
  • People talked a lot about manufacturing coming back to the U.S. and the need for machines that can run 24/7.
  • While robots are expected to work non-stop, only a few companies are thinking about predictive maintenance and anomaly detection.
  • Buyer question #1: “Where is this robot manufactured?”
  • Companies like FANUC, Path Robotics, Productive Robotics, Standard Bots, ABAGY, and Teqram are using vision systems, while most others still rely on rigid, pre-programmed path-following algorithms.
  • Only a handful of companies specialize in quality control, whereas FANUC and Path Robotics include it as part of their vision systems.
  • Moving forward, welding cobots will need a more elegant solution to demonstrate a welding path. Today, physically moving the manipulator is still arduous.
  • Every robotics company pushes its own proprietary software, while real customers complain about the lack of integration across this fragmented ecosystem.
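
Since so few vendors address predictive maintenance, it is worth noting how simple a first pass can be. Here is a minimal rolling-statistics sketch in Python that flags outliers in motor telemetry; the window size, threshold, and simulated current values are all illustrative assumptions, not anything a vendor showed at the expo.

```python
from collections import deque
import math

def make_anomaly_detector(window=50, threshold=3.0):
    """Flag readings that deviate from the recent rolling mean.

    `window` and `threshold` are illustrative parameters, not tied
    to any specific product.
    """
    history = deque(maxlen=window)

    def check(reading: float) -> bool:
        if len(history) >= window:
            mean = sum(history) / len(history)
            var = sum((x - mean) ** 2 for x in history) / len(history)
            std = math.sqrt(var)
            anomalous = std > 0 and abs(reading - mean) > threshold * std
        else:
            anomalous = False  # not enough data yet to judge
        history.append(reading)
        return anomalous

    return check

# Simulated motor-current telemetry: steady ~2.0 A, then a spike.
check = make_anomaly_detector(window=20, threshold=3.0)
readings = [2.0 + 0.01 * (i % 5) for i in range(40)] + [5.0]
flags = [check(r) for r in readings]
```

A real deployment would use per-joint baselines and far more signal processing, but even this level of monitoring would catch gross failures on a robot that is supposed to run 24/7.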

Tariffs and Reshoring

Tariffs have shifted the equation, but by how much? Do U.S. robotics makers now have enough cost and competitive advantage to reshore production, or do Chinese and other global vendors still provide stronger cost/performance for most buyers?

It’s not something anyone can calculate with certainty today — we’ll see how it plays out over the next few years.

Do Robots Need to Be “Highly Intelligent”?

One thing I kept asking myself at FABTECH was whether robots really need to be “smart” to get the job done.

In areas like welding and painting, I noticed two clear trends:

  • Most robots still need to be taught manually for every new type of part. Moving the robot arm to show it the path looks simple, but if you’re changing part types several times a day, it becomes tedious. Parts also have to be placed in exactly the same position every time, because the robot just follows fixed coordinates without any real intelligence. I heard a few people on the floor complain about this, and about needing a manual welding machine standing by.

  • The newer generation of robots uses cameras and vision systems to recognize parts, identify where the welds or paint lines need to go, and adapt automatically. These don’t require the operator to carefully reposition each part, which makes them much easier to work with.
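
The gap between the two trends above comes down to one transform. A fixed-coordinate robot replays a taught path verbatim, so the part must sit in exactly the taught pose; a vision-equipped cell estimates the part’s actual pose and maps the taught path into it. Here is a minimal Python sketch of that idea, where the detected pose (offset and rotation) is supplied directly instead of coming from a camera; the path coordinates are made up for illustration.

```python
import math

# Weld path taught once, in the part's own coordinate frame (mm).
TAUGHT_PATH = [(0.0, 0.0), (50.0, 0.0), (50.0, 30.0)]

def transform_path(path, dx, dy, theta):
    """Map a taught path into the workcell frame, given the part's
    detected pose: offset (dx, dy) and rotation theta in radians.

    A vision system would estimate (dx, dy, theta) from an image;
    here the pose is hard-coded for illustration.
    """
    c, s = math.cos(theta), math.sin(theta)
    return [(dx + c * x - s * y, dy + s * x + c * y) for x, y in path]

# A fixed-coordinate robot replays TAUGHT_PATH verbatim, so the part
# must sit at exactly (0, 0) with zero rotation. A vision-equipped
# cell can weld the same part even if the fixture places it elsewhere:
shifted = transform_path(TAUGHT_PATH, dx=10.0, dy=-5.0, theta=0.0)
```

Real systems work in 3D with full 6-DOF poses, but the principle is the same: the “intelligence” is largely the ability to re-anchor a known path to where the part actually is.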

So who really needs the extra intelligence?

  • If you’re a large manufacturer running a high-volume line of identical parts, you can probably save money with simpler robots. Training them once every few months is manageable, and consistency matters more than flexibility.
  • If you’re a smaller shop working with a lot of different parts, smarter robots with vision make much more sense. They cut down on repetitive training and reduce the risk of errors when parts aren’t perfectly aligned.

However, from what I overheard in real conversations at the expo, many customers were happy with just small improvements in cobot training. For example, one company has been advertising a new algorithm that reduces the number of welding points the operator has to show the robot, which is simply an improvement to a rigid training algorithm. But even that incremental step feels like a breakthrough to them, and they’re ready to pay for it.
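
To give a sense of what “fewer taught points” can mean in practice, here is a generic Python sketch that expands a few operator-taught keypoints into a dense weld path by linear interpolation, so the operator only demonstrates the corners of each seam. This is my own illustration of the general idea, not the vendor’s actual algorithm; the coordinates and spacing are invented.

```python
import math

def densify_path(keypoints, spacing):
    """Expand a few operator-taught keypoints into a dense weld path
    by linear interpolation along each segment.

    `spacing` is the desired distance between generated points (mm).
    A generic sketch, not any specific vendor's algorithm.
    """
    path = [keypoints[0]]
    for (x0, y0), (x1, y1) in zip(keypoints, keypoints[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, round(seg / spacing))
        for i in range(1, steps + 1):
            t = i / steps
            path.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return path

# Three taught corners expand into a path with ~10 mm spacing.
corners = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]
dense = densify_path(corners, spacing=10.0)
```

Teaching three points instead of sixteen is exactly the kind of incremental win that, judging by the floor conversations, shops are willing to pay for.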

What I Didn’t See

I didn’t see much “general robotics” or factory-wide autonomy. No humanoid robots walking a shop floor, picking a part, and welding or painting it end-to-end. Even advanced cells that use cameras often rely on labels to anchor coordinates. Imitation learning looks promising, but for most companies and real use cases this is still in the distant future.

Conclusion

This year felt centered on “deploy-today” automation with modest intelligence. The next step is making robots easier to teach and easier to integrate. If those boxes get checked, adoption will accelerate.

Here is a bonus video that shows the overall atmosphere on the expo floor.