Seeing Beyond the Stitch: A Journey to Build the Right Vision System

How we tackled lighting, detection, and integration challenges to build a resilient, real-world machine vision system

When the customer came to us, their request sounded straightforward — detect the cloth color, the stitch color, and some small plastic parts on car interior components. But there was a catch… or actually, several.

They’d already tried a few things. Color sensors didn’t work because the stitches were too thin and the part couldn’t be moved much during inspection. Line-scan cameras came next, but even those couldn’t deliver consistent results. Fixing the part in place helped a bit, but then a bigger problem showed up: lighting. The inspection area was open to daylight, so the amount and angle of light kept changing through the day and kept throwing the detection off.

Then came the real ask:
“We want a reliable system that works in any lighting. It should show results on an HMI, print a QR code linked to the detected model, and let us scan the code to verify it. And yes, the setup can’t be fully enclosed.”

So began our journey…

  • We started by testing the different samples they gave us. Our first idea was to measure color ratios from the images, but the ratios drifted as the light changed, and several models had overlapping colors, so that approach went nowhere. Adjusting the camera lens settings and applying image filters didn’t solve it either.
  • Then we moved to identifying stitches by matching them against specific color ranges (a simplified sketch of this matching step appears after this list). This worked better, until the lighting shifted again. That pushed us to tackle lighting itself. After testing a few options, we found that natural white light at around 4500 K brought out the red tones we needed and reduced the greys that confused the system. To block external light, we built a black stainless-steel enclosure, open only at the front, which kept conditions stable without fully closing off the setup.
  • With detection sorted, we focused on the user interface and integration. We connected the system to the customer’s PLC, added a QR-code printer and scanner, and showed all results (cloth, stitch, and part type) on an HMI and monitor. Sounds good, right? But we hit another roadblock: timing.
  • Our vision software was pushing results out faster than the PLC’s scan cycle could process them, causing communication errors; sometimes the PLC even froze. We fixed this by adding small delays, status checks, and error alerts on the HMI, so operators would know exactly what went wrong: a camera issue, a hardware fault, anything (a rough handshake sketch also follows this list). We also added a manual mode, enabled by a simple key switch, where operators could pick a model on the screen and print its QR code.
  • Once everything was in place, we tested the system on over 500 parts across different models. For two to three days, it ran non-stop. The results? Accurate. Reliable. Smooth.
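
For readers curious about the detection step, here is a minimal sketch of the color-range matching described above, written against OpenCV. The HSV bounds, model labels, region of interest, and pixel threshold are illustrative assumptions, not the tuned values from the production system.

```python
import cv2
import numpy as np

# Hypothetical per-model stitch color ranges as HSV (lower, upper) bounds.
STITCH_RANGES = {
    "model_a_red_stitch": (np.array([0, 120, 80]), np.array([10, 255, 255])),
    "model_b_grey_stitch": (np.array([0, 0, 90]), np.array([180, 40, 200])),
}

def classify_stitch(bgr_image, roi, min_pixels=500):
    """Return the best-matching stitch label inside the region of interest."""
    x, y, w, h = roi
    hsv = cv2.cvtColor(bgr_image[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    best_label, best_count = None, 0
    for label, (lower, upper) in STITCH_RANGES.items():
        mask = cv2.inRange(hsv, lower, upper)      # pixels that fall inside this range
        count = int(cv2.countNonZero(mask))
        if count > best_count:
            best_label, best_count = label, count
    # Reject weak matches so stray reflections do not produce false positives.
    return best_label if best_count >= min_pixels else None
```

Fixed ranges like these only stay reliable once the lighting is controlled; that is exactly why the 4500 K light and the front-open enclosure mattered as much as the matching itself.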
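
And here is a rough sketch of the kind of throttled handshake that resolved the timing issue. The `plc` and `hmi` objects, register addresses, and delay values are hypothetical placeholders standing in for the customer’s actual PLC protocol.

```python
import time

RESULT_REG = 100      # hypothetical register holding the detection result code
ACK_REG = 101         # hypothetical register the PLC sets once it has consumed the result
POLL_DELAY_S = 0.05   # short pause so the PLC scan cycle can keep up
ACK_TIMEOUT_S = 2.0

def send_result(plc, hmi, result_code):
    """Write one detection result and wait for the PLC to acknowledge it."""
    try:
        plc.write_register(RESULT_REG, result_code)
        deadline = time.monotonic() + ACK_TIMEOUT_S
        while time.monotonic() < deadline:
            if plc.read_register(ACK_REG) == 1:   # PLC confirms it has read the value
                plc.write_register(ACK_REG, 0)    # clear the flag for the next cycle
                return True
            time.sleep(POLL_DELAY_S)              # throttle polling instead of flooding the PLC
        hmi.show_error("PLC did not acknowledge result in time")
    except ConnectionError as exc:
        hmi.show_error(f"PLC communication fault: {exc}")
    return False
```

The key design choice is that the software never pushes a new result until the previous one is acknowledged or flagged on the HMI, which keeps a fast vision loop from overrunning a slower PLC scan cycle.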

The customer was thrilled:

“You’ve taken a major pain point off our hands. Now each part is identified clearly, and our quality has improved.”

This project wasn’t just about vision detection or QR codes. It was about solving real problems step by step — from tricky lighting to smart integration — and building a system that fit right into their workflow.

Every project teaches us something. This one reminded us that even the smallest stitch can lead to the biggest difference.