
The Hidden Challenges of AI Cameras
What You Need to Know
At first glance, AI-powered human recognition cameras appear to be the perfect solution for forklift safety. Manufacturers promise near-flawless performance, each claiming to offer the most advanced technology available. But as with any AI product, the real question is: does it actually deliver where it matters most—in your high-risk, real-world environment?
Cameras have a role, but also real limits.
Cameras can be valuable safety tools. Reviewing video of near-misses supports better training, investigations, and policy decisions. But using a camera for real-time detection of pedestrians in dynamic workplace environments is far from simple. It’s the same reason self-driving vehicles remain a work in progress—even with their array of sophisticated onboard sensors.
Where an AI camera is genuinely the only option at a workplace, it should of course be used, because protecting people is the highest priority. But don’t assume that just because it’s AI, it’s foolproof. At SEEN, we regularly hear from customers who have tried these cameras, only to be disillusioned by false alerts and missed detections. In safety-critical zones, like behind a reversing forklift, that’s a risk no one can afford.
1. Hardware matters
Some AI cameras we’ve seen used on forklifts can be purchased on AliExpress for a few hundred dollars, while other high-end stereo AI cameras can cost $8,000+. Performance tends to reflect that gap, and given that even premium systems face challenges, it’s worth asking: should you trust a bargain-basement camera to do a safety-critical job?
2. It’s not true AI
So-called “AI” cameras often rely on pre-trained models that don’t learn or adapt to your unique environment, which makes them only as good as the data they were trained on. If that data doesn’t represent your particular operating conditions, the system’s accuracy will suffer.
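One practical consequence: before relying on any pre-trained detector, it is worth measuring how often it actually finds people in footage from your own site. The sketch below is illustrative only; detect_people is a hypothetical stand-in for whatever model a camera vendor ships, and the script simply reports the share of known-pedestrian frames in which it finds anyone at all.

```python
# Sketch: spot-check a pre-trained person detector on frames from your own site
# before trusting it. `detect_people` is a hypothetical stand-in for whatever
# model the camera vendor ships; here it is stubbed out so the script runs.

from pathlib import Path

def detect_people(image_path: Path) -> list:
    """Stub for the vendor's pre-trained detection model (returns no detections)."""
    return []

def site_recall(labelled_dir: Path) -> float:
    """Fraction of known-pedestrian frames in which the model finds at least one person."""
    images = sorted(labelled_dir.glob("*.jpg"))
    hits = sum(1 for img in images if detect_people(img))
    return hits / len(images) if images else 0.0

if __name__ == "__main__":
    # Frames pulled from your own footage, each known to contain a pedestrian.
    recall = site_recall(Path("site_frames_with_pedestrians"))
    print(f"Detected a person in {recall:.0%} of known-pedestrian frames")
```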
3. Vulnerable to environmental changes
Environmental factors such as dust, rain, cold, condensation, poor lighting, or complex backgrounds can degrade performance significantly. One top-tier OEM warns its AI camera ‘may not detect pedestrians if they’re carrying objects, standing near walls, standing in groups, or wearing clothing that blends into the background.’
4. Distance perception
Even advanced cameras with stereo depth perception can struggle when a person is partially obscured, sitting, crouching, or standing too close to other objects. False positives and missed detections become even more likely in camera systems without distance perception features.
5. Data security
AI cameras in the workplace can pose cybersecurity risks, including weak encryption, unauthorized data access, and potential exposure of sensitive footage. These devices may also serve as entry points for broader cyberattacks, making regular security reviews and careful vendor selection essential.
The AI Detection Dilemma
AI cameras walk a fine line. If they’re too sensitive, they trigger frequent false alarms; if they’re too conservative, they miss real threats. This balance between over-alerting and under-detecting is a fundamental limitation of human-form recognition cameras.
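The trade-off is easy to make concrete. The sketch below uses made-up confidence scores, not real camera output, but it shows how moving a single alert threshold shifts the failure mode from false alarms to missed pedestrians:

```python
# Toy illustration of the alerting trade-off: one confidence threshold,
# two failure modes. All scores are invented examples, not real camera output.

pedestrian_scores = [0.92, 0.81, 0.45, 0.38]   # true pedestrians (some scored low, e.g. crouching)
clutter_scores    = [0.45, 0.30, 0.45, 0.62]   # racking, boxes, reflections

def alert_counts(threshold: float) -> tuple[int, int]:
    """Return (missed pedestrians, false alarms) at a given alert threshold."""
    missed = sum(1 for s in pedestrian_scores if s < threshold)
    false_alarms = sum(1 for s in clutter_scores if s >= threshold)
    return missed, false_alarms

for t in (0.3, 0.5, 0.7):
    missed, false_alarms = alert_counts(t)
    print(f"threshold={t:.1f}  missed={missed}  false_alarms={false_alarms}")
```

Lowering the threshold catches every pedestrian but alerts on clutter; raising it silences the nuisance alarms but starts missing real people. There is no setting that eliminates both failure modes.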
A typical AI camera user-guide limitation notice might read something like this:
There are limitations to the recognition performance of the system. To perform effectively, the camera lenses must remain clean and unobstructed. Dirt, dust, or debris on the lens can impair image quality and reduce detection accuracy. Regular inspection and cleaning of the sensor area is essential. The sensor may not always be able to recognize people as pedestrians in the following situations:
Sitting, crouching, running, standing in groups, lying down, or holding objects.
Partially obscured by objects such as boxes, barriers, or shelving.
Standing close to a wall or wearing clothing that blends in with the background.
In poorly lit, high-contrast, or strongly backlit environments.
In reduced visibility caused by fog, steam, smoke, or rain.
With dirty or wet lenses.
In strong glare or reflections.
The system may sometimes mistake non-human objects for pedestrians. This is more likely in situations with poor image quality, such as:
Dirty or wet lenses
Poor lighting
Strong glare or reflections
Low visibility due to environmental conditions.
A Smarter Approach to Pedestrian Detection
For health and safety managers, the message is clear: think carefully about how much reliance you place on AI cameras as the primary means of detection in safety-critical applications. A smarter solution pairs active sensing, which delivers reliable real-time detection, with camera-based imaging that captures visual evidence and supports AI-powered analysis of safety trends and near-miss events.
That’s exactly the approach we’ve taken at SEEN. Our patented IRIS 860 sensor uses infrared LIDAR technology to actively detect retroreflective tape on standard PPE, delivering precise, reliable, and tagless detection, even in challenging environments, and without the limitations of passive camera-only systems.
At the same time, the SEEN System captures and uploads AI-classified images of near-miss events, providing actionable insight into workplace risks, so you can make informed safety improvements, backed by real data.
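As a purely illustrative sketch (not the SEEN implementation), the division of labour described above might look like the code below, where ActiveSensor and Camera are hypothetical stand-ins for real hardware: the active sensor is the safety-critical trigger, while the camera only records evidence for later analysis.

```python
# Purely illustrative sketch of the split responsibility described above.
# `ActiveSensor` and `Camera` are hypothetical stand-ins, not any vendor's API:
# alerts come from active sensing; imagery is kept for after-the-fact review.

import time

class ActiveSensor:
    """Stand-in for an active sensor that reports pedestrian detections directly."""
    def pedestrian_detected(self) -> bool:
        return False  # stub: replace with a real sensor read

class Camera:
    """Stand-in for a camera used to capture evidence, not to raise alerts."""
    def capture_frame(self) -> bytes:
        return b""    # stub: replace with a real frame grab

def trigger_alarm() -> None:
    print("ALERT: pedestrian in the detection zone")

def monitor_once(sensor: ActiveSensor, camera: Camera, event_log: list) -> None:
    """One pass of the loop: alert from the sensor, log imagery for later analysis."""
    if sensor.pedestrian_detected():
        trigger_alarm()  # real-time response comes from active sensing
        event_log.append((time.time(), camera.capture_frame()))  # evidence only

if __name__ == "__main__":
    events: list = []
    monitor_once(ActiveSensor(), Camera(), events)
```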