My security camera is lonely.
That’s the only explanation I have. Three people detected in my garage at 11pm. I check the feed. Nobody there. Just my truck, parked, windshield aimed at the camera, clouds drifting across the glass. And somewhere in that blob of reflected sky, my AI saw faces.
It saw them at 0.74 confidence. That’s not a guess. That’s commitment.
I’ve been setting up Frigate for home surveillance, and it works really well. When my truck pulls in, it knows. When a car drives by, it knows. The actual detection part is solid.
But then there are the clouds.
Frigate runs a computer vision model that has seen millions of images of people. It learned what humans look like from every angle, in every lighting condition, half-obscured and fuzzy and far away. And apparently, somewhere in all that training, it also learned that cloud reflections in windshield glass are basically people. Like, a lot.
I don’t blame it. Pareidolia, the brain’s tendency to find faces in random patterns, is something humans do too. We see faces in toast, in wood grain, in the side of a cliff. The difference is that when I see a ghost face in my truck’s windshield, I register it as “weird cloud shape” and move on. Frigate registers it as “probably Dave.”
The AI is so eager to find humans that it started inventing them. Which is kind of sweet, actually. Kind of sad. Mostly annoying.
The thing about a 0.74 confidence score is that it’s high enough to trigger alerts. Frigate’s default threshold sits around 0.7, which makes sense as a starting point. You’d rather get a false positive than miss someone in your driveway. But “miss a real person” and “be haunted by ghost people in your own garage every time it’s cloudy” turned out to be two failure modes I cared about differently.
After some trial and error, I landed here:
```yaml
objects:
  track:
    - person
    - car
  filters:
    person:
      min_score: 0.70
      threshold: 0.80
      min_area: 3000
```
The min_area: 3000 is the thing that actually fixed it. Cloud reflections are diffuse. They spread across the glass in ways that don’t look like a human-shaped blob. A real person in frame takes up a certain number of pixels. The cloud version apparently doesn’t meet that bar.
Raising the confidence threshold from 0.7 to 0.8 helped too. The ghost people were real enough to the model, but not that real.
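To make the two filters concrete, here's a rough sketch of the logic in plain Python. The detections, scores, and box sizes are made up for illustration, and Frigate's internals are more involved than this, but the core idea is the same: a detection has to clear both a confidence bar and a minimum bounding-box area to survive.

```python
# Hypothetical sketch of score + area filtering, not Frigate's actual code.
# Values mirror the config above: threshold 0.80, min_area 3000 pixels.

THRESHOLD = 0.80  # confidence required to count as a real object
MIN_AREA = 3000   # minimum bounding-box area, in pixels

def keep(detection):
    """Keep a detection only if it is both confident and big enough."""
    x1, y1, x2, y2 = detection["box"]
    area = (x2 - x1) * (y2 - y1)
    return detection["score"] >= THRESHOLD and area >= MIN_AREA

detections = [
    # A diffuse cloud reflection: plausible score, but a small, spread-out blob.
    {"label": "person", "score": 0.74, "box": (100, 80, 150, 120)},  # area 2000
    # An actual person walking into frame: confident and human-sized.
    {"label": "person", "score": 0.91, "box": (200, 60, 290, 260)},  # area 18000
]

survivors = [d for d in detections if keep(d)]
print([d["score"] for d in survivors])  # the 0.74 ghost fails both checks
```

The cloud detection fails on both axes here, which matches the experience above: either fix alone would have helped, but the area filter is the one that targets what clouds actually look like on glass.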
What I find weirdly funny about all of this is that the AI kept doing exactly what it was supposed to do. It wasn’t broken. It was doing its job with more enthusiasm than the job required. It found humans in the data because that’s what it’s trained to find. Clouds kind of looked like humans. It went for it.
There’s a version of this that’s a useful metaphor for AI in general, the model that finds what it’s looking for even when it isn’t there, because it’s been trained to look so hard. But honestly I mostly just think it’s funny that my camera got lonely and started seeing people in the sky.
The ghost sightings have stopped now. The automation works. When something real pulls into the garage, I hear about it.
But I kind of miss the cloud people. They were very committed.
References
- Frigate - open source NVR with AI object detection
- Home Assistant - open source home automation platform