Here's a number: 200. That's how many images I shoot on an average product day. Some days it's 400. Headshot days, it's 300+.
And here's the part of the job nobody talks about: after the shoot, you sit down and go through every single one of them. Is this sharp? Is that one slightly soft? Did the client blink? Is this a duplicate of the one before it? Is the exposure clipped?
That's not creative work. That's data entry. And it takes 30-45 minutes on a light day, hours on a heavy one.
I built Smart Cull because I was tired of doing it.
What it actually does
Blur detection: Analyzes every image for motion blur, camera shake, and soft focus. Not a binary "is this blurry?" — it scores each image's sharpness on a 0-100 scale. A portrait with tack-sharp eyes at 95 vs. one that's slightly soft at 68. You see the number, you make the call.
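Smart Cull's exact metric isn't published, but the classic way to turn "how sharp?" into a number is the variance of the Laplacian: a sharp image has strong edge responses, a soft one doesn't. A minimal sketch of that idea, with a log mapping onto a 0-100 scale (the kernel is standard; the `lo`/`hi` calibration bounds are illustrative guesses, not Smart Cull's values):

```python
import numpy as np

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=np.float64)

def laplacian_variance(gray):
    """Variance of the Laplacian response: higher means stronger edges."""
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):          # tiny hand-rolled 3x3 convolution
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * gray[dy:dy + h - 2, dx:dx + w - 2]
    return float(out.var())

def sharpness_score(gray, lo=10.0, hi=1500.0):
    """Map raw variance onto 0-100 (lo/hi bounds are illustrative)."""
    v = laplacian_variance(gray.astype(np.float64))
    return float(np.clip(100.0 * (np.log(max(v, lo)) - np.log(lo))
                         / (np.log(hi) - np.log(lo)), 0.0, 100.0))
```

In a real pipeline you'd compute this with `cv2.Laplacian` rather than a Python loop, but the score-not-verdict shape is the same.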
Face focus scoring: Uses MediaPipe (Google's face detection AI) to find faces and score eye sharpness independently. Because a landscape might be sharp overall, but a portrait with soft eyes is unusable. Smart Cull knows the difference.
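MediaPipe's face landmarker gives you per-landmark pixel coordinates; once you have an eye's bounding box, the eye-focus step reduces to scoring sharpness inside that crop instead of across the whole frame. A sketch of that second half, assuming the landmarks have already been converted to a pixel box (MediaPipe itself is not invoked here, and the `eye_floor` cutoff is an illustrative guess):

```python
import numpy as np

def tenengrad(gray):
    """Mean squared gradient magnitude: higher = sharper."""
    g = gray.astype(np.float64)
    gx = np.diff(g, axis=1)
    gy = np.diff(g, axis=0)
    return float((gx ** 2).mean() + (gy ** 2).mean())

def eye_sharpness(gray, eye_box):
    """Score sharpness inside one eye's bounding box (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = eye_box
    return tenengrad(gray[y0:y1, x0:x1])

def portrait_ok(left_eye_score, right_eye_score, eye_floor=50.0):
    """A portrait only passes if BOTH eyes clear the floor, no matter
    how sharp the frame is overall (eye_floor is illustrative)."""
    return min(left_eye_score, right_eye_score) >= eye_floor
```

That independence is the point: the frame score and the eye score can disagree, and for portraits the eye score wins.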
Exposure analysis: Flags clipped highlights and crushed shadows. Not just "overexposed" — it tells you the percentage of blown pixels and whether it's recoverable. Because a tiny specular highlight is fine; a blown sky is not.
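The "percentage, not verdict" idea is simple histogram counting: how many pixels sit at the ends of the range. A minimal 8-bit sketch — the clip levels and the specular-vs-blown cutoffs below are illustrative guesses, not Smart Cull's thresholds:

```python
import numpy as np

def exposure_flags(gray, high=250, low=5, specular_pct=0.5, blown_pct=4.0):
    """Return (% blown highlights, % crushed shadows, verdict).
    All thresholds are illustrative, not Smart Cull's published values."""
    n = gray.size
    blown = 100.0 * np.count_nonzero(gray >= high) / n
    crushed = 100.0 * np.count_nonzero(gray <= low) / n
    if blown <= specular_pct:
        verdict = "ok"           # a tiny specular highlight is fine
    elif blown <= blown_pct:
        verdict = "recoverable"  # raw headroom may save it
    else:
        verdict = "blown"        # a large blown region, e.g. the sky
    return blown, crushed, verdict
```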
Duplicate detection: Groups near-identical shots using perceptual hashing. When you shoot 8 frames of the same product at the same angle, Smart Cull groups them and highlights the sharpest one. You're not comparing 8 images — you're confirming one pick.
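Perceptual hashing boils each image down to a few dozen bits that survive small exposure and framing changes; near-identical frames then have hashes a few bits apart. A sketch using a difference hash (dHash) with a crude index-based downscale and greedy grouping — the 6-bit distance threshold is an illustrative guess:

```python
import numpy as np

def dhash(gray, size=8):
    """Difference hash: downscale, then compare each pixel to its
    right-hand neighbor. Returns a flat array of size*size bits."""
    h, w = gray.shape
    ys = np.arange(size) * h // size
    xs = np.arange(size + 1) * w // (size + 1)
    small = gray[np.ix_(ys, xs)].astype(np.int32)  # crude nearest-neighbor resize
    return (small[:, 1:] > small[:, :-1]).flatten()

def hamming(a, b):
    return int(np.count_nonzero(a != b))

def group_duplicates(hashes, max_dist=6):
    """Greedy grouping: each image joins the first group whose leader
    is within max_dist bits (threshold is illustrative)."""
    groups = []
    for i, h in enumerate(hashes):
        for g in groups:
            if hamming(hashes[g[0]], h) <= max_dist:
                g.append(i)
                break
        else:
            groups.append([i])
    return groups
```

Within each group you'd then sort by the sharpness score and surface the best frame, so you're confirming one pick instead of comparing eight.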
Blink detection: Yes, it catches blinks. On headshot days, this alone saves me 15 minutes.
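The standard trick for blink detection from face landmarks is the eye aspect ratio (EAR): the eye's height relative to its width, computed from six landmarks around the eyelid. An open eye sits around 0.25-0.3; a closed one collapses toward zero. A sketch, assuming you already have the six (x, y) landmark points (the 0.2 threshold is the commonly cited value, not necessarily Smart Cull's):

```python
import math

def ear(landmarks):
    """Eye aspect ratio from six eyelid landmarks p1..p6:
    (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)."""
    p1, p2, p3, p4, p5, p6 = landmarks
    return (math.dist(p2, p6) + math.dist(p3, p5)) / (2.0 * math.dist(p1, p4))

def is_blink(landmarks, threshold=0.2):
    return ear(landmarks) < threshold
```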
How it works in practice
You shoot. Between setups — while you're adjusting a light, swapping a product, getting coffee — you hit Cmd+Shift+A. Smart Cull starts analyzing in the background.
When you come back to the Mac, every image has a badge:
- Green (Keep): Sharp, well-exposed, not a duplicate
- Yellow (Review): Might be soft, might be a duplicate, worth a second look
- Red (Reject): Blurry, blinked, badly exposed, or a worse duplicate
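The badge logic above is just the per-image signals folded into one decision. A sketch of how that fold might look — every cutoff here is an illustrative guess, not Smart Cull's published thresholds:

```python
def badge(sharpness, blown_pct, blinked, dup_rank=None):
    """Map per-image signals to a badge (all cutoffs illustrative).
    dup_rank: None = unique shot, 0 = sharpest of its duplicate group,
    1 or more = a worse duplicate."""
    if blinked or sharpness < 40 or blown_pct > 8:
        return "red"      # blurry, blinked, or badly exposed
    if dup_rank is not None and dup_rank >= 1:
        return "red"      # a worse duplicate of a kept frame
    if sharpness < 70 or blown_pct > 2 or dup_rank == 0:
        return "yellow"   # might be soft or a duplicate: second look
    return "green"        # sharp, well-exposed, not a duplicate
```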
You scan the yellows, confirm the greens, and you're done. A 200-image cull that used to take 30 minutes takes 3.
The important part: it runs on YOUR Mac
Smart Cull uses OpenCV and MediaPipe. Both run locally on your Mac. Your images never leave your computer. There's no cloud upload, no API call, no subscription to an AI service.
On Apple Silicon (M1/M2/M3/M4), it processes about 0.5 seconds per image. A 200-image set takes under 2 minutes. On Intel, it's 1.5-2.5 seconds per image — slower but still way faster than doing it by hand.
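The arithmetic checks out: at the per-image timings above, a set's analysis time is just count times seconds. A trivial estimator (the function name is mine, not part of Smart Cull):

```python
def cull_eta(num_images, sec_per_image):
    """Rough total analysis time for a set, as 'Xm Ys'."""
    total = num_images * sec_per_image
    return f"{int(total // 60)}m {int(total % 60)}s"
```

200 images at 0.5 s each comes to 1m 40s on Apple Silicon, and 6m 40s at the Intel worst case of 2 s per image.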
No monthly AI fee. No per-image charge. No cloud. The AI runs on the same Mac you're tethering to. It's included in the $99.
Why this matters for your workflow
Time is money. Literally. If you charge $250/hour, the 30 minutes Smart Cull saves per shoot is worth $125, so it pays for itself on the first shoot. After that, every shoot puts money back in your pocket.
But honestly, it's not about the money. It's about not spending your creative energy on the most tedious part of the job. Cull faster, edit sooner, deliver quicker, get paid, go home. That's the whole point.