Are you installing racist security systems?
Yikes, that's a pretty inflammatory headline I just wrote there, but it's something to consider after watching the following video, which posits that HP makes racist computers because its new facial-tracking software appears not to follow a black man's face, but has no problem with a white woman's. You have to watch this, if only because these two folks are hilarious.

My favorite line? "And the worst part is that I bought one for Christmas!" Classic, classic stuff, and the reason the Internet is the best thing since sliced bread.

But the video does raise a valid point: this is essentially facial-recognition software, not recognizing a specific face, but detecting that a face is in the frame at all. I remember not long ago when Dynapel (now called Nio) was touting this very ability as a major accomplishment of its cameras.

Video analytics continue to be buffeted by claims that they don't work, to the point where companies are now distancing themselves from the term even though analytics are exactly what they sell. This video is just another signal to the public that the technology isn't ready for prime time.

Whether this guy's complaint is valid, or whether it's just a single faulty computer or a particularly bad lighting set-up, doesn't really matter. What the public perceives to be true is ultimately what affects the market, and this thing is thoroughly viral at this point. The public is going to think HP computers are "racist," whether HP likes it or not.

What does the "public" think about video analytics, and why?