This is more 'bias as a service' than a particular instance of bias: a company called Faception – 'faception is a facial personality analytics technology company' – is based on the idea of physiognomy, the notion that a person's facial features can be indicative of their character and personality.

I would like to believe that this is a fake company (maybe this week's tno theme has me being extra skeptical) – these are some of the classifiers they advertise on their website:

This page has since been updated on their website – the caricatures have been removed.

This idea is not so far off from this report from earlier this year – "meet the professor who says facial recognition can tell if you're gay".

I was thinking about the connection between machine bias and stereotyping, and wondering how many machine learning / predictive models would hold up if they had to accurately represent the whole population. Wouldn't the model have to lose so much detail in order not to be biased that it would become almost useless?
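That trade-off can be sketched with a toy example. This is purely illustrative, with invented numbers and no real data: imagine a feature that genuinely predicts some outcome, but more reliably for one group than another. A model that uses the feature is more accurate overall, yet its error rate differs sharply between groups; a "debiased" model that ignores the feature treats the groups equally, but loses most of its predictive power.

```python
import random

random.seed(0)

def make_person(group):
    """Generate one synthetic person: (group, signal, label).
    The signal predicts the label 90% of the time for group A,
    but only 60% of the time for group B (invented numbers)."""
    signal = random.random() < 0.5
    fidelity = 0.9 if group == "A" else 0.6
    label = signal if random.random() < fidelity else (not signal)
    return group, signal, label

population = [make_person("A") for _ in range(5000)] + \
             [make_person("B") for _ in range(5000)]

def accuracy(predict, subset=population):
    return sum(predict(g, s) == y for g, s, y in subset) / len(subset)

# Model 1: use the signal directly.
signal_model = lambda g, s: s
# Model 2: ignore the signal entirely (always predict True) —
# "unbiased" in the sense of treating everyone identically.
blind_model = lambda g, s: True

group_a = [p for p in population if p[0] == "A"]
group_b = [p for p in population if p[0] == "B"]

print(f"signal model overall: {accuracy(signal_model):.2f}")   # ~0.75
print(f"  on group A:         {accuracy(signal_model, group_a):.2f}")  # ~0.90
print(f"  on group B:         {accuracy(signal_model, group_b):.2f}")  # ~0.60
print(f"blind model overall:  {accuracy(blind_model):.2f}")    # ~0.50, a coin flip
```

The detailed model is both more useful and more unequal; stripping out the detail equalizes the groups but leaves the model barely better than guessing. Real fairness interventions are far more nuanced than "ignore the feature", but the tension is the same one I was wondering about.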