How Your Face’s Hidden Emotions May Dictate Whether You Get the Job or Not
Whipping up a quality resume is far from the only thing you need to worry about when getting ready for a job interview.
Today’s technologies have advanced to the point where it’s now possible to use emotion analytics to read facial reactions not visible to the everyday human eye.
It’s been suggested that using this software could assist HR reps in deciding whether you fit the job or not.
Imagine the impact it would have if a computer could read the involuntary facial reactions anyone has in a job interview. It’s a little twisted, isn’t it? Instead of judging you on the merits of your qualifications, your appearance, your accomplishments, or your conduct, some argue that emotion analytics software is a more accurate way of hiring in certain fields.
Emotion analytics software works by analyzing the micro-expressions our faces make without our awareness. These expressions happen too fast for the human eye to register, lasting as little as a twenty-fifth of a second. With this software, a person’s surface confidence can be seen through, revealing where a job prospect is actually uncertain or nervous.
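To make the timing concrete, here is a minimal, hypothetical sketch in Python of how frame-level analysis might flag such moments: because a micro-expression can last as little as a twenty-fifth of a second, the video has to be captured at 25 frames per second or faster, and every frame scored individually. The `score_nervousness` function is a stand-in for whatever trained facial-expression classifier a real product would use; only the OpenCV frame-reading calls are real.

```python
import cv2  # OpenCV, used here only to read frames from an interview recording


def score_nervousness(frame):
    """Stand-in for a real classifier: an actual product would run a trained
    facial-expression model here and return a score between 0.0 and 1.0."""
    return 0.0  # placeholder value so the sketch runs end to end


def flag_microexpressions(video_path, threshold=0.8):
    """Scan a video frame by frame and report timestamps where the
    (hypothetical) nervousness score spikes. Micro-expressions can last
    roughly 1/25 s, so the source video must be recorded at 25 fps or more
    for frame-level analysis to catch them at all."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if metadata is missing
    flagged = []
    frame_index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break  # end of video
        if score_nervousness(frame) >= threshold:
            flagged.append(frame_index / fps)  # timestamp in seconds
        frame_index += 1
    cap.release()
    return flagged
```

The point of the sketch is the sampling constraint, not the classifier: whatever model sits behind `score_nervousness`, the system only works because it examines individual frames faster than a person can consciously blink.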
If emotion analytics software sounds a little unkind in this context, that might be because it sort of is. Granted, there is an argument for its use in job applications, primarily because of the results it has produced in other areas. For example, emotion analytics software has been used in security settings to successfully identify threats within crowds of people. It has also been used in market research, providing valuable insight that would otherwise go unrecorded. So, given all this evidence, why does applying emotion analytics in HR still seem so wrong? Well, consider the following.
Any HR rep tasked with hiring is looking for someone who is a good fit for the business, and that goes beyond the qualifications on a resume. Part of the job interview is about assessing a person’s competency and socio-cultural background to determine whether they may fit the existing business model. Much like human beings, emotion analytics software can discriminate against individuals it deems not to be a suitable fit. It also records and analyzes information taken without the participant’s consent, creating privacy concerns that are well-founded.
There’s no doubting that emotion analytics is an amazing piece of technology. In this application, though, there is something uncomfortable about it. The argument continues to be that the software can tell its owner how a person actually feels about something, even when that person is not communicating it directly.
What this argument does not establish is whether a software owner has the right to such information, whether procuring it without consent is appropriate, or how much potential there is for this software to be abused.
We admit it’s a little creepy! The primary barrier to implementing emotion analytics technology is privacy law and the data collection this software requires. That may change in the future, but for now, hopeful job applicants can walk into an interview resting easy that they won’t be confronted with emotion analytics software.