Interviews are often the only way employers can get to know a candidate personally before offering them a job. While background and reference checks are standard for most new hires, they don't always catch behaviors that can clash with a company's culture and values and create a toxic work environment. California-based Fama Technologies is offering another layer of screening to help identify risky applicants before they get hired.
Fama, which launched in 2015, creates risk profiles for job applicants and even current employees by scouring publicly available content online, including social media posts, message boards, blogs, news articles and comment sections, that can be tied to a current or prospective employee. "We do not score; there's no thumbs up or down," said Fama CEO Ben Mones. "We aren't saying anything about the person ... We can say this piece of text is an example of bigotry."
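To make the distinction concrete, here is a minimal sketch of what "flagging text without scoring the person" could look like. This is purely illustrative: the categories and keyword lists are hypothetical placeholders, and Fama's actual models are proprietary and far more sophisticated than keyword matching.

```python
# Hypothetical category lexicon; the terms here are placeholders,
# not Fama's real vocabulary.
FLAG_CATEGORIES = {
    "bigotry": {"bigoted-term-a", "bigoted-term-b"},
    "harassment": {"threat-term-a"},
}

def flag_text(text):
    """Return the categories whose keywords appear in the text.

    Note there is no numeric score and no hire/no-hire verdict;
    the output only says "this piece of text matched these categories."
    """
    words = set(text.lower().split())
    return sorted(cat for cat, terms in FLAG_CATEGORIES.items()
                  if words & terms)

print(flag_text("an example containing bigoted-term-a"))  # → ['bigotry']
print(flag_text("nothing remarkable here"))               # → []
```

The design choice the quote describes is that the tool surfaces individual pieces of text for a human to review, rather than reducing a candidate to a single rating.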
Whoa
How long before they start flagging political views? The media is already trying to silence conservative views...
Out of context: I followed CNN because you said it was your job to tell the difficult stories. Pure B.S. commercialism. From my unique view, you are useless to freedom, truth and justice; our new first line of defense for these qualities is the watchdog groups. Unfollow.
Who decides what is racist and sexist? This is a lawsuit waiting to happen.
A violation of people's privacy. CNN loves eliminating the 1st Amendment to its right, but sues the White House any time it isn't permitted a seat in the White House and allowed to rattle off insults and fake news forever, and put hands on a White House intern (Jim Acosta is a disgrace, fake news).
Censorship
I am not defending racists or sexists here, but this kind of technology seems really dangerous. So anything you have ever said online from day one could possibly be used against you when you seek employment in the future. Can AI identify sarcasm, or tell when something is taken out of context?
This artificial intelligence screening does not always work. I am legally blind, and on one of my reports I was rejected because I used the phrase "I can not see that" in a social media environment. The AI interpreted the words "not see" in a different way than I used them.
Who determines if it's racist or sexist?
Did you report this story today? The man accused of raping a New Jersey jogger before drowning her in a lake is an illegal immigrant from Honduras who had already been kicked out of the U.S. twice before, authorities said Thursday.
Big Brother is alive and well.
Source: WSJ