Psychologist Francine Trindade, 28, says she has already sent 434 resumes on the Catho platform alone, not counting the other recruitment sites on which she is registered. She has been looking for a position in human resources since January, after specializing in people management last year. In that time, she has received three interview invitations.
“Of the interviews I did, I only received feedback from one, the one I did in person in May,” she says.
Trindade’s experience illustrates the predicament of job seekers today. On the one hand, people can apply with unprecedented ease thanks to online platforms (some do not even require a resume). On the other, companies have more information than ever to filter candidates, yet selection processes drag on and are often closed without feedback for everyone who applied.
In response to the uncertainty, influencers have been using social media to offer tips on how to get back into the market. Some of the tips are useful; others are not.
On TikTok, for example, people debate the effectiveness of padding the CV with keywords in tiny type, written in the same color as the background. The idea is to make them invisible to the human eye but readable by recruitment software.
The goal of the tactic is to appear at the top of recruiters’ lists. Current platforms use software, sometimes artificial intelligence, to sort candidates by their compatibility with the vacancy. These programs are known as ATS (applicant tracking systems).
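To make the mechanism concrete, the sketch below is a hypothetical Python example of the simplest kind of sorting, counting how many of a vacancy’s keywords appear in the text extracted from each resume. It is not the algorithm of Catho, Gupy, or any company cited here; note that once text is extracted from a PDF, font color no longer matters, which is exactly what the white-font trick exploits.

```python
# Illustrative sketch only: a naive keyword ranker of the kind simpler ATS
# tools use. It is not the algorithm of any platform mentioned in this article.

def keyword_score(resume_text: str, job_keywords: list[str]) -> int:
    """Count how many of the vacancy's keywords appear in the resume text."""
    text = resume_text.lower()
    return sum(1 for kw in job_keywords if kw.lower() in text)

def rank_candidates(resumes: dict[str, str], job_keywords: list[str]) -> list[tuple[str, int]]:
    """Sort candidates by descending keyword score, as a simple ATS might."""
    scored = [(name, keyword_score(text, job_keywords)) for name, text in resumes.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Text extracted from a PDF keeps every word regardless of font color or size,
# which is why the "white font" trick targets this kind of filter.
if __name__ == "__main__":
    keywords = ["people management", "recruiting", "HR analytics"]
    resumes = {
        "candidate_a": "Psychologist specialized in people management and recruiting.",
        "candidate_b": "Psychologist with a clinical background.",
    }
    print(rank_candidates(resumes, keywords))  # candidate_a ranks first
```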
The logic is similar to the ranking of links delivered by Google, which two decades ago operated through keywords.
However, according to Bianca Ximenes, a researcher in machine learning ethics, the white-font trick does not work with AI-based algorithms.
Not even the creators of these ATS know exactly how programs built on machine learning work; researchers in the field call them black boxes.
Gupy, for example, uses AI to sort candidates and claims that people don’t need to use keywords. “Our algorithm reads the context, the semantics, in addition to the word itself. If the vacancy asks for a leadership position, the candidate may be suitable if he says he has managed a project”, says the co-founder of Gupy, Guilherme Henrique Dias.
Catho also uses artificial intelligence to find similarities between the attributes listed on the candidate’s resume and what the vacancy requires. The company considers professional experiences, technical skills, training and other criteria, according to company director Fabio Maeda. InfoJobs uses a similar technique.
On both recruitment platforms, candidates can subscribe to paid plans to find out their level of compatibility with a position. On all of the sites, data-analysis services are available to the hiring companies.
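A rough idea of what such a compatibility score can mean: the sketch below uses TF-IDF and cosine similarity, a simplified stand-in chosen only for illustration, not the proprietary models Gupy, Catho, or InfoJobs actually run, to score how close a resume’s text is to a job description.

```python
# Illustrative sketch: scoring resume/vacancy compatibility by text similarity.
# TF-IDF + cosine similarity is a simplified stand-in, not the proprietary
# models of Gupy, Catho or InfoJobs, which the companies say also read context.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def compatibility(resume_text: str, job_text: str) -> float:
    """Return a 0-1 similarity score between a resume and a job description."""
    vectors = TfidfVectorizer().fit_transform([resume_text, job_text])
    return float(cosine_similarity(vectors[0], vectors[1])[0, 0])

job = "We are hiring an HR analyst with experience in people management and recruiting."
resume = "Psychologist specialized in people management; led recruiting projects."
print(f"compatibility: {compatibility(resume, job):.2f}")
```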
In Ximenes’ assessment, the lack of information about objective criteria confuses people. Without answers, candidates look for explanations that seem plausible — such as keywords in white font.
The tactic can work with programs that apply simple term filters, but it is ultimately ineffective because the top resumes still go through human review.
For Ximenes, adding keywords does not constitute an ethical violation if the professional has skills related to the term.
Hiring companies can set eliminatory filters on vacancies, such as a minimum length of experience. “It is good practice for companies to state what disqualifies a candidate automatically,” says Ximenes.
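As an illustration of such an eliminatory filter, the hypothetical sketch below disqualifies candidates below a minimum length of experience before any ranking happens; the threshold and fields are invented for the example.

```python
# Illustrative sketch of an eliminatory (knockout) filter; rules are hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    years_experience: float

def passes_knockout(candidate: Candidate, min_years: float) -> bool:
    """Candidates below the minimum experience are cut before any ranking."""
    return candidate.years_experience >= min_years

applicants = [Candidate("A", 1.0), Candidate("B", 4.5)]
shortlist = [c for c in applicants if passes_knockout(c, min_years=3)]
print([c.name for c in shortlist])  # only "B" moves on to ranking
```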
Gupy says it follows this practice and is planning campaigns to make people aware of how the platform works. “It’s a challenge to reach 40 million people registered on our website,” says Dias.
Renata Lino, creator of Mommy Tech, a job platform for mothers, says she recommends that mothers add the keywords from the job description to their CV. “It is essential to have your CV adapted to the opportunity.”
She also suggests that candidates use artificial intelligence to help tailor resumes to specific openings. “Ask ChatGPT to adapt the text to the keywords taken from the job description,” she says.
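Done programmatically, Lino’s suggestion could look like the sketch below, which sends the same request through the OpenAI Python library; the model name and prompt wording are assumptions for illustration, and the call requires an OPENAI_API_KEY environment variable.

```python
# Illustrative sketch: the request Lino describes, made via the OpenAI API
# instead of the chat interface. Model name and prompt are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def tailor_resume(resume_text: str, job_description: str) -> str:
    prompt = (
        "Adapt the resume below to the keywords of this job description, "
        "without inventing skills the candidate does not have.\n\n"
        f"Job description:\n{job_description}\n\nResume:\n{resume_text}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name for illustration
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# print(tailor_resume(my_resume, job_posting))
```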
Another common question concerns information about gender, color, and ethnic origin. The AIs of Catho, Gupy, and InfoJobs do not evaluate this personal data.
According to Dominik Hangartner, professor of public policy at the London School of Economics, even systems with embedded AI can reproduce prejudice if gender, color, or ethnicity correlates in some way with language.
Hangartner showed in a 2021 article that candidates from the Middle East and North Africa received 19% fewer interview callbacks on a Swiss job site equipped with an ATS, compared with local white candidates.
Gupy and Catho say they follow ethical principles for applying AI in order to avoid bias. Both companies say they regularly monitor hiring data for minority groups and adjust their models to reduce bias.