Diversity Data and DEI

Findem’s approach to diversity data

As a data platform for talent acquisition, Findem is committed to ensuring fair representation of ethnicities and genders in every talent pool, and to providing the analysis and insights that support diversity, equity, and inclusion (DEI) initiatives.

FAQ: Diversity data and DEI in Findem’s platform

As companies evaluate recruitment software, it is essential to understand a vendor’s approach to diversity data and support for DEI initiatives. This FAQ highlights Findem’s approach to diversity data and DEI analysis.

What is Findem's approach to DEI?

Our approach to DEI significantly enhances the methodologies used by sources like the US Census Bureau. Findem does not label any individual candidate with an ethnicity or gender. While the Census primarily relies on surname-based distributions, which are about 70% accurate for gender and 55% accurate for ethnicity and race, Findem augments this data by incorporating additional datasets to build a probabilistic distribution similar to Census data.

These datasets leverage information from a person’s experiences, professional associations, social media footprints, and other relevant sources to help Findem achieve 95% accuracy in gender identification and 85% accuracy in ethnic identification.

Our goal is to enable organizations to build inclusive pipelines by providing aggregate data to the right stakeholders across the funnel. For example, we empower hiring managers and recruiters to build inclusive searches with market maps, and we give leaders rich analytics to ensure funnel conversions are inclusive and free of bias.

How does Findem recommend using DEI data?

We are transparent about the predictive nature of these methodologies and acknowledge their limitations and potential for errors. With this knowledge, Findem strongly encourages organizations to use this data for aggregate analysis and market mapping to help build inclusive talent pipelines. By combining diverse data sources and advanced analytical techniques, Findem offers a more comprehensive and accurate picture of diversity to support organizations in their efforts to foster a more inclusive environment.

Is Findem a candidate evaluation tool?

No, Findem is not a candidate evaluation tool that automatically advances or rejects applicants. Findem's Talent Data Cloud is a talent acquisition and recruitment platform that uses data enrichment and AI to assist talent organizations with searching and matching. Deploying our solution should not create bias or discrimination concerns related to AI-based searching or matching.

What data sources does Findem use for diversity attributes?

Findem aggregates publicly available people data, including US Census data, which is verified and triangulated across multiple sources, for the purpose of recording and learning attributes. An attribute is a distinguishing skill, experience, or characteristic inherent to a person. Attributes help diversify the talent pool, offering a more comprehensive and accurate picture of diversity and supporting organizations in their efforts to foster a more inclusive environment.

We have a team of dedicated data researchers who are constantly curating and improving the quality of data to increase our accuracy across the following categories:

  • First name, last name
  • Country Census data
  • Resume dataset (including affiliations, education, location, etc.)

How is diversity defined and classified?

Our models are trained on verified 3D data that is balanced for ethnicity and gender to minimize bias in the talent acquisition process. The diversity of a talent pool is determined through a probabilistic analysis that leverages a variety of data sources, starting with Census data. We use Census data as a baseline, augmenting and classifying with location, name, resume, education, and work history data.

Public image classification is used to help estimate aggregate market diversity but never used to label a single candidate.
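The distinction between aggregate estimation and individual labeling can be illustrated with a minimal sketch. This is not Findem's implementation; the function name, group names, and probability values below are hypothetical. The idea is that each candidate contributes a probability distribution (derived from Census baselines plus augmenting signals), and only the sums across the pool are ever reported:

```python
# Illustrative sketch only: estimate aggregate diversity of a talent pool
# from per-candidate probability distributions, without ever assigning a
# label to any individual candidate.
from collections import defaultdict

def aggregate_distribution(candidate_probs):
    """Average per-candidate probability distributions into expected
    aggregate shares for the whole pool."""
    totals = defaultdict(float)
    for probs in candidate_probs:
        for group, p in probs.items():
            totals[group] += p
    n = len(candidate_probs)
    return {group: total / n for group, total in totals.items()}

# Hypothetical pool: each candidate is a distribution, not a label.
pool = [
    {"group_a": 0.7, "group_b": 0.3},
    {"group_a": 0.2, "group_b": 0.8},
    {"group_a": 0.5, "group_b": 0.5},
]
print(aggregate_distribution(pool))  # expected shares across the pool
```

No single candidate is ever assigned to "group_a" or "group_b"; only the expected composition of the pool as a whole is produced.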

How accurate is Findem's diversity data?

Findem only categorizes pre-applicant ethnicity when our models are at least 85% certain. Below that threshold, we do not classify ethnicity. Compare this to Census data, which is about 70% accurate for gender and 55% accurate for ethnicity and race. When we are able to classify ethnicity, we generally observe accuracy of about 95%.
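The threshold behavior described above can be sketched as a confidence gate. This is an illustrative sketch, not Findem's actual code; the function name and group labels are hypothetical. A classification is returned only when the model's top probability clears the 85% bar; otherwise the record stays unclassified:

```python
# Illustrative sketch only: a confidence-gated classifier that reports a
# group only when model certainty meets the 85% threshold.
THRESHOLD = 0.85

def classify(probabilities, threshold=THRESHOLD):
    """Return the most likely group if its probability clears the
    threshold; otherwise return None (left unclassified)."""
    group, p = max(probabilities.items(), key=lambda kv: kv[1])
    return group if p >= threshold else None

print(classify({"group_a": 0.91, "group_b": 0.09}))  # group_a
print(classify({"group_a": 0.60, "group_b": 0.40}))  # None: below threshold
```

Leaving low-confidence records unclassified is what keeps the reported accuracy high: the model abstains rather than guesses.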

Post-applicant data is based on candidate-declared data, which is factual.

How does Findem assist teams in addressing bias?

Findem’s approach is to enable talent teams to be data informed and eliminate bias. The Findem platform provides aggregate analysis and market mapping in real time for every recruiting project. Organizations are encouraged to use these insights during the intake process to set expectations and hold productive conversations with hiring managers.

How does Findem balance talent pools automatically?

To ensure fair representation, we take the following steps with regard to diversity data:

  • Artificially balance training data to deliver a fair representation of all ethnicities and genders
  • Use training data from multiple open-source datasets with in-house data annotation by experts so any bias in a particular data source does not create an overall bias
  • Exclude personal and professional history from the determination of DEI labels
  • Verify outputs from the AI algorithms with a team of experts
  • Identify and remove any possible bias from the algorithms before deploying to production
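The first step above, artificially balancing training data, can be sketched with a simple oversampling routine. This is an illustrative sketch under assumed inputs, not Findem's implementation; the function name and record format are hypothetical. Minority groups are resampled until every group is equally represented:

```python
# Illustrative sketch only: balance training data by oversampling each
# group up to the size of the largest group.
import random

def balance(records, key, seed=0):
    """Return a copy of `records` in which every value of `key` appears
    equally often, by resampling under-represented groups."""
    rng = random.Random(seed)
    groups = {}
    for r in records:
        groups.setdefault(r[key], []).append(r)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Oversample with replacement to reach the target count.
        balanced.extend(rng.choices(members, k=target - len(members)))
    return balanced

data = [{"label": "a"}] * 8 + [{"label": "b"}] * 2
balanced = balance(data, "label")
print(sum(1 for r in balanced if r["label"] == "b"))  # 8: now equal to "a"
```

Oversampling is only one way to balance; the point is that a skew in any single source dataset should not carry through into the trained model.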

Why is data important to reducing bias in AI?

AI technologies wield significant power and can amplify biases or perpetuate injustices if not carefully designed and implemented. Ensuring clean, accurate data is key. Findem uses a holistic, probabilistic analysis rather than identifying diversity attributes on an individual basis.

How does Findem mitigate bias in AI development?

On the AI development team, a range of perspectives and experiences are considered throughout the design process. Robust testing and validation protocols are used to detect and mitigate bias in AI systems before deployment. Additionally, ongoing monitoring and accountability mechanisms help address biases that emerge post-implementation. Deploying our solution should not create bias or discrimination concerns related to AI-based searching or matching.

Findem's commitment to diversity

Findem is committed to the support of diversity, equity, inclusion, and belonging in our own workforce, and we support DEIB efforts for our customers. This is a snapshot of how we treat diversity data with care and consideration. Please contact us for more details.
