University of Virginia data scientist and criminologist Renée Cummings will visit Grand Rapids this month to share her insights — and warnings — on the growing use of artificial intelligence and algorithms in corporate decision making. The exponential growth in data availability has pushed companies to make informed decisions when deploying such technology in hiring practices, DEI efforts or revenue growth, for example. That decision making should be done in a way that is justice-oriented, equitable, diverse and inclusive, Cummings says. The journalist turned data scholar will make multiple appearances in Grand Rapids from Nov. 14-17 as part of a weeklong exploration of data ethics led by The Delta Project, Greater Grand Rapids NAACP, the West Michigan chapter of the Public Relations Society of America, the Grand Rapids Area Chamber of Commerce, Grand Valley State University and a consortium of tech businesses. Cummings spoke with MiBiz ahead of her visit to highlight key themes of her work and why businesses should be paying attention to ethics in data-based decision making.
How can A.I. and data help company executives who are looking to build diversity, equity and inclusion within their organizations?
What companies want is decision-making accuracy. Companies want to make the best kinds of decisions in real time that will impact the bottom line in a way that is extremely profitable. Data science — when it takes an approach that is justice-oriented, equitable, diverse and inclusive — brings laser-focused accuracy to the ways in which you can extract intelligence from that data. So companies that are interested in using data to build more diverse, inclusive and equitable approaches to what they're doing really have an opportunity to do that. It starts with ensuring they take an ethical approach to the ways in which they are using data, and with bringing an interdisciplinary imagination to understanding, analyzing and visualizing that data so they get the best out of the datasets they tend to use.
How have algorithms been deployed in a discriminatory way?
In the past, what we have seen is the deployment of algorithms without the requisite due diligence, or without an approach that adheres to a really robust and rigorous duty-of-care philosophy. What we are seeing now is companies understanding that many of our datasets are tied to our history. These historical datasets carry with them biases, stereotypes and discriminatory approaches that have been replicating themselves in the kinds of solutions and processes we've been using algorithms to intervene in.
What’s the landscape for resources that are available to companies looking to lean more into A.I. and data in their decision making?
I think the landscape is extraordinary. Just think Google. You can Google anything you want because it's an algorithm that's providing you with these extraordinary recommendations on how to improve. As an organization, I'd be really interested in getting involved in the data science process. The fact is that there are so many resources available at the moment to really upskill employees, companies and just regular individuals in real time.
You’ve talked about the potential for data to be weaponized against someone by organizations or by companies. Can you expand on that?
We all know that we are creating an extraordinary amount of data daily. We've just really amassed, individually, this arsenal of data. It is our data that companies are using to create the kinds of business intelligence decisions they require to profit. And then we are seeing in some cases, unfortunately, that companies building tools that are not ethical are using our data in ways in which it could be weaponized against us.
It is so important for us to understand the power of our data, the pervasive use of algorithms and the political culture around data. (We also need) to understand why concerns such as privacy and the protection of privacy — and the accountability, transparency and explainability around the building, deployment and adoption of algorithms — are so important to the ways in which we interact with each other, organizations, companies, and just surviving in general.
How can companies use A.I. and data for good when recruiting or retaining talent?
I think one of the ways companies can use A.I. in the recruitment of talent is having an algorithm that is able to look at an extraordinary amount of data — we're talking about resumes, dossiers and CVs — and really analyze that data in real time, so you can have many more individuals applying for positions.
But in the process of corralling or curating the best, what is important is not just the use of A.I., but the ethical use of A.I. What we have realized is that in the use of A.I. in hiring, there have been several cases over the last few years that have shown the bias in the datasets really undermining the talent process, and really undermining the kind of extraordinary talent that we could actually recruit. We've had so many cases of big organizations and big tech companies designing and deploying hiring algorithms, then realizing that, because of that lack of ethical due diligence, they created algorithms that really did not do what they were supposed to do, and in turn really frustrated the hiring process.
What else should readers know about your upcoming visit to Grand Rapids?
I'm really pleased that Grand Rapids, Mich. has taken this approach to understanding data. What I get from the various agencies that have come together to support my visit to Grand Rapids is the need to do data right, the need to do A.I. right, and the need to embrace new and emerging technologies, but to do so in a way that is ethical. I (see) a community that wants to develop and progress and is in search of prosperity, but wants to do it in a way that is justice-oriented and equitable — one that wants to use data in ways that could build a legacy of success for Grand Rapids by ensuring that access to resources and opportunities is equitable. I think that is very powerful.
Interview conducted and condensed by Andy Balaskovitz.