Large networks like social media platforms, highway systems, and even our genes contain vast amounts of data hiding in plain sight. However, the techniques scientists design to learn about the nonlinear relationships within these structures often result in unintentional discrimination against historically disadvantaged groups. These biased outcomes are what electrical engineering and computer science professor Sucheta Soundarajan is working to prevent by bringing fairness to network algorithms.
Soundarajan has received a National Science Foundation (NSF) CAREER Award for her research on algorithms for network analysis. The grant is a single investigator award intended to support Soundarajan’s professional development. In addition to providing funding for research, it will support a number of non-research service projects.
“Anytime I get a grant it feels great because it is validation from the larger scientific community,” said Soundarajan. “This one especially because it is tied to me as an individual and not just the project. It feels like I am being validated as a scientist. It means a lot.”
Although the award is an individual accomplishment, it supports research with the potential to benefit communities around the world. Increasingly, information is being extracted through network analysis, and what scientists are finding is that even though algorithms do not have access to protected attributes such as age, disability, gender identity, religion, and national origin, they can still end up discriminating against these groups.
“What we’re seeing is that people from these minority and disadvantaged groups are being wrongfully discriminated against at a higher rate,” said Soundarajan. “We want to create algorithms that automatically find people central within a network but do it in a way that is fair.”
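To make the problem concrete, here is a minimal sketch of the kind of audit a fairness-aware approach might perform. This is an invented illustration, not Soundarajan's actual algorithm: it computes ordinary degree centrality on a toy network and then compares average centrality across two hypothetical groups, showing how "find the central people" can surface mostly members of the densely connected majority.

```python
# Hypothetical illustration: degree centrality on a toy network whose
# nodes belong to a "maj" (majority) or "min" (minority) group.
# All names and numbers here are assumptions for demonstration.
from collections import defaultdict

# Undirected edges: a dense majority cluster plus a sparsely
# connected minority pair.
edges = [
    ("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"),
    ("e", "f"), ("e", "a"),
]
group = {"a": "maj", "b": "maj", "c": "maj", "d": "maj",
         "e": "min", "f": "min"}

# Degree centrality: fraction of the other nodes each node touches.
degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
n = len(group)
centrality = {v: degree[v] / (n - 1) for v in group}

# Group-level audit: average centrality per group. A large gap means
# a top-k "central people" query would mostly return majority members.
def avg(values):
    values = list(values)
    return sum(values) / len(values)

by_group = {g: avg(centrality[v] for v in group if group[v] == g)
            for g in set(group.values())}
print(by_group)  # majority average is noticeably higher
```

A fair variant would detect this gap and adjust how central nodes are selected, rather than reporting raw centrality alone.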
Soundarajan says criminal sentencing and lending are two examples of areas where algorithms are used to make crucial decisions and where scientists have detected potential wrongful discrimination. Another example of a fairness issue is in the way we connect with each other on social platforms. Friendship recommendation algorithms can exacerbate a tendency for people to seek out those who are similar to themselves.
“Taken to an extreme, if people follow these recommendations, people end up in silos where they only connect to people who are like them and that is how you end up with echo chambers,” said Soundarajan.
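The echo-chamber effect Soundarajan describes can be seen even in a very small example. The sketch below, with invented names and data, implements a common "friends of friends" recommendation heuristic on a homophilous toy network; the top suggestion for a member of the tight-knit group comes from inside that same group, reinforcing the existing cluster.

```python
# Hedged sketch: a friends-of-friends recommender on a toy network.
# All names, groups, and edges are invented for illustration.
from collections import Counter

friends = {
    "ana": {"ben", "cal"},
    "ben": {"ana", "dee"},
    "cal": {"ana", "dee"},
    "dee": {"ben", "cal", "eve"},
    "eve": {"dee", "fay"},
    "fay": {"eve"},
}
group = {"ana": "A", "ben": "A", "cal": "A", "dee": "A",
         "eve": "B", "fay": "B"}

def recommend(user):
    """Rank non-friends by number of mutual friends (most first)."""
    counts = Counter()
    for f in friends[user]:
        for fof in friends[f]:
            if fof != user and fof not in friends[user]:
                counts[fof] += 1
    return [v for v, _ in counts.most_common()]

# "ana" sits in the dense group A; her strongest candidate (two mutual
# friends) is also in group A, so following the recommendation deepens
# the within-group cluster rather than bridging to group B.
print(recommend("ana"))
```

Repeating this process over many rounds is one simple way such recommenders can push a network toward the silos described above.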
Outside of her research, Soundarajan will have the opportunity to hire a graduate student to help develop ethics-based modules that can become part of computer science courses, with the goal of helping students develop ethics-focused thinking.
“We’re going to design these labs where we will give students a data set and they will apply some algorithms to it and then they will look at the results and they will have to think about are these results fair,” said Soundarajan.
Soundarajan will also be looking into developing continuing education for lawyers. She hopes to create classes that focus on explaining how algorithms can cause discriminatory issues.
Committing her time and talent to something societally meaningful is important to Soundarajan. She credits the support she has received throughout her life as a factor in choosing her research area, and she recognizes that the help of members of her department contributed to her latest achievement.
“There has been so much invested in me as a scientist, I feel like I have the moral obligation to do something that benefits everybody,” said Soundarajan. “I have been really fortunate to be surrounded by people who really want to see me succeed and that’s been true at Syracuse University as well. People have given me their time, spending hours reading the proposal that got me this award, and that means a lot to me.”