Looking at Technology with a Critical Eye: Social Work Researchers Collaborate to Increase Equity and Decrease Built-in Bias


UB social work faculty/researchers are collaborating with their counterparts and students in UB’s Department of Computer Science and Engineering (CSE) to consider alternative ways to address, design and apply technology to better serve the broader population, especially around increasing equity and reducing bias. Shown here, left to right, are Al-Kesna Foster (CSE undergraduate student), CSE Asst. Prof. Kenny Joseph, UBSSW’s Asst. Profs. Melanie Sage and Maria Y. Rodriguez, Ahana Bhattacharya and Benson Cai (CSE undergraduate students), and CSE PhD student Yuhao Du. (Not shown, but quoted in the story, is CSE Prof. Atri Rudra.)


by Jana Eisenberg

We’ve all heard the hype about technology over the decades and its powerful potential to fix … well, everything. But by now we know that’s not always true. And technology’s got issues. Several UBSSW social work researchers are focused on examining technology critically, especially the ways it can carry built-in bias.

And because technology is designed specifically to do things more quickly and at larger scale, it can actually perpetuate or amplify the very problems social workers are trying to address.

A few of the most widely touted technological tools used to drive decision-making of many kinds are big data; algorithms (basically, sets of steps); machine learning, a process by which an algorithm learns from its analyses and adjusts, self-optimizing its performance; and artificial intelligence (AI), which is machine learning (and other methods) applied to tasks usually accomplished by humans.
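To make those terms concrete, here is a minimal, purely illustrative Python sketch; the decision rule, data, and thresholds are all hypothetical, not drawn from any tool discussed in this story. It contrasts a fixed algorithm (a hard-coded set of steps) with a tiny “learning” loop that adjusts itself against historical outcomes.

```python
# A minimal, purely illustrative sketch of the difference between a fixed
# algorithm (a set of steps) and machine learning (steps that self-adjust).
# All names, data, and thresholds here are hypothetical.

def fixed_algorithm(case_score: float) -> bool:
    """A plain algorithm: the decision rule is hard-coded by a programmer."""
    return case_score > 0.5

def learn_threshold(scores, outcomes, steps=100, lr=0.01):
    """'Machine learning' in miniature: nudge the threshold so predictions
    better match historical outcomes, self-adjusting with each pass."""
    threshold = 0.5
    for _ in range(steps):
        for score, outcome in zip(scores, outcomes):
            prediction = score > threshold
            if prediction and not outcome:
                threshold += lr  # false alarm: raise the bar
            elif outcome and not prediction:
                threshold -= lr  # missed case: lower the bar
    return threshold

# The learned rule is only as good as the historical outcomes it is fit to.
print(fixed_algorithm(0.7))                                 # True
print(learn_threshold([0.2, 0.4, 0.7, 0.9], [0, 0, 1, 1]))  # stays at 0.5
```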

In addition to challenges within technology’s design, there are acknowledged challenges within all areas of social work around the acceptance, adoption, and application of technology in many of the systems where social workers practice.

The National Association of Social Workers says that “social workers face critical decisions about the lives of … vulnerable children and youths while working in stressful environments that include high caseloads and workloads, inadequate supervision, safety concerns, and limited training and resources (for example, access to emerging technology).” (Source: https://www.socialworkers.org/Advocacy/Policy-Issues/Child-Welfare) Replace “children and youths” with almost any other population with whom social workers are engaged (e.g., veterans, people with mental illness) and the story is generally the same.

In their research, Asst. Prof. Maria Y. Rodriguez and Asst. Prof. Melanie Sage are either using technology or addressing how it (including social media) affects populations in which they are interested.

Addressing the idea that technology must be integrated into social work education, research and practice, Sage noted, “I’ve always been interested in technology, and as I grew in my field and in my position, I realized ways that technology illiteracy in social work holds us back from innovation; that not knowing the effects of algorithms and other technology can potentially cause harm.”

For those reasons, and as technology becomes even more prevalent, it makes sense for social workers to have some knowledge of technology’s application, issues and potential in their field, as well as its challenges and potential dangers.

Professor Atri Rudra

“As a computer person, what’s great about working with Maria and Melanie is talking about how to think about society at large.”

Rodriguez has a lifelong interest in coding, and her graduate work naturally led her to data science (at the University of Washington in Seattle, a technology hotbed). She is currently looking at social media dis- and misinformation particularly for women of color. (In grad school, she analyzed House of Representatives Housing Subcommittee hearing recordings; she’s also been a community organizer.)

“To be of service is my primary reason for being an academic,” said Rodriguez. “Specifically to aid communities that have been on the short end of the stick. One of my projects is working on building a bot to use across social media platforms to help communities of color not to have to defend their existence.”

Sage and Rodriguez both work in collaboration with members of the Department of Computer Science and Engineering, where Rodriguez also serves as an adjunct faculty member; she’s active in interdisciplinary advocacy and collaboration. She is a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University, a member of Twitter’s Academic Research Advisory Board, and a faculty fellow at the Center for Democracy and Technology.

Sage is a national leader at the intersection of technology and social work, providing training for social work practitioners and educators in the ethical and effective use of technology and social media. Among other roles, she serves on CSWE’s Technology Advisory Committee, chairs husITa (a national non-profit that promotes the ethical and effective use of technology in human services), and co-chairs the American Academy of Social Work and Social Welfare’s Grand Challenge, Harnessing Technology for Social Good.

“My work is focused on child welfare,” said Sage. “It’s one of social work’s oldest professions and can be slow to innovate. But recent federal changes, emerging technology, a desire for efficiency, and demands to improve services are making change more imperative. My first major project is an NSF- and Amazon-funded grant to think about fair use of algorithms in child welfare.”

“Most of the algorithms that are already being used to determine predictive risk when deciding what services might lead to the best outcomes are based on historical records, which can be biased,” said Sage. “This project is trying to address the fairness and equity component; to help the field understand how using machine learning to predict who gets what can increase bias.” Her work with Computer Science and Engineering faculty includes consulting with child welfare agencies about the best use of data to guide future services.
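To illustrate the mechanism Sage describes, here is a deliberately simplified Python sketch, with entirely synthetic neighborhoods and records rather than anything from her project: a “predictive risk” model fit to biased historical flagging simply echoes that bias back.

```python
# A rough sketch of how a model trained on biased historical records
# reproduces that bias. All neighborhoods, counts, and records are synthetic.

from collections import defaultdict

# Hypothetical history: (neighborhood, was_flagged). Suppose past practice
# over-flagged families from neighborhood "A" relative to neighborhood "B".
history = ([("A", True)] * 70 + [("A", False)] * 30 +
           [("B", True)] * 30 + [("B", False)] * 70)

counts = defaultdict(lambda: [0, 0])  # neighborhood -> [flags, total]
for neighborhood, flagged in history:
    counts[neighborhood][0] += int(flagged)
    counts[neighborhood][1] += 1

def predicted_risk(neighborhood: str) -> float:
    """'Predictive risk' that simply echoes historical flagging rates."""
    flags, total = counts[neighborhood]
    return flags / total

# Identical families now receive different scores based only on where
# the historical records say they live.
print(predicted_risk("A"))  # 0.7
print(predicted_risk("B"))  # 0.3
```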

The other part of Sage’s work is around use of social media for youth aging out of the child welfare system. “Youth in care have many relational losses, including family, bumping from home to home—they often don’t have ongoing supportive relationships. And, we know that youths who can maintain supportive relationships when they leave care are less likely to experience serious negative outcomes,” said Sage. “Child welfare agencies see social media as risky, but those most at risk also have the most to gain—it may give them a way to stay connected and supported. One key to better outcomes is learning how to use social media in healthy ways that minimize risks.” (For comments from Dean Alford on this topic, please visit https://tinyurl.com/smartphones-child-welfare)

Rodriguez reiterates that historically or culturally faulty data can drive existing disparities even deeper. “An algorithm can only do what its programmer tells it,” she said. “And most programmers were trained by society to think about certain things as more (or less) important. So if you think about categories of people—Black and brown, trans, immigrants—who are marginalized…they are considered by definition ‘at the margins,’ i.e., ‘less important.’”

One of her major projects looks at social media data, using text as data to gather unsolicited personal narratives of those who identify as being on society’s margins. “This came from my wondering how people talked about how to navigate systems, and how futile they felt in those systems, given their positionalities,” said Rodriguez, noting that she’s one of the few in social work research using social media this way.
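As a loose illustration of the “text as data” approach, the following hypothetical Python sketch screens public posts for first-person narratives about navigating systems; the posts, keywords, and filtering rule are invented for this example and are not Rodriguez’s actual methodology.

```python
# A loose, hypothetical sketch of "text as data": screening public posts for
# first-person narratives about navigating public systems. The posts,
# keywords, and rule below are illustrative, not any researcher's method.

import re

posts = [
    "I spent all day at the housing office and still got no answers.",
    "Check out my new recipe blog!",
    "My caseworker never calls back, and the benefits office lost my forms.",
]

SYSTEM_TERMS = re.compile(r"\b(housing|benefits|caseworker|welfare)\b", re.I)
FIRST_PERSON = re.compile(r"\b(I|my|me)\b", re.I)

# Keep posts that mention a system *and* speak in the first person.
narratives = [p for p in posts
              if SYSTEM_TERMS.search(p) and FIRST_PERSON.search(p)]
print(narratives)  # the first and third posts
```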

She is also speaking with women of color political candidates to consider using the data gathered to inform policy. “I hope to turn these insights into recommendations, not just for platforms, but to extend ‘social citizenship’ for those traditionally marginalized people,” she said.

Computer Science and Engineering faculty members Prof. Atri Rudra and Asst. Prof. Kenneth Joseph, who are both also members of UB’s Computing for Social Good group, agree that making technology work better for everyone is crucial.

“Big data, machine learning, and AI cannot solve the world’s problems,” said Rudra. “As a computer person, what’s great about working with Maria and Melanie is talking about how to think about society at large. Instead of saying ‘I have a solution,’ social workers talk to people; they try to figure out what will work. That’s a shift in mindset for people like us. How do you collect data about humans—and use that data to make decisions about humans? Where are the pitfalls?”

Joseph agrees. “My work is at the intersection of computer science and sociology; still, I hadn’t really understood social work’s applied nature, especially through the practical social justice lens Melanie and Maria have,” he said. “That’s reorienting my thinking—the nature of my research was to see if we could use computing as a tool for social science; should these tools make decisions or suggestions? That was a core question. Now I’m thinking more about using these tools to combat or decrease human biases and biased decisions.”

The work of approaching such a large topic is ongoing, and the questions are complex. Former dean Prof. Nancy J. Smyth also acknowledges that caution is required, both in determining what mode of technology is employed and in how (and by whom) the technology is created. Smyth is coauthor, with Sage and Laurel Iverson Hitchcock of the University of Alabama at Birmingham, of “Teaching Social Work With Digital Technology” (Council on Social Work Education, 2019).

“‘Technology’ in concept is neutral—but once someone starts to create something, it’s no longer neutral,” said Smyth. “People have assumptions and beliefs. That’s why social work educators, researchers, and practitioners need to be literate, and ask critical questions. What is the technology tool? Who created it? What is it being used for? If data is being presented and used, where did it come from—are there protections in place?”

One of Rodriguez’s colleagues at the Berkman Klein Center is Prof. Desmond Patton of the Columbia School of Social Work, where he’s also associate dean for innovation and academic affairs, and co-director of its Justice, Equity and Technology lab.

“Social work scientists and social workers need to understand technology as a tool, and we need to have the right tool,” said Patton. “Tools can have harmful effects when not adjusted to anticipate the needs of and concerns from a diverse population.”

As for whether social workers can make contributions in technology, Patton said, “They can. We need to apply ‘social work thinking,’ based on the NASW code of ethics. Once we understand problems’ root causes, consider the humanity of every person and reflect on those issues, we can think about how to sow that into things like algorithmic design and machine learning.”

Someone like Rodriguez, a social work researcher with computer science and coding skills, can create “a space and place for others to work with her,” continued Patton. “She can help others understand the language and the data—what’s going into it? What’s not in it? Where is it coming from?”

Attesting once again that the field of social work must understand, use, and help create technology, Patton said, “We cannot be social workers in the 21st century without a baseline of understanding of how technology affects our society and our world.”

Productive group meeting for social work and engineering researchers.
