Working to make tech safer and more healing


By Jana Eisenberg

Social workers think a lot about trauma — and use a trauma-informed lens to shape our relationships, communication style and understanding of our practice. But how often do we apply that trauma-informed lens to technology?

As technology exerts an increasingly dominant influence on the field, how can social workers shape the tech we use every day — from the websites and forms our clients navigate to the tools we use in clinical work, training and more? For the first time, the School of Social Work’s Office of Continuing Education is addressing the topic through a self-study training titled, “Introduction to Trauma-Informed Technology and the Vital Role Clinicians Play.”

The training was recorded by Carol F. Scott, PhD ’19, and Melissa Eggleston, co-founders of Trauma-Informed Technology, a consultancy that offers training and education in pursuit of a world where tech “is helpful and healing instead of harmful.”


Carol F. Scott, PhD '19, MSW

Why is this a topic that social workers are interested in and the school wants to educate our constituencies about?

Carol F. Scott: Social work is interested in technology for a few reasons. First, it’s becoming more embedded in our daily lives, especially with AI. The pandemic changed a lot of things, including social work; using technology was the only way we could see clients or teach. That made the entire field acknowledge that social workers — students, educators, practitioners — must both use technology and be comfortable with it.

Melissa Eggleston: I try to help people understand the importance of the tech and social work connection, for example, by noting that often the first interaction for a potential client is visiting your website or filling out an online form. That first interaction needs to be positive, healing and trauma-informed. So part of our job is connecting the dots between the great clinical work that social workers do and applying a trauma-informed perspective to all types of technology as well.


Melissa Eggleston

It seems like a big ask for social work practitioners and educators to add thinking about how the technology itself affects students or clients.

CS: Many social workers, like me and like the UB School of Social Work, eat, sleep and breathe a trauma-informed and human rights perspective. Applying it to technology just makes sense.

ME: In 2024, the World Health Organization determined that technology is a digital determinant of health. If you are using the technology, you need to think about how it affects people.

How can technology unintentionally cause harm?

CS: If someone is experiencing intimate partner violence, or is a youth looking for LGBTQ+ services, a website whose layout and design aren’t trauma-informed can make it hard to quickly find helpful resources, and users get discouraged.

But engineers and designers are not out to get people; sometimes they don’t know, or don’t have the time to think carefully. Small changes can make a difference. Reducing harm can be as simple as adding a “quick exit” button to a website: if someone walks in, the visitor can click it and the page is gone instantly, making the site less harmful and more healing.
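The “quick exit” button described above can be sketched in a few lines of browser JavaScript. This is a minimal illustration, not a production pattern; the choice of a weather site as the neutral destination is an assumption for the example.

```javascript
// Minimal "quick exit" sketch. Assumptions: a plain browser page, and
// weather.com as the neutral site a visitor lands on after exiting.
const QUICK_EXIT_URL = "https://www.weather.com";

// location.replace() navigates away WITHOUT leaving the current page in
// the session history, so the Back button will not return to it.
function quickExit(win) {
  win.location.replace(QUICK_EXIT_URL);
}

// Wire up a visible button only when actually running in a browser.
if (typeof document !== "undefined") {
  const btn = document.createElement("button");
  btn.textContent = "Quick exit";
  btn.addEventListener("click", () => quickExit(window));
  document.body.appendChild(btn);
}
```

A fuller implementation would also clear in-page state and keep the button reachable from every screen, but `location.replace()` is the key piece: it keeps the sensitive page out of the Back-button history.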

ME: Another way that tech can cause harm is if it’s not mobile-friendly. Many people access things primarily through a smartphone. It’s easy to shop on your phone, but when it comes to social services, there can be a lot of unintended friction. A user might need to find a computer to access it, or if there’s a binary gender drop-down and the person doesn’t identify with those categories, they already feel like “these people aren’t going to get me.”

How could the tech world think about this?

ME: The last decade has seen a movement toward ethical technology in general — aiming for “trauma-informed” gives technology designers a framework to help make tech more equitable, more ethical and safer. And more and more consumers are demanding it.

CS: The example I use to advocate for safer technology is the car industry. At first, there was fear that people wouldn’t drive cars with seat belts. But now we can’t imagine driving a car without a seat belt, let alone without airbags and cameras. People will use the car (or the technology) more if they know it’s safer, if they know you’re trying.

How can social workers evaluate the tech they’re using to shape healthier digital tools?

ME: It can be as simple as thinking about an online form you’re asking somebody to fill out. How could it be more trustworthy and transparent? Why are you asking these questions? Is there somebody to call if they have trouble or become worried about the information they’re disclosing? These are fundamental things that help build trust. It’s about collaboration between technology and social work.

CS: Social workers are, by nature, evaluators. We can ask: Is this tech helping or harming my clients? Is there something better? Or, if you don’t have a choice of what tech to use, ask: How can I help make this better?

In this training and in our business, we encourage social workers to use their knowledge and expertise — think of system-level changes and the principles of social work. Tech can make people feel disempowered. But we can all start with baby steps. Make notes of what might work better. Try to meet someone in your IT department. They’re interested in learning how to do better.

Dive deeper

Explore how you can partner with tech designers to create human-centered, trauma-informed technology that’s healing — not harmful. Earn 1.75 ASWB ACE or 2.25 NYSED CEs through an online self-study training from our Office of Continuing Education.