technologists as caregivers
Those who take care of others in vulnerable states tend to experience an incredible amount of fatigue, burnout, and even vicarious trauma. Caregivers empathize with people who are suffering every day. They take on a lot of other people’s pain, and over time, it overwhelms them. As a result, there’s been a lot of research and discussion in healthcare, psychology, and social work about the difference between empathy and compassion.
While empathy is this natural capacity we have to feel what another person is feeling, compassion embodies a sincere desire to take action that actually helps that person. With training, it seems to be possible to care for those facing dire circumstances without taking on the immense burden of their emotions. This is why today’s nurses, first responders, firefighters, shelter workers, and other professionals who serve the vulnerable are being trained in compassion.
As a designer, it’s rare that I’m carrying someone out of a burning building. But the difference between empathy and compassion is still very relevant to my work. It’s less about burnout and more about ethics. In the past few decades, the topic of empathy has exploded in product design and user experience circles. The idea is that in order to design good products and services, we need to empathize with our target audience. This is what I do for a living.
For well over a decade, I’ve been helping teams break out of the assumption that they know exactly what other people need. We all have the natural capacity for empathy that good design process requires. When I guide teams to take an empathic approach, it also leads to a healthier collaborative culture: a culture where everyone acknowledges just how much they don’t know, motivating them to gather evidence and inspiration before acting.
When we talk about the ethics of technology, some speak of this kind of empathic, human-centered design as somehow inherently good. They’re wrong. Empathy is not compassion. Nowhere in the paragraph above have we specified what is being designed, or why. Certainly human-centered design is being used to improve hospitals, schools, and daily life. But many use the exact same approach to design manipulative smartphone apps, invasive advertising tools, and better weapons.
Empathy makes design far more effective, but it grants us no moral superiority or ethical validation whatsoever. For example, let’s say you’re building a new technology platform for the classroom. To assume you know what schools need and build it blindly is both hubris and ineffective. You might guess right once or twice, but you won’t be consistent. And when you’re wrong, you might make things worse for people. If you spend some time doing field work in classrooms and testing prototypes with actual teachers and students, you’re bringing empathy into the equation. You’ll be much more effective, especially if you do it early enough.
But what are you doing with the data you’re collecting about these classrooms? Are you using it to make things more convenient but less humane? Are you digitizing education in a way that makes it more efficient but less effective? Are you getting people hooked on your expensive solution without regard for the side effects and unintended consequences in our schools? Or will you hold a truly positive mission to benefit the teachers, students, parents, school admin, and the next generation of society? Will you be combing that data for opportunities for technology to help these individuals accomplish their goals? Will you be monitoring your impact in more places than your own bank account? Will you care?
To design with empathy is effective, but to design with compassion is morally good. The former is a fantastic approach to creative work, but it’s just a tool. Compassionate design, by contrast, makes helping others the first priority. It reflects a sincere effort to improve the lives of those being served. Empathy and compassion are not the same thing; we need both. As “software eats the world” and the internet becomes an intimate part of who we are, we need more technologists to think of themselves as caregivers, too.
Most of the big tech organizations that dominate our society are masters of empathy in design, but that does not mean the decisions they’ve made have all been compassionate. McKinsey released a report tracking 300 public companies across multiple countries and industries over a five-year period, and found that design-led organizations return almost double the revenue. In fact, many successful tech companies like Airbnb, Square, Pinterest, Flickr, Etsy, and Kickstarter were founded by designers who embedded empathy into their DNA from day one. But given the influence a scaled platform can have on our identities, health, politics, and relationships, empathy is not enough to earn our trust. We need technologists who are in it for more than themselves. We need compassionate people to guide our digital world.
As we slowly transition from fetishizing to antagonizing Silicon Valley, my peers in tech are facing a true test of their values. Were the countless promises to make the world a better place echoing through decades of TED Talks all just talk? Are the lofty mission statements about connection, belonging, decentralization and democratization just corporate doublespeak? In the end, will we abandon our science fiction dreams of a tech-enabled utopia in favour of heartless capitalism and digital distraction? Or do we care enough to backpedal, slow down, and reflect on what we’ve created and where we’re going?
As a technologist who really cares about this stuff, sometimes I’m inspired by the incredible work being done in attention activism, clean energy, civic tech, and more. But other times, it can feel lonely and hopeless, like trying to grow a tree in a desert. I was recently in just such a funk when I spoke with a mentor of mine. I told her I was starting to feel like the cause might be hopeless. I told her that maybe I need to be more personally ambitious. I’ve been taking the less lucrative but more mission-driven road for years, but now I have a child, and maybe it’s time to take a few contracts with big tech companies that have nothing to do with my personal values.
She reminded me that ambition is about a lot more than money. She helped me see that I am, in fact, very ambitious. I’m ambitious about a career where I don’t have to compromise my values to feed my family. I’m ambitious about the role technology could play in our lives - and in our society - if we designed it with a bit more care. She helped me see that the most ambitious people in our society are not wealthy entrepreneurs, but those who fight against the grain for a more just world.
Jay Vidyarthi