We have ethical responsibilities when coding. We can draw remarkably precise inferences about an individual from their data. But do we have a right to know what they didn't consent to share, even when they willingly shared the data that leads us there? A major retailer's data-driven marketing accidentally revealed to a teen's family that she was pregnant. Eek.
What are our obligations to people who never expected to be so intimately known by systems they never directly confided in? How do we mitigate unintended outcomes, such as a social network's algorithm accidentally resurfacing painful memories for families grieving their child's death?
We design software for humans. Balancing human needs and business specs can be tough. It's crucial that we learn how to build in systematic empathy.
In this talk, we'll delve into specific examples of uncritical programming and the painful results of using insightful data in ways that were benignly intended. You'll learn practices for examining how our code might harm individuals, and how to flip that paradigm to net consequences that are better for everyone.