Azeem Azhar's Exponential View

How will the future unfold? What is the impact of technology on business and society? As technology reorders the world in which we live, who will be the winners and who will be the losers? Join Azeem Azhar, curator of the Exponential View newsletter, in deep conversation with the world's leading thinkers and practitioners exploring these and other important questions. The views expressed on this podcast are those of its hosts, guests, and callers, and not those of Harvard Business Review.

https://hbr.org/podcasts/exponential-view

episode 35: How to Practice Responsible AI


From predictive policing to automated credit scoring, algorithms applied at massive scale, if left unchecked, pose a serious threat to our society. Dr. Rumman Chowdhury, director of Machine Learning Ethics, Transparency and Accountability at Twitter, joins Azeem Azhar to explore how businesses can practice responsible AI to minimize unintended bias and the risk of harm.

They also discuss:

  • How you can assess and diagnose bias in unexplainable “black box” algorithms.
  • Why responsible AI demands top-down organizational change, including new metrics and systems of redress.
  • How Twitter led an audit of its own image-cropping algorithm, which was alleged to favor white faces over those of people of color.
  • The emerging field of “Responsible Machine Learning Operations” (MLOps).

@ruchowdh
@azeem
@exponentialview

Further resources:

  • “Sharing learnings about our image cropping algorithm” (Twitter Blog, 2021)
  • “As AI develops, so does the debate over profits and ethics” (Financial Times, 2021)
  • “It’s time for AI ethics to grow up” (Wired, 2020)
  • “Auditing Algorithms for Bias” (Harvard Business Review, 2018)


June 16, 2021 · 49m