Machine learning algorithms can infer gender when translating from gender-neutral languages into English due to bias in their training data.
Hardware designed to control airbag deployment has resulted in the deaths of short-stature drivers and child passengers due to bias.
An engineer from Volkswagen was recently sentenced to a 40-month prison term for his part in producing software that cheated emissions tests for diesel automobiles.
You are responsible for the software and hardware you help to create: ethically responsible, and potentially legally responsible. Through an examination of academic literature and case studies, this talk will discuss the types of bias that can enter the systems we build, and suggest ways we can avoid bias to create better user experiences (and avoid jail time).