Sandra Wachter is a professor and senior researcher in data ethics, artificial intelligence, robotics, algorithms and regulation at the Oxford Internet Institute.[1] She is a former Fellow of The Alan Turing Institute.[2]
| Sandra Wachter | |
|---|---|
| Born | Austria |
| Alma mater | University of Vienna, University of Oxford |
| **Scientific career** | |
| Institutions | Oxford Internet Institute, Alan Turing Institute, Royal Academy of Engineering |
Early life and education
Wachter grew up in Austria and studied law at the University of Vienna.[3][4] Wachter has said that she was inspired to work in technology because of her grandmother, who was one of the first women admitted to Vienna's Technical University.[3]
She completed her LL.M. in 2009 before starting as a legal counsel in the Austrian Federal Ministry of Health. During this time she joined the faculty at the University of Vienna, pursuing a doctoral degree in technology, intellectual property and democracy. She completed her PhD in 2015, simultaneously earning a master's degree in social sciences at the University of Oxford. After earning her doctorate, Wachter joined the Royal Academy of Engineering, where she worked in public policy. She then returned to the University of Vienna, where she worked on various ethical aspects of innovation.[5]
Research
Her work covers legal and ethical issues associated with big data, artificial intelligence, algorithms and data protection.[6][7] She believes that there needs to be a balance between technical innovation and personal control of information.[8] Wachter was made a research fellow at the Alan Turing Institute in 2016. In this capacity she has evaluated the ethical and legal aspects of data science. She has argued that artificial intelligence should be more transparent and accountable, and that people have a "right to reasonable inferences".[9][10][11] She has highlighted cases where opaque algorithms have produced racist and sexist outcomes, such as discrimination in applications to St George's Hospital and Medical School in the 1970s and overestimations of the likelihood of black defendants reoffending by the program COMPAS.[9] Whilst Wachter acknowledges that it is difficult to eliminate bias from training data sets, she believes it is possible to develop tools to identify and mitigate such biases.[9][12] She has looked at ways to audit artificial intelligence to tackle discrimination and promote fairness.[4][13] In this capacity she has argued that Facebook should continue to use human moderators.[14]
She has argued that the General Data Protection Regulation (GDPR)[15] is in need of reform: while considerable attention is paid to the input stage, far less is paid to how data are subsequently assessed and used.[16][17] She believes that privacy must mean more than data protection, focussing on data evaluation and on ways for people to control how information about them is stored and shared.[16][18]
Working with Brent Mittelstadt and Chris Russell, Wachter proposed counterfactual explanations – statements of how the world would have to differ for an algorithm to produce a different outcome. When decisions are made by an algorithm, it can be difficult for people to understand why, especially when explaining the decision would risk revealing trade secrets about the algorithm. Counterfactual explanations permit the interrogation of algorithmic decisions without the need to reveal such secrets. The approach was adopted by Google in the What-If Tool, a feature of TensorBoard, Google's open-source machine-learning visualisation web application.[3] The paper Counterfactual explanations without opening the black box: automated decisions and the GDPR,[19] written by Wachter, Mittelstadt and Russell, has been featured in the press[3] and is widely cited in the scholarly literature.
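The idea can be illustrated with a toy sketch: given a model that rejects an applicant, search for the smallest change to one input that would have flipped the decision, and report that change as the explanation. The model, feature names and thresholds below are entirely hypothetical, chosen only to show the shape of the technique; this is not the authors' implementation.

```python
def approve(income, debt):
    """Hypothetical linear credit model (illustrative only):
    approve when the score 2*income - 1.5*debt - 40 is non-negative."""
    return 2.0 * income - 1.5 * debt - 40.0 >= 0


def counterfactual_income(income, debt, step=0.5, max_iter=1000):
    """Find the smallest income (in increments of `step`) at which the
    applicant would have been approved, holding debt fixed. This is a
    one-variable counterfactual explanation: 'you would have been
    approved had your income been X'."""
    candidate = income
    for _ in range(max_iter):
        if approve(candidate, debt):
            return candidate
        candidate += step
    return None  # no counterfactual found within the search budget


# An applicant with income 25 and debt 10 is rejected
# (score = 2*25 - 1.5*10 - 40 = -5).
needed = counterfactual_income(25.0, 10.0)
print(needed)  # the income at which the decision would flip
```

Note that the explanation is stated purely in terms of inputs and outcomes: nothing about the model's weights or structure needs to be disclosed, which is the property that makes the approach compatible with trade-secret protection.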
Academic service
She was made an associate professor at the University of Oxford in 2019 and a visiting professor at Harvard University from spring 2020.[4][20] Wachter is also a member of the World Economic Forum Council on Values, Ethics and Innovation, an affiliate at the Bonavero Institute of Human Rights and a member of the European Commission Expert Group on Autonomous Cars.[21][22] Additionally, she is a research fellow at the German Internet Institute.[23]
Awards and honours
- 2021 ESRC IAA O²RB Excellence in Impact Awards [24]
- 2019 The Next Web Most Influential People in AI [25]
- 2019 Privacy Law Scholars Conference Junior Scholars Award [26]
- 2019 Business Insider Nordic AI Trailblazer [27]
- 2019 Business Insider UK Tech 100 [28]
- 2017 CognitionX AI Superhero Award [29]
References
- ^ "Sandra Wachter — Oxford Internet Institute". www.oii.ox.ac.uk. Retrieved 2021-03-10.
- ^ "Sandra Wachter". The Alan Turing Institute. Retrieved 2019-03-10.
- ^ a b c d Katwala, Amit (2018-12-11). "How to make algorithms fair when you don't know what they're doing". Wired UK. ISSN 1357-0978. Retrieved 2019-03-10.
- ^ a b c "Sandra Wachter | Harvard Law School". Retrieved 2019-10-30.
- ^ "Robots: Faithful servants or existential threat?". Create the Future. 2016-06-06. Retrieved 2019-10-30.
- ^ "Why it's totally unsurprising that Amazon's recruitment AI was biased against women". nordic.businessinsider.com. 2018-10-13. Retrieved 2019-03-10.
- ^ Baraniuk, Chris. "Exclusive: UK police wants AI to stop violent crime before it happens". New Scientist. Retrieved 2019-03-10.
- ^ CPDP 2019: Profiling, microtargeting and a right to reasonable algorithmic inferences., retrieved 2019-10-30
- ^ a b c Hutson, Matthew (2017-05-31). "Q&A: Should artificial intelligence be legally required to explain itself?". AAAS. Retrieved 2019-10-30.
- ^ "OII London Lecture: Show Me Your Data and I'll Tell You Who You Are — Oxford Internet Institute". www.oii.ox.ac.uk. Retrieved 2019-10-30.
- ^ Privacy, identity, and autonomy in the age of big data and AI - Sandra Wachter, University of Oxford, retrieved 2019-10-30
- ^ "The ethical use of Artificial Intelligence". www.socsci.ox.ac.uk. Retrieved 2021-10-19.
- ^ "What Does a Fair Algorithm Actually Look Like?". Wired. ISSN 1059-1028. Retrieved 2019-10-30.
- ^ Vincent, James (2019-02-27). "AI won't relieve the misery of Facebook's human moderators". The Verge. Retrieved 2019-10-30.
- ^ Artificial Intelligence: GDPR and beyond - Dr. Sandra Wachter, University of Oxford, retrieved 2019-10-30
- ^ a b Shah, Sooraj. "This Lawyer Believes GDPR Is Failing To Protect You - Here's What She Would Change". Forbes. Retrieved 2019-03-10.
- ^ Wachter, Sandra (2018-04-30). "Will our online lives soon become 'private' again?". Retrieved 2019-10-30.
- ^ "Privacy, Identity, & Autonomy in the age of Big Data and AI". TechNative. 2019-06-03. Retrieved 2019-10-30.
- ^ Wachter, Sandra; Mittelstadt, Brent; Russell, Chris (2017). "Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR". SSRN Working Paper Series. arXiv:1711.00399. Bibcode:2017arXiv171100399W. doi:10.2139/ssrn.3063289. ISSN 1556-5068. S2CID 3995299.
- ^ "Professor Sandra Wachter, the OII". www.oii.ox.ac.uk. Retrieved 2019-10-30.
- ^ "Sandra Wachter". World Economic Forum. Retrieved 2019-10-30.
- ^ "Academic Affiliates of the Bonavero Institute of Human Rights". Oxford Law Faculty. 2018-01-25. Retrieved 2019-10-30.
- ^ "Sandra Wachter". Weizenbaum Institute. https://www.weizenbaum-institut.de/en/spezialseiten/persons-details/p/sandra-wachter/
- ^ "ESRC Excellence in Impact 2021 - awards ceremony". Social Sciences, OXford. 2021-10-19.
- ^ Greene, Tristan (2019-02-28). "Here's who has the most juice in Twitter's AI influencer community". The Next Web. Retrieved 2019-03-10.
- ^ "PLSC Paper Awards". Berkeley Law. Retrieved 2019-10-30.
- ^ Hamilton, Isobel Asher. "3 female AI trailblazers reveal how they beat the odds and overcame sexism to become leaders in their field". Business Insider. Retrieved 2019-10-30.
- ^ Hanbury, Mary; Hamilton, Isobel Asher; Wood, Charlie. "UK Tech 100: The 30 most important, interesting, and impactful women shaping British technology in 2019". Business Insider. Retrieved 2019-10-30.
- ^ "Turing partners with Cog X London 2017 to explore the impact of AI across sectors". The Alan Turing Institute. Retrieved 2019-10-30.