Mata Haggis-Burridge works in the gaming industry, where he engages with artificial intelligence systems every day. Too often, he sees people assume AI is neutral because it is based on facts and patterns. However, the data collected to build these AI systems is merely a reflection of past patterns. In this way, despite the best intentions of developers, discrimination, racism, and prejudice sneak into the very systems we now rely on to eliminate such human biases. To break with the patterns of our past, we need to recognize their flaws and teach the algorithms of the future to do the same.

Dr. Mata Haggis-Burridge is Professor of Creative and Entertainment Games at NHTV: Breda University of Applied Sciences, where he has worked since 2010. He completed his PhD on cyberculture in 2006. Mata is an award-winning video game developer who also researches the social implications of national and international policies regarding video games. His work frequently involves confronting how problematic outcomes emerge from complex systems and finding new paths to better solutions.

Mata's talk is about how artificial intelligences can affect our lives in many strange ways. It is a sleeper TEDx video that we recommend, with only 3,000 views.
Important Note
Content editors rate, curate, and regularly update what we believe are the top 11% of all AI resources and good-practice examples, which is why our content is rated from 90% to 100%. Content rated below 90% is excluded from this site. All inclusions are vetted by experienced professionals with graduate-level data science degrees.
In the broadest sense, the inclusion of any content on this site is not an endorsement or recommendation of any service, product, or content that may be discussed, recommended, endorsed by, or affiliated with that content, company, or spokesperson. We are a 501(c)(3) nonprofit and receive no website advertising money or direct or indirect compensation for any content or other information on any of our websites. For more information, visit our TOS.