Groupement ADAS : Advanced Driver Assistance Systems
17 July 2018

The State of Explainable AI


I don’t need to know exactly why Netflix recommends certain movies to me — if it looks like a fit, I’m happy to take their recommendation. On the other hand, if your AI tells me that I should undergo an invasive medical treatment because a deep neural network (DNN) recommends it — well, I’m going to want to understand why before I take your recommendation.

Explainable AI (XAI) matters when you’re optimizing for something more important than a taste-based recommendation. AI deployed in military tools, in financial tools such as loan assessments, or in self-driving cars may use DNNs without any way to establish culpability (if we can’t understand how an algorithm works, who’s responsible when something goes wrong?) and without any way to audit and verify that the models aren’t relying on bad information.

The State of XAI

As long as breakthroughs in artificial intelligence (AI) are common, researchers and startups will probably focus most of their effort on making new, flexible AI models. Maybe we can’t explain how these models work, but if x.ai’s Amy or Andrew can miraculously figure out how and when to schedule meetings for me, do I even care? However, once we really hit diminishing returns in DNNs, explaining how these DNNs produce their results will become an area of intense focus.

For text-based AI systems, logical entailment is about explaining fact checks and arguments in general. Companies like Factmata are working on this by logically explaining the contents of knowledge graphs.

“Explaining” images is a lot trickier. DARPA has begun this work with a five-year program to develop XAI. The DARPA proposal mentions two academic works that are generating buzz right now: UC Berkeley’s “Generating Visual Explanations” and the University of Washington’s “Why Should I Trust You?” (the LIME paper).
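The core idea behind LIME can be sketched without the library itself: perturb the input around the point you want explained, query the black-box model on those perturbations, and fit a simple linear model weighted by proximity; the linear coefficients then serve as local feature importances. Below is a minimal, self-contained sketch of that idea in Python. The `black_box` function, the Gaussian perturbations, and all parameter values are illustrative assumptions for this toy example, not the paper’s released implementation.

```python
import numpy as np

# A hypothetical black-box classifier: returns 1 when a nonlinear score is high.
def black_box(X):
    return (np.sin(X[:, 0]) + X[:, 1] ** 2 > 1.0).astype(float)

def lime_style_explanation(model, x, n_samples=5000, width=0.5, seed=0):
    """Explain model's behavior near x with a locally weighted linear fit."""
    rng = np.random.default_rng(seed)
    # Perturb the instance of interest and query the black box.
    Z = x + rng.normal(scale=width, size=(n_samples, x.size))
    y = model(Z)
    # Proximity kernel: perturbations closer to x get more weight.
    w = np.exp(-np.sum((Z - x) ** 2, axis=1) / (2 * width ** 2))
    # Weighted least squares with an intercept column (normal equations).
    A = np.hstack([Z - x, np.ones((n_samples, 1))])
    Aw = A * w[:, None]
    coef, *_ = np.linalg.lstsq(Aw.T @ A, Aw.T @ y, rcond=None)
    return coef[:-1]  # per-feature local importance (intercept dropped)

x = np.array([0.2, 1.2])
importances = lime_style_explanation(black_box, x)
print(importances)
```

For this toy model the second feature should dominate the explanation near `x`, since the local decision boundary is far more sensitive to `x[1]` (through the squared term) than to `x[0]`. The real library adds refinements such as sparse (Lasso) fitting and, for images, perturbing superpixels rather than raw pixels.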

Read more : https://medium.com/@jschwiep/the-state-of-explainable-ai-e252207dc46b

 

About us

Groupement ADAS is a team of innovative companies with over 20 years of experience in technologies for driver assistance systems: the design, implementation, and integration of ADAS in vehicles for safety features, driver assistance, and partial delegation to the autonomous vehicle.

Contact us
Thierry Bapin, Pôle Mov'eo
groupement.adas@pole-moveo.org
Follow us: @groupement_adas

Groupement ADAS is powered by Mov'eo, the French automotive competitiveness cluster.

