Computing science experts from the University of Aberdeen are teaching a class on reproducibility as part of a training event taking place in the Netherlands this month, focused on the emerging field of Explainable Artificial Intelligence (XAI).
The event will look at how the quality of algorithms used in XAI – which helps humans understand how AI systems make decisions or predictions – should be assessed. Involving scientists from several European universities, the event is being held as part of the H2020 European research project NL4XAI, which aims to train the first generation of experts in XAI.
Scientists from the different universities involved in the project, including those from the University of Aberdeen, will join representatives from industry to share their experiences with 11 PhD students who are faced with the challenge of finding a way to make AI systems explain themselves.
Prof Anya Belz and Research Assistant Dr Craig Thomson, from the EPSRC-funded ReproHum project, will teach a class on the reproducibility of research results. The class will consist of a lecture and a practical session offering hands-on experience in designing evaluation experiments in AI, which are known to be difficult to replicate due to factors such as the omission of crucial details from the research articles in which the experiments are reported.
The Aberdeen team will support students in gaining experience in identifying these problems in existing studies, and in learning to work in a way that ensures the evaluation studies they conduct as part of their PhD projects can be easily replicated.
Prof Belz and Dr Thomson, both from the University’s Department of Computing Science, will be attending the event, which takes place in Utrecht from 15 to 16 December.