Resources

This is a list of related web pages and sites. If you have links to add, please send me an email.

Below is a list of recommended literature on the topic of explanation. It is a work in progress. Please also see the Yahoo! group explanation-research for further explanation-related publications. Contributors to this list: Thomas Roth-Berghofer, Nava Tintarev.

Expert systems

[MS88] Johanna D. Moore and William R. Swartout. Explanation in expert systems: A survey. Research Report RR-88-228, University of Southern California, Marina del Rey, CA, 1988.

Abstract. In order to be considered useful and acceptable, expert systems must be able to explain their knowledge of the domain and the reasoning processes they employ to produce results and recommendations. Despite the fact that the need for explanation has been widely recognized, current expert systems have only limited explanatory capabilities. In this survey, we review early approaches to explanation in expert systems and discuss their limitations. We discuss improvements to the explanation capabilities based on enriched knowledge bases of expert systems. We then argue that further improvements in explanation require better generation techniques. Related work in the field of natural language generation suggests techniques that are useful to the task of explanation in expert systems; however, even those techniques will not provide all of the capabilities required for the task of carrying on a dialogue with the user. Finally, we describe our approach to explanation, which provides the facilities necessary to carry on an interactive dialogue with the user.

[LD04] Lacave, C. & Díez, F. J. A review of explanation methods for heuristic expert systems. The Knowledge Engineering Review, 2004, 19:2, 133–146.

Abstract. Explanation of reasoning is one of the most important abilities an expert system should provide in order to be widely accepted. In fact, since MYCIN, many expert systems have tried to include some explanation capability. This paper reviews the methods developed to date for explanation in heuristic expert systems.

Recommender systems

[TM07] Nava Tintarev and Judith Masthoff. A Survey of Explanations in Recommender Systems. In G. Uchyigit (ed.), Workshop on Recommender Systems and Intelligent User Interfaces associated with ICDE’07, Istanbul, pp. 801–810.

Abstract. This paper provides a comprehensive review of explanations in recommender systems. We highlight seven possible advantages of an explanation facility, and describe how existing measures can be used to evaluate the quality of explanations. Since explanations are not independent of the recommendation process, we consider how the ways recommendations are presented may affect explanations. Next, we look at different ways of interacting with explanations. The paper is illustrated with examples of explanations throughout, where possible from existing applications.

Case-Based Reasoning

[DTC03] Doyle, D., Tsymbal, A., and Cunningham, P. A Review of Explanation and Explanation in Case-Based Reasoning. Department of Computer Science, Trinity College, Dublin, 2003.

Abstract. ./.

[RB04] Thomas R. Roth-Berghofer. Explanations and Case-Based Reasoning: Foundational issues. In Peter Funk and Pedro A. González-Calero, editors, Advances in Case-Based Reasoning, pages 389–403. Springer, 2004.

Abstract. By design, Case-Based Reasoning (CBR) systems do not need deep general knowledge. In contrast to (rule-based) expert systems, CBR systems can already be used with just some initial knowledge. Further knowledge can then be added manually or learned over time. CBR systems are not addressing a special group of users. Expert systems, on the other hand, are intended to solve problems similar to human experts. Because of the complexity and difficulty of building and using expert systems, research in this area addressed generating explanations right from the beginning. But for knowledge-intensive CBR applications, the demand for explanations is also growing. This paper is a first pass on examining issues concerning explanations produced by CBR systems from the knowledge containers perspective. It discusses what naturally can be explained by each of the four knowledge containers (vocabulary, similarity measures, adaptation knowledge, and case base) in relation to scientific, conceptual, and cognitive explanations.

Miscellaneous

[Coh00] Daniel Cohnitz. Explanations are like salted peanuts. In Ansgar Beckermann and Christian Nimtz, editors, Proceedings of the Fourth International Congress of the Society for Analytic Philosophy, 2000. http://www.gap-im-netz.de/gap4Konf/Proceedings4/titel.htm.

Abstract. ./.

[LD02] Lacave, C. & Díez, F. J. A review of explanation methods for Bayesian networks. The Knowledge Engineering Review, 2002, 17:2, 107–127.

Abstract. One of the key factors for the acceptance of expert systems in real world domains is the capability to explain their reasoning. This paper describes the basic properties that characterise explanation methods and reviews the methods developed up to date for explanation in Bayesian networks.

[Last updated: 16 August 2013]
