On July 16, 2007, prior to the start of the technical sessions (July 17--19), there will be four invited tutorials of 90 minutes each. The tutorials will provide a gentle introduction to a wide range of important topics in imprecise probability. The tutorials are included in the (regular or student) registration fee. You are kindly invited to participate.
Abstract: Risk analysis is widely used in many disciplines to quantify risks or expectations in the face of pervasive variability and profound uncertainty about both natural and engineered systems. Although most analyses today are still based on point estimates, awkward qualitative assessments, or probabilistic calculations employing unwarranted assumptions, the methods of imprecise probability hold great promise for allowing analysts to develop quantitative models that make use of the knowledge and data that are available but do not require untenable or unjustified assumptions or simplifications. This tutorial will introduce some of the methods that are easiest to make calculations with, including probability bounds analysis, Dempster-Shafer evidence theory, and interval statistics, and will show how they can be used to address the basic problems that risk analysts face: not knowing the input distributions, not knowing their correlations, not being sure about the model itself, or even which variables should be considered. We suggest that these tools constitute a practical uncertainty arithmetic (and logic) that can be deployed across a wide range of applications. Of course, not all problems can be well solved by these relatively crude methods. Examples requiring fuller analyses with the methods of imprecise probability are described.
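To give a flavour of what an uncertainty arithmetic means in its very crudest form, the short Python sketch below propagates interval bounds through a product when the inputs are known only as ranges; the Interval class and the numerical bounds are purely illustrative assumptions and are not taken from the tutorial materials.

    # A minimal illustrative sketch of crude "uncertainty arithmetic" with
    # intervals; the Interval class and the numbers below are hypothetical
    # examples, not material from the tutorial.

    class Interval:
        def __init__(self, lo, hi):
            assert lo <= hi
            self.lo, self.hi = lo, hi

        def __add__(self, other):
            # Sums: add lower and upper bounds (valid under any dependence).
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __mul__(self, other):
            # Products: take the extremes over all corner combinations.
            corners = [self.lo * other.lo, self.lo * other.hi,
                       self.hi * other.lo, self.hi * other.hi]
            return Interval(min(corners), max(corners))

        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    # Hypothetical exposure model: dose = concentration * intake, with both
    # inputs known only as ranges.
    concentration = Interval(0.2, 0.5)   # mg/L (illustrative bounds)
    intake = Interval(1.0, 2.5)          # L/day (illustrative bounds)
    print(concentration * intake)        # -> [0.2, 1.25]

Probability bounds analysis extends the same idea from intervals on numbers to bounds on entire cumulative distribution functions (so-called p-boxes).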
Abstract: In this tutorial, I introduce the main elements of Peter Walley's theory of coherent lower and upper previsions. I review the notions of avoiding sure loss and coherence, and the representation of coherent assessments in terms of sets of linear previsions and sets of almost-desirable gambles. Then, I turn to the notion of natural extension, and give its expression under any of these three equivalent representations. Finally, I study how to update assessments in a coherent way, presenting the main facts about conditional lower previsions.
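For orientation, the following are the standard formulations of these notions in Walley's framework, stated here for readers new to the area rather than quoted from the tutorial. A lower prevision $\underline{P}$ defined on a set of gambles $\mathcal{K}$ avoids sure loss if for all $n \ge 1$ and $f_1,\dots,f_n \in \mathcal{K}$

\[ \sup_{\omega} \sum_{i=1}^{n} \bigl[ f_i(\omega) - \underline{P}(f_i) \bigr] \ge 0, \]

and it is coherent if for all non-negative integers $n, m$ and all $f_0, f_1, \dots, f_n \in \mathcal{K}$

\[ \sup_{\omega} \Bigl[ \sum_{i=1}^{n} \bigl( f_i(\omega) - \underline{P}(f_i) \bigr) - m \bigl( f_0(\omega) - \underline{P}(f_0) \bigr) \Bigr] \ge 0. \]

When $\underline{P}$ avoids sure loss, its natural extension is

\[ \underline{E}(f) = \min \{ P(f) : P \text{ a linear prevision with } P(g) \ge \underline{P}(g) \text{ for all } g \in \mathcal{K} \}, \]

the pointwise smallest coherent lower prevision on all gambles that dominates $\underline{P}$ on its domain.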
Abstract: A research program whose objective is to study the dual concepts of information-based uncertainty and uncertainty-based information in all their manifestations was introduced in the early 1990s under the name "generalized information theory" (GIT). The purpose of this tutorial is to introduce the conceptual boundaries within which GIT operates and to give a comprehensive overview of the principal results that have emerged from GIT. As in classical information theory, uncertainty is the primary concept and information is defined in terms of uncertainty reduction.
GIT is based on a two-dimensional expansion of classical information theory. In one dimension, additive probability measures of classical information theory are expanded to various types of nonadditive measures. In the other dimension, the theory of classical sets, within which probability measures are formalized, is expanded to the various theories of fuzzy sets. Each choice of a particular set theory and a particular measure theory defines a particular information theory. The full development of any of these information theories requires that issues at each of the following four levels be adequately addressed: (1) an uncertainty function, u, of the theory be formalized in terms of appropriate axioms; (2) the calculus for dealing with function u be properly developed; (3) a justifiable functional, U, be determined by which the amount of relevant uncertainty (predictive, prescriptive, diagnostic, etc.) associated with function u is measured; and (4) methodological aspects of the theory be developed by utilizing functional U as a measuring instrument.
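As one concrete and purely illustrative instance of levels (1)-(3), not drawn from the tutorial itself: in Dempster-Shafer theory the uncertainty function $u$ is a basic probability assignment $m$ on the power set of a finite frame $X$, its calculus is that of the associated belief and plausibility measures, and one well-justified functional $U$ for the nonspecificity it carries is the generalized Hartley measure

\[ GH(m) = \sum_{\emptyset \neq A \subseteq X} m(A) \log_2 |A|, \]

which reduces to the classical Hartley measure $\log_2 |A|$ when all of the mass is assigned to a single set $A$.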
The tutorial is presented in two parts of approximately the same duration. An overall characterization of GIT is presented in the first part. After a brief overview of classical information theory, a general framework for formalizing uncertainty and the associated uncertainty-based information of any conceivable type is sketched. The various theories of imprecise probabilities that have already been developed within this framework are surveyed and some important unifying principles applying to these theories are introduced. The second part is devoted to the issues of measuring uncertainty and information in the various theories and to the methodological principles based on these measuring capabilities. The tutorial is concluded by a discussion of some open problems in the area of GIT.
The tutorial is intended as a gentle introduction to the area of GIT, which is covered in greater depth in the recent book "Uncertainty and Information: Foundations of Generalized Information Theory" by George J. Klir (Wiley-Interscience, 2006).
Abstract: This tutorial offers an overview of selected alternative decision theories designed for use with IP theory. The review begins with an examination of the kinds of rival decision theories that ensue when each of the familiar axioms of, for instance, Anscombe-Aumann Horse Lottery theory is relaxed to accommodate normal-form IP theory. From this base, further generalizations are considered, including: multi-agent decision making, extensive-form IP theory, and several interesting considerations that attend problems with infinite decision structures.
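As a single illustrative example of such a rival rule, mentioned here only for orientation (the tutorial treats a much broader family of theories): an act $a^*$ in a feasible set $A$ is E-admissible with respect to a convex set $\mathcal{P}$ of probability distributions if there is some $P \in \mathcal{P}$ under which $a^*$ maximizes expected utility, i.e. $E_P[U(a^*)] \ge E_P[U(a)]$ for every $a \in A$; when $\mathcal{P}$ is a singleton this reduces to ordinary Bayesian expected-utility maximization.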