Learning with Nested Generalized Exemplars
Title:
Learning with Nested Generalized Exemplars
ISBN:
9781461315490
Personal Author:
Salzberg, Steven L.
Edition:
1st ed. 1990.
Publication Information:
New York, NY : Springer US : Imprint: Springer, 1990.
Physical Description:
XX, 160 p. online resource.
Series:
The Springer International Series in Engineering and Computer Science ; 100
Contents:
1 Introduction -- 1.1 Background -- 1.2 NGE and other exemplar-based theories -- 1.3 Previous models -- 1.4 Comparisons of NGE and other models -- 1.5 Types of generalization
2 The NGE learning algorithm -- 2.1 Initialization -- 2.2 Get the next example -- 2.3 Make a prediction -- 2.4 Feedback -- 2.5 Summary of algorithm -- 2.6 Partitioning feature space -- 2.7 Assumptions -- 2.8 Greedy variant of the algorithm
3 Review -- 3.1 Concept learning in psychology -- 3.2 Prototype theory and exemplar theory -- 3.3 Each as a multiple prototype model -- 3.4 Machine learning in AI -- 3.5 Connectionism -- 3.6 Cluster analysis -- 3.7 Conclusion
4 Experimental results with NGE -- 4.1 Breast cancer data -- 4.2 Iris classification -- 4.3 Echocardiogram tests -- 4.4 Discrete event simulation
5 Conclusion -- 5.1 Weight factors -- 5.2 Synthesis with explanation-based learning -- 5.3 Psychological plausibility -- 5.4 Complexity results -- 5.5 Future experimental work
A Data sets -- A.1 Breast cancer data -- A.2 Iris data -- A.3 Echocardiogram data.
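Note: Chapter 2 above names the outer loop of the NGE learner (initialization, get the next example, make a prediction, feedback). As a rough illustration only, the Python sketch below shows a minimal nearest-hyperrectangle learner built around that loop: exemplars are stored as axis-parallel boxes, a query is classified by the closest box (distance zero inside a box), a correct prediction generalizes the matching box to cover the new example, and a miss stores the example as a new exemplar. The names (Hyperrectangle, nge_learn, nge_predict) and the seed count are illustrative, and the sketch omits the feature/exemplar weights and second-closest-match heuristic used in the full algorithm described in the book.

import numpy as np

class Hyperrectangle:
    """An exemplar generalized into an axis-parallel box in feature space."""
    def __init__(self, point, label):
        self.lower = np.array(point, dtype=float)
        self.upper = np.array(point, dtype=float)
        self.label = label

    def distance(self, point):
        # Zero inside the box; otherwise the Euclidean gap to the nearest face on each axis.
        below = np.maximum(self.lower - point, 0.0)
        above = np.maximum(point - self.upper, 0.0)
        return float(np.sqrt(np.sum((below + above) ** 2)))

    def extend(self, point):
        # Generalize the box so it also covers the new example.
        self.lower = np.minimum(self.lower, point)
        self.upper = np.maximum(self.upper, point)

def nge_learn(examples, labels, n_seeds=3):
    """Online loop: seed memory, then predict with the nearest box and apply feedback."""
    memory = [Hyperrectangle(x, y) for x, y in zip(examples[:n_seeds], labels[:n_seeds])]
    for x, y in zip(examples[n_seeds:], labels[n_seeds:]):
        x = np.asarray(x, dtype=float)
        nearest = min(memory, key=lambda h: h.distance(x))
        if nearest.label == y:
            nearest.extend(x)                     # correct: generalize the matching exemplar
        else:
            memory.append(Hyperrectangle(x, y))   # wrong: store the example as a new exemplar
    return memory

def nge_predict(memory, x):
    """Classify a query point by the label of the nearest hyperrectangle."""
    x = np.asarray(x, dtype=float)
    return min(memory, key=lambda h: h.distance(x)).label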
Abstract:
Machine Learning is one of the oldest and most intriguing areas of Artificial Intelligence. From the moment that computer visionaries first began to conceive the potential for general-purpose symbolic computation, the concept of a machine that could learn by itself has been an ever present goal. Today, although there have been many implemented computer programs that can be said to learn, we are still far from achieving the lofty visions of self-organizing automata that spring to mind when we think of machine learning. We have established some base camps and scaled some of the foothills of this epic intellectual adventure, but we are still far from the lofty peaks that the imagination conjures up.

Nevertheless, a solid foundation of theory and technique has begun to develop around a variety of specialized learning tasks. Such tasks include discovery of optimal or effective parameter settings for controlling processes, automatic acquisition or refinement of rules for controlling behavior in rule-driven systems, and automatic classification and diagnosis of items on the basis of their features. Contributions include algorithms for optimal parameter estimation, feedback and adaptation algorithms, strategies for credit/blame assignment, techniques for rule and category acquisition, theoretical results dealing with learnability of various classes by formal automata, and empirical investigations of the abilities of many different learning algorithms in a diversity of application areas.
Language:
English