Analyzing data to discover the combinations of different kinds of measurements (features) that are distinctive of specific patterns (classes); sometimes thought of as the automatic identification of shapes and forms. The basic problem is to determine combinations of the features (discriminants) that separate the different classes. The members of the classes generally follow overlapping distributions (often Gaussian), so that individual features alone do not permit effective separation. Each feature can be thought of as a dimension, and the problem as a mapping from a multidimensional space to a simpler space in which the classes are well separated (feature selection). Techniques include discriminant, factor, canonical, principal-component, cluster, and regression analyses (see individual methods); many reduce to a generalized eigenvalue problem. The process is often carried out by first eliminating those kinds of measurements that are of little help in distinguishing the classes, and then finding combinations of the remaining measurements that permit optimal separation of the classes. The first step reduces the number of dimensions (eliminating the very small eigenvalues); the second is a coordinate rotation into orthogonal eigenvector space, in which surfaces separating the classes are constructed.
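The generalized eigenvalue formulation can be illustrated with Fisher's linear discriminant, one of the techniques named above. The sketch below (NumPy only; the class means, covariance, and sample sizes are illustrative, not from the source) generates two Gaussian classes that overlap along each individual feature, then solves the between-class/within-class scatter problem S_b w = lambda S_w w to find a single discriminant direction along which the classes separate well:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two overlapping Gaussian classes: each feature alone separates the
# classes poorly, but a linear combination (discriminant) separates them well.
cov = np.array([[1.0, 0.9], [0.9, 1.0]])          # strongly correlated features
X_a = rng.multivariate_normal([0.0, 0.0], cov, 200)
X_b = rng.multivariate_normal([1.0, -1.0], cov, 200)

m_a, m_b = X_a.mean(axis=0), X_b.mean(axis=0)

# Within-class scatter S_w (sum of per-class scatter matrices) and
# between-class scatter S_b (outer product of the mean difference).
S_w = np.cov(X_a.T) * (len(X_a) - 1) + np.cov(X_b.T) * (len(X_b) - 1)
diff = (m_a - m_b).reshape(-1, 1)
S_b = diff @ diff.T

# Generalized eigenvalue problem S_b w = lambda S_w w, solved here as an
# ordinary eigenproblem of S_w^{-1} S_b; keep the eigenvector with the
# largest eigenvalue (the directions with tiny eigenvalues are discarded).
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(S_w, S_b))
w = eigvecs[:, np.argmax(eigvals.real)].real

# Project onto the discriminant direction: a mapping from the
# two-dimensional feature space to a one-dimensional space where
# the classes are well separated.
proj_a, proj_b = X_a @ w, X_b @ w
```

For two classes the leading eigenvector also has the closed form S_w^{-1}(m_a - m_b); the eigendecomposition is used here only to mirror the general multi-class formulation.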