In machine learning and pattern recognition, a feature is an individual measurable property or characteristic of a phenomenon. Choosing informative, discriminating and independent features is a crucial element of effective algorithms in pattern recognition, classification and regression. Features are usually numeric, but structural features such as strings and graphs are used in syntactic pattern recognition. The concept of "feature" is related to that of explanatory variable used in statistical techniques such as linear regression.
Classification
A numeric feature can be conveniently described by a feature vector. One way to achieve binary classification is using a linear predictor function (related to the perceptron) with a feature vector as input. The method consists of calculating the scalar product between the feature vector and a vector of weights, assigning to the positive class those observations whose result exceeds a threshold.
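This scalar-product-and-threshold rule can be sketched as follows; the feature vector, weights and threshold here are illustrative, not taken from any real model:

```python
import numpy as np

def predict(x, w, threshold=0.0):
    """Binary linear classifier: the score is the scalar (dot) product
    of feature vector x and weight vector w; observations whose score
    exceeds the threshold are assigned the positive class."""
    return float(np.dot(x, w)) > threshold

# Illustrative feature vector and weight vector.
x = np.array([1.0, 2.0, -0.5])
w = np.array([0.4, 0.3, 1.0])
print(predict(x, w))  # score = 0.4 + 0.6 - 0.5 = 0.5, which exceeds 0
```

Learning such a classifier amounts to choosing the weight vector, for example with the perceptron update rule.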
Algorithms for classification from a feature vector include nearest neighbor classification, neural networks, and statistical techniques such as Bayesian approaches.
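As a minimal sketch of the first of these, a 1-nearest-neighbor classifier assigns a new feature vector the label of its closest training example; the toy data below is illustrative:

```python
import numpy as np

def nearest_neighbor(x, X_train, y_train):
    """1-nearest-neighbor classification: return the label of the
    training feature vector closest to x in Euclidean distance."""
    distances = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(distances)]

# Illustrative training set: two classes in a 2-D feature space.
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
y_train = np.array([0, 0, 1, 1])
print(nearest_neighbor(np.array([5.5, 4.8]), X_train, y_train))  # class 1
```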
Examples
In character recognition, features may include histograms counting the number of black pixels along horizontal and vertical directions, number of internal holes, stroke detection and many others.
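The horizontal and vertical pixel-count histograms can be sketched as projection profiles of a binary character image; the 4×4 image below is a toy example:

```python
import numpy as np

def projection_histograms(image):
    """For a binary character image (1 = black pixel), return the
    counts of black pixels along each row (horizontal direction)
    and each column (vertical direction)."""
    horizontal = image.sum(axis=1)  # black pixels per row
    vertical = image.sum(axis=0)    # black pixels per column
    return horizontal, vertical

# Toy 4x4 binary image of a character stroke.
img = np.array([[0, 1, 1, 0],
                [0, 1, 0, 0],
                [0, 1, 0, 0],
                [0, 1, 1, 1]])
h, v = projection_histograms(img)
print(h)  # row counts: [2 1 1 3]
print(v)  # column counts: [0 4 2 1]
```

Concatenating the two profiles yields a fixed-length feature vector regardless of which character the image contains.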
In
speech recognition, features for recognizing
phonemes can include noise ratios, length of sounds, relative power, filter matches and many others.
In spam detection algorithms, features may include the presence or absence of certain email headers, the email structure, the language, the frequency of specific terms, and the grammatical correctness of the text.
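A few such features can be extracted as a dictionary per message; the header name and term below are illustrative stand-ins, not a real filter's feature set:

```python
import re

def spam_features(email_text, headers):
    """Extract illustrative spam-detection features: presence of a
    header, frequency of a specific term, and a crude structural
    measure (punctuation density)."""
    return {
        "has_reply_to": "Reply-To" in headers,
        "free_count": len(re.findall(r"\bfree\b", email_text, re.IGNORECASE)),
        "exclamation_ratio": email_text.count("!") / max(len(email_text), 1),
    }

feats = spam_features("Get FREE money now! Totally free!!!",
                      {"From": "a@example.com"})
print(feats["free_count"])  # the term "free" occurs twice
```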
In computer vision, there are a large number of possible features, such as edges and objects.
Extensions
In pattern recognition and machine learning, a feature vector is an n-dimensional vector of numerical features that represent some object. Many algorithms in machine learning require a numerical representation of objects, since such representations facilitate processing and statistical analysis. When representing images, the feature values might correspond to the pixels of an image, while when representing texts the features might be the frequencies of occurrence of textual terms. Feature vectors are equivalent to the vectors of explanatory variables used in statistical procedures such as linear regression. Feature vectors are often combined with weights using a dot product in order to construct a linear predictor function that is used to determine a score for making a prediction.
The
vector space associated with these vectors is often called the feature space. In order to reduce the dimensionality of the feature space, a number of
dimensionality reduction techniques can be employed.
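One standard such technique is principal component analysis, sketched here via the eigendecomposition of the covariance matrix; the random data is illustrative:

```python
import numpy as np

def pca(X, k):
    """Dimensionality reduction by PCA: project the feature vectors
    (rows of X) onto their k leading principal components."""
    Xc = X - X.mean(axis=0)                  # center the features
    cov = np.cov(Xc, rowvar=False)           # covariance of the features
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # k largest components
    return Xc @ top

# 100 feature vectors in a 5-dimensional feature space, reduced to 2-D.
X = np.random.default_rng(0).normal(size=(100, 5))
X_reduced = pca(X, 2)
print(X_reduced.shape)  # (100, 2)
```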
Higher-level features can be obtained from already available features and added to the feature vector; for example, for the study of diseases the feature 'Age' is useful and is defined as ''Age = 'Year of death' minus 'Year of birth' ''. This process is referred to as feature construction.
[Liu, H., Motoda, H. (1998). ''Feature Selection for Knowledge Discovery and Data Mining''. Kluwer Academic Publishers, Norwell, MA, USA.][Piramuthu, S., Sikora, R. T. "Iterative feature construction for improving inductive learning algorithms". ''Expert Systems with Applications'', Vol. 36, Iss. 2 (March 2009), pp. 3401–3406.] Feature construction is the application of a set of constructive operators to a set of existing features, resulting in the construction of new features. Examples of such constructive operators include checking for the equality conditions, the arithmetic operators, the array operators, as well as other more sophisticated operators, for example count(S,C)[Bloedorn, E., Michalski, R. "Data-driven constructive induction: a methodology and its applications". ''IEEE Intelligent Systems'', Special Issue on Feature Transformation and Subset Selection, pp. 30–37, March/April 1998.], which counts the number of features in the feature vector S satisfying some condition C, or, for example, distances to other recognition classes generalized by some accepting device. Feature construction has long been considered a powerful tool for increasing both accuracy and understanding of structure, particularly in high-dimensional problems.[Breiman, L., Friedman, J., Olshen, R., Stone, C. (1984). ''Classification and Regression Trees''. Wadsworth.] Applications include studies of disease and emotion recognition from speech.[Sidorova, J., Badia, T. "Syntactic learning for ESEDA.1, tool for enhanced speech emotion detection and analysis". Internet Technology and Secured Transactions Conference 2009 (ICITST-2009), London, November 9–12. IEEE.]
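An operator such as count(S,C) can be sketched directly; the feature vector and condition below are illustrative:

```python
def count(S, C):
    """Constructive operator count(S, C): the number of features in
    feature vector S that satisfy condition C. The resulting value
    can itself be appended to S as a new, constructed feature."""
    return sum(1 for feature in S if C(feature))

S = [0.0, 3.5, -1.2, 7.0, 0.0]
# Illustrative condition C: the feature value is strictly positive.
print(count(S, lambda f: f > 0))  # 2 features (3.5 and 7.0) satisfy C
```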
Selection and extraction
The initial set of raw features can be redundant and too large to be managed. Therefore, a preliminary step in many applications of
machine learning and
pattern recognition consists of
selecting a subset of features, or
constructing a new and reduced set of features to facilitate learning, and to improve generalization and interpretability.
Extracting or selecting features is a combination of art and science; developing systems to do so is known as feature engineering. It requires experimenting with multiple possibilities and combining automated techniques with the intuition and knowledge of the domain expert. Automating this process is feature learning, where a machine not only uses features for learning, but learns the features itself.
See also
*
Covariate
*
Dimensionality reduction
*
Feature engineering
*
Hashing trick
*
Statistical classification
*
Explainable artificial intelligence
References
{{Reflist
Data mining
Machine learning
Pattern recognition