Submodular
In mathematics, a submodular set function (also known as a submodular function) is a set function that, informally, describes the relationship between a set of inputs and an output, where adding more of one input yields a decreasing additional benefit (diminishing returns). This natural diminishing-returns property makes them suitable for many applications, including approximation algorithms, game theory (as functions modeling user preferences) and electrical networks. Recently, submodular functions have also found utility in several real-world problems in machine learning and artificial intelligence, including automatic summarization, multi-document summarization, feature selection, active learning, sensor placement, image collection summarization and many other domains. Definition: If \Omega is a finite set, a submodular function is a set function f \colon 2^\Omega \rightarrow \mathbb{R}, where 2^\Omega denotes the power set of \Omega ...
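The diminishing-returns condition can be stated as f(S \cup \{x\}) - f(S) \geq f(T \cup \{x\}) - f(T) for all S \subseteq T \subseteq \Omega and x \in \Omega \setminus T. The Python sketch below (hypothetical data, not taken from the article) brute-forces this check for a small coverage function, a standard example of a submodular function:

from itertools import chain, combinations

# Hypothetical ground-set elements, each covering a few items.
groups = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}}

def f(S):
    """Coverage function: number of items covered by the chosen groups."""
    return len(set().union(*(groups[g] for g in S)))

def subsets(universe):
    return [set(c) for c in chain.from_iterable(
        combinations(sorted(universe), r) for r in range(len(universe) + 1))]

def is_submodular(universe):
    for S in subsets(universe):
        for T in subsets(universe):
            if not S <= T:
                continue
            for x in universe - T:
                # Diminishing returns: adding x to the smaller set S must
                # gain at least as much as adding x to the larger set T.
                if f(S | {x}) - f(S) < f(T | {x}) - f(T):
                    return False
    return True

print(is_submodular(set(groups)))  # True: coverage exhibits diminishing returns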


Automatic Summarization
Automatic summarization is the process of shortening a set of data computationally, to create a subset (a summary) that represents the most important or relevant information within the original content. Artificial intelligence algorithms are commonly developed and employed to achieve this, specialized for different types of data. Text summarization is usually implemented by natural language processing methods, designed to locate the most informative sentences in a given document. On the other hand, visual content can be summarized using computer vision algorithms. Image summarization is the subject of ongoing research; existing approaches typically attempt to display the most representative images from a given image collection, or generate a video that only includes the most important content from the entire collection. Video summarization algorithms identify and extract from the original video content the most important frames (''key-frames'') and/or the most important video segments ...
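As a rough illustration of the extractive approach described above (locating the most informative sentences), here is a minimal sketch that scores sentences by the frequency of the words they contain and keeps the top-scoring ones; it is an illustrative baseline with a made-up example document, not the method of any particular summarizer:

import re
from collections import Counter

def summarize(text, num_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    top = set(sorted(sentences, key=score, reverse=True)[:num_sentences])
    # Keep the selected sentences in their original order.
    return " ".join(s for s in sentences if s in top)

doc = ("Submodular functions model diminishing returns. "
       "They are used in summarization and in sensor placement. "
       "Greedy algorithms give good approximation guarantees for them.")
print(summarize(doc, num_sentences=2))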



Matroid Rank
In the mathematical theory of matroids, the rank of a matroid is the maximum size of an independent set in the matroid. The rank of a subset ''S'' of elements of the matroid is, similarly, the maximum size of an independent subset of ''S'', and the rank function of the matroid maps sets of elements to their ranks. The rank function is one of the fundamental concepts of matroid theory, via which matroids may be axiomatized. Matroid rank functions form an important subclass of the submodular set functions. The rank functions of matroids defined from certain other types of mathematical object, such as undirected graphs, matrices, and field extensions, are important within the study of those objects. Examples: In all examples, ''E'' is the base set of the matroid, and ''B'' is some subset of ''E''. * Let ''M'' be the free matroid, where the independent sets are all subsets of ''E''. Then the rank function of ''M'' is simply ''r''(''B'') = |''B''|. * Let ''M'' be a uniform matroid of rank ''k'', where the independent sets are the subsets of ''E'' with at most ''k'' elements. Then ''r''(''B'') = min(|''B''|, ''k'') ...
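For the matroid defined from an undirected graph (the graphic matroid mentioned above), the rank of an edge subset equals the size of a spanning forest of that subset. The following sketch (hypothetical graph, not from the article) computes this with a simple union-find:

def graphic_rank(vertices, edge_subset):
    """Rank of an edge set in the graphic matroid: size of a spanning forest."""
    parent = {v: v for v in vertices}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    rank = 0
    for u, w in edge_subset:
        ru, rw = find(u), find(w)
        if ru != rw:          # the edge joins two components, so it is independent
            parent[ru] = rw
            rank += 1
    return rank

V = {1, 2, 3, 4}
E = [(1, 2), (2, 3), (1, 3), (3, 4)]
print(graphic_rank(V, E))      # 3: a spanning tree on 4 vertices
print(graphic_rank(V, E[:3]))  # 2: the triangle 1-2-3 contains a dependent edge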


Subadditive Set Function
In mathematics, a subadditive set function is a set function whose value, informally, has the property that the value of the function on the union of two sets is at most the sum of the values of the function on each of the sets. This is thematically related to the subadditivity property of real-valued functions. Definition: Let \Omega be a set and f \colon 2^\Omega \rightarrow \mathbb{R} be a set function, where 2^\Omega denotes the power set of \Omega. The function ''f'' is ''subadditive'' if for every pair of subsets S and T of \Omega, we have f(S) + f(T) \geq f(S \cup T). Note that by substituting T = S into the defining inequality, it follows that f(S) \ge 0 for all S \subseteq \Omega. Examples of subadditive functions: Every non-negative submodular set function is subadditive (the family of non-negative submodular functions is strictly contained in the family of subadditive functions). The function that counts the number of sets required to cover a given set is subadditive. Let T_1, \dotsc, T_m \subseteq \Omega such that ...
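As a small sanity check of the claim that every non-negative submodular function is subadditive, the sketch below (hypothetical data) brute-forces the inequality f(S) + f(T) \geq f(S \cup T) for a coverage function:

from itertools import chain, combinations

groups = {"a": {1, 2}, "b": {2, 3}, "c": {3, 4, 5}}

def f(S):
    """Non-negative submodular coverage function."""
    return len(set().union(*(groups[g] for g in S)))

def subsets(universe):
    return [set(c) for c in chain.from_iterable(
        combinations(sorted(universe), r) for r in range(len(universe) + 1))]

universe = set(groups)
ok = all(f(S) + f(T) >= f(S | T)
         for S in subsets(universe) for T in subsets(universe))
print(ok)  # True: the coverage function is subadditive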


Set Function
In mathematics, especially measure theory, a set function is a function whose domain is a family of subsets of some given set and that (usually) takes its values in the extended real number line \R \cup \{\pm\infty\}, which consists of the real numbers \R and \pm \infty. A set function generally aims to ''measure'' subsets in some way. Measures are typical examples of "measuring" set functions. Therefore, the term "set function" is often used to avoid confusion between the mathematical meaning of "measure" and its common-language meaning. Definitions: If \mathcal{F} is a family of sets over \Omega (meaning that \mathcal{F} \subseteq \wp(\Omega), where \wp(\Omega) denotes the power set of \Omega), then a ''set function'' is a function \mu with domain \mathcal{F} and codomain [-\infty, \infty]; sometimes the codomain is instead some vector space, as with vector measures, complex measures, and projection-valued measures. The domain of a set function may have any number of properties; the commonly encountered properties and ...
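A minimal concrete sketch (hypothetical, not from the article): the counting measure \mu(A) = |A| is a typical "measuring" set function whose domain here is taken to be the whole power set of a small \Omega:

from itertools import chain, combinations

Omega = {"x", "y", "z"}

def mu(A):
    """Counting measure: assigns to each subset of Omega its cardinality."""
    assert A <= Omega, "the domain is a family of subsets of Omega"
    return len(A)

# The family of sets here is the full power set of Omega.
family = [frozenset(c) for c in chain.from_iterable(
    combinations(sorted(Omega), r) for r in range(len(Omega) + 1))]
for A in family:
    print(sorted(A), mu(A))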




Entropic Vector
The entropic vector or entropic function is a concept arising in information theory. It represents the possible values of Shannon's information entropy that subsets of one set of random variables may take. Understanding which vectors are entropic is a way to represent all possible inequalities between entropies of various subsets. For example, for any two random variables X_1, X_2, their joint entropy (the entropy of the random variable representing the pair X_1, X_2) is at most the sum of the entropies of X_1 and of X_2: H(X_1,X_2) \leq H(X_1) + H(X_2). Other information-theoretic measures such as conditional entropy, mutual information, or total correlation can be expressed in terms of joint entropy and are thus related by the corresponding inequalities. Many inequalities satisfied by entropic vectors can be derived as linear combinations of a few basic ones, called ''Shannon-type inequalities''. However, it has been proven that already for n = 4 variables, no finite set of linear inequalities is sufficient to characterize all entropic vectors ...
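The sketch below (a made-up joint distribution, not from the article) computes the entropic vector of two discrete random variables, i.e. the entropies of the subsets {X_1}, {X_2} and {X_1, X_2}, and verifies the subadditivity inequality quoted above:

from math import log2

# Hypothetical joint probabilities p(x1, x2).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    """Shannon entropy (in bits) of a probability dictionary."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, index):
    m = {}
    for outcome, p in joint.items():
        m[outcome[index]] = m.get(outcome[index], 0.0) + p
    return m

h1, h2, h12 = H(marginal(joint, 0)), H(marginal(joint, 1)), H(joint)
print(h1, h2, h12)     # entropies of the subsets {X1}, {X2}, {X1, X2}
print(h12 <= h1 + h2)  # True: joint entropy is subadditive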


Matroid
In combinatorics, a matroid is a structure that abstracts and generalizes the notion of linear independence in vector spaces. There are many equivalent ways to define a matroid axiomatically, the most significant being in terms of: independent sets; bases or circuits; rank functions; closure operators; and closed sets or ''flats''. In the language of partially ordered sets, a finite simple matroid is equivalent to a geometric lattice. Matroid theory borrows extensively from the terms used in both linear algebra and graph theory, largely because it is the abstraction of various notions of central importance in these fields. Matroids have found applications in geometry, topology, combinatorial optimization, network theory, and coding theory. Definition: There are many equivalent ways to define a (finite) matroid. Independent sets: In terms of independence, a finite matroid M is a pair (E, \mathcal{I}), where E is a finite set (called the ''ground set'') and \mathcal{I} is a family of subsets of E (called the ''independent sets'') ...
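The independent-set axioms are: the empty set is independent; every subset of an independent set is independent; and if A and B are independent with |A| < |B|, then A can be augmented by some element of B \setminus A. The sketch below (a hypothetical small example, not from the article) checks these axioms for the uniform matroid whose independent sets are all subsets of size at most 2:

from itertools import chain, combinations

E = {"a", "b", "c"}
# Candidate independent sets: all subsets of size at most 2 (uniform matroid U_{2,3}).
I = {frozenset(c) for c in chain.from_iterable(
    combinations(sorted(E), r) for r in range(3))}

def is_matroid(E, I):
    if frozenset() not in I:                                     # (I1) empty set
        return False
    for A in I:                                                  # (I2) downward closed
        if any(A - {x} not in I for x in A):
            return False
    for A in I:                                                  # (I3) augmentation
        for B in I:
            if len(A) < len(B) and not any(A | {x} in I for x in B - A):
                return False
    return True

print(is_matroid(E, I))   # True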


Budget-additive Valuation
In economics, a budget-additive valuation is a kind of utility function. It corresponds to a person that, when given a set of items, evaluates them in the following way: * For each item ''j'', there is a fixed value ''v_j''. * There is also a fixed budget ''B''. * The value of a set of items is the minimum of ''B'' and the sum of the values of the items in the set. Budget-additive valuations are useful in the research of online advertising, combinatorial auctions, resource allocation, and market equilibrium. Relation to other kinds of valuations: Every additive valuation is a special case of a budget-additive valuation, in which the budget is infinite. Every budget-additive valuation is a submodular valuation. See also: Utility functions on indivisible goods. Some branches of economics and game theory deal with indivisible goods, discrete items that can be traded only as a whole; for example, in combinatorial auctions there is a finite set of items, a ...
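A minimal sketch of the definition above, with made-up item values and budget:

item_values = {"tv": 300, "phone": 500, "laptop": 900}   # fixed values v_j
B = 1000                                                 # fixed budget

def budget_additive_value(bundle):
    """Value of a bundle: the minimum of the budget and the sum of item values."""
    return min(B, sum(item_values[j] for j in bundle))

print(budget_additive_value({"tv", "phone"}))            # 800 (sum is below the budget)
print(budget_additive_value({"tv", "phone", "laptop"}))  # 1000 (capped at the budget B)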





Mutual Information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of the entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable. Not limited to real-valued random variables and linear dependence like the correlation coefficient, MI is more general and determines how different the joint distribution of the pair (X,Y) is from the product of the marginal distributions of X and Y ...
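The sketch below (a made-up joint distribution, not from the article) computes MI in bits directly from the standard formula I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}, which compares the joint distribution with the product of the marginals:

from math import log2

# Hypothetical joint distribution p(x, y).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

mi = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items() if p > 0)
print(mi)   # positive, since the joint differs from the product of the marginals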



Graph (discrete Mathematics)
In discrete mathematics, particularly in graph theory, a graph is a structure consisting of a set of objects where some pairs of the objects are in some sense "related". The objects are represented by abstractions called ''vertices'' (also called ''nodes'' or ''points''), and each of the related pairs of vertices is called an ''edge'' (also called a ''link'' or ''line''). Typically, a graph is depicted in diagrammatic form as a set of dots or circles for the vertices, joined by lines or curves for the edges. The edges may be directed or undirected. For example, if the vertices represent people at a party, and there is an edge between two people if they shake hands, then this graph is undirected, because any person ''A'' can shake hands with a person ''B'' only if ''B'' also shakes hands with ''A''. In contrast, if an edge from a person ''A'' to a person ''B'' means that ''A'' owes money to ''B'', then this graph is directed, because owing money is not necessarily reciprocated ...
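The two examples above can be sketched as adjacency sets (hypothetical data): handshakes form an undirected graph whose edges are stored in both directions, while "owes money to" forms a directed graph whose edges keep only the stated direction:

handshakes = [("A", "B"), ("B", "C")]   # undirected: shaking hands is mutual
debts = [("A", "B"), ("C", "B")]        # directed: "A owes money to B"

undirected = {}
for u, v in handshakes:
    undirected.setdefault(u, set()).add(v)
    undirected.setdefault(v, set()).add(u)   # store both directions

directed = {}
for u, v in debts:
    directed.setdefault(u, set()).add(v)     # store only the stated direction

print(undirected)   # {'A': {'B'}, 'B': {'A', 'C'}, 'C': {'B'}}
print(directed)     # {'A': {'B'}, 'C': {'B'}}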