Bidimensionality
Bidimensionality theory characterizes a broad range of graph problems (the bidimensional problems) that admit efficient approximation algorithms, fixed-parameter algorithms, or kernelizations on a broad range of graph classes. These graph classes include planar graphs, map graphs, bounded-genus graphs, and graphs excluding any fixed minor. In particular, bidimensionality theory builds on the graph minor theory of Robertson and Seymour by extending its mathematical results and building new algorithmic tools. The theory was introduced in the work of Demaine, Fomin, Hajiaghayi, and Thilikos, for which the authors received the Nerode Prize in 2015.

Definition

A parameterized problem \Pi is a subset of \Gamma^* \times \mathbb{N} for some finite alphabet \Gamma. An instance of a parameterized problem consists of ''(x,k)'', where ''k'' is called the parameter. A parameterized problem \Pi is ''minor-bidimensional'' if:
# For any pair of graphs H, G such that H is a minor of G, and any integer k, (G,k) \in \Pi implies that (H,k) \in \Pi. ...
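In the standard formulation (a sketch; the constant \delta and the grid notation R_r follow the usual presentation and are not fixed by the truncated text above), the definition continues with a second, grid condition:

\Pi is minor-bidimensional if
\begin{enumerate}
  \item $(G,k) \in \Pi$ and $H$ a minor of $G$ together imply $(H,k) \in \Pi$; and
  \item there is a constant $\delta > 0$ such that, for the $r \times r$ grid $R_r$,
        $(R_r,k) \in \Pi$ implies $k \ge \delta r^2$, i.e., the parameter is
        $\Omega(r^2)$ on grids.
\end{enumerate}

Vertex cover fits this template: deleting or contracting edges cannot increase the minimum vertex cover, and covering the $r \times r$ grid requires $\Omega(r^2)$ vertices.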
Treewidth
In graph theory, the treewidth of an undirected graph is an integer that specifies, informally, how far the graph is from being a tree. The smallest treewidth is 1; the graphs with treewidth 1 are exactly the trees and the forests. The series–parallel graphs are an example of graphs with treewidth at most 2. The maximal graphs with treewidth exactly ''k'' are called ''k''-trees, and the graphs with treewidth at most ''k'' are called partial ''k''-trees. Many other well-studied graph families also have bounded treewidth. Treewidth may be formally defined in several equivalent ways: in terms of the size of the largest vertex set in a tree decomposition of the graph, in terms of the size of the largest clique in a chordal completion ...
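As a quick illustration, a small Python sketch assuming the networkx library; its treewidth_min_degree heuristic is only guaranteed to return an upper bound on the true treewidth, though the bound is tight on these small examples:

import networkx as nx
from networkx.algorithms.approximation import treewidth_min_degree

# Compare the heuristic's width bound on a tree, a cycle, and a clique.
for name, G in [("path (a tree)", nx.path_graph(10)),
                ("6-cycle", nx.cycle_graph(6)),
                ("K5", nx.complete_graph(5))]:
    width, decomposition = treewidth_min_degree(G)
    print(name, width)  # expect 1, 2, and 4 (K_n has treewidth n - 1)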
Halin's Grid Theorem
In graph theory, a branch of mathematics, Halin's grid theorem states that the infinite graphs with thick ends are exactly the graphs containing subdivisions of the hexagonal tiling of the plane. It was published by Halin (1965), and is a precursor to the work of Robertson and Seymour linking treewidth to large grid minors, which became an important component of the algorithmic theory of bidimensionality.

Definitions and statement

A ray, in an infinite graph, is a semi-infinite path: a connected infinite subgraph in which one vertex has degree one and the rest have degree two. Halin (1964) defined two rays r_0 and r_1 to be equivalent if there exists a ray r_2 that includes infinitely many vertices from each of them. This is an equivalence relation, and its equivalence classes (sets of mutually equivalent rays) are called the ends of the graph. Halin defined a thick end of a graph to be an end that contains infinitely many rays that, despite being equivalent, are pairwise disjoint from each other. ...
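In compact notation (a sketch; the symbol \sim is mine, not the article's):

r_0 \sim r_1 \iff \text{there is a ray } r_2 \text{ with } |V(r_2) \cap V(r_0)| = |V(r_2) \cap V(r_1)| = \infty.

The ends of a graph G are the equivalence classes of its rays under \sim; an end is thick if it contains infinitely many pairwise vertex-disjoint rays; and the theorem states that G has a thick end if and only if G contains a subdivision of the hexagonal tiling of the plane.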
Mohammad Hajiaghayi
Mohammad Taghi Hajiaghayi is a computer scientist known for his work in algorithms, game theory, social networks, network design, graph theory, and big data. He has over 200 publications with over 185 collaborators and 10 issued patents. He is the Jack and Rita G. Minker Professor at the University of Maryland Department of Computer Science.

Professional career

Hajiaghayi received his PhD in applied mathematics and computer science from the Massachusetts Institute of Technology in 2005, advised by Erik Demaine and F. Thomson Leighton. His thesis, ''The Bidimensionality Theory and Its Algorithmic Applications'', founded the theory of bidimensionality, which later received the Nerode Prize and was the topic of workshops. Hajiaghayi has been the coach of the University of Maryland ACM International Collegiate Programming Contest team in the World Finals.

Honors and awards

Hajiaghayi has received the National Science Foundation CAREER Award (2010) and the Office of Naval Research Young Investigator ...
Apex Graph
In graph theory, a branch of mathematics, an apex graph is a graph that can be made planar by the removal of a single vertex. The deleted vertex is called an apex of the graph. It is ''an'' apex, not ''the'' apex, because an apex graph may have more than one apex; for example, in the minimal nonplanar graphs K_5 or K_{3,3}, every vertex is an apex. The apex graphs include the graphs that are themselves planar, in which case again every vertex is an apex. The null graph is also counted as an apex graph even though it has no vertex to remove. Apex graphs are closed under the operation of taking minors and play a role in several other aspects of graph minor theory: linkless embedding, Hadwiger's conjecture, YΔY-reducible graphs, and relations between treewidth and graph diameter.

Characterization and recognition

Apex graphs are closed under the operation of taking minors: contracting any edge, or removing any edge or vertex, leads to another apex graph. For, if G is an apex graph with apex ...
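A minimal Python sketch of this definition, assuming networkx for the planarity test (it simply performs one planarity check per candidate apex, not the more sophisticated recognition algorithms from the literature):

import networkx as nx

def is_apex(G: nx.Graph) -> bool:
    # A planar graph is itself an apex graph (every vertex is an apex),
    # and check_planarity accepts the null graph as planar.
    if nx.check_planarity(G)[0]:
        return True
    # Otherwise, try hiding each single vertex in turn.
    return any(nx.check_planarity(nx.restricted_view(G, [v], []))[0]
               for v in G.nodes)

print(is_apex(nx.complete_graph(5)))  # True: K5 minus any vertex is planar
print(is_apex(nx.complete_graph(6)))  # False: K6 minus any vertex is K5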
Fedor Fomin
Fedor V. Fomin (born March 16, 1968) is a professor of computer science at the University of Bergen. He is known for his work in algorithms and graph theory. He received his PhD in 1997 at St. Petersburg State University under Nikolai Nikolaevich Petrov.

Books

Fomin is the co-author of three books:
* ''Exact Exponential Algorithms'' (with Dieter Kratsch)
* ''Parameterized Algorithms'' (with Marek Cygan, Łukasz Kowalik, Daniel Lokshtanov, Dániel Marx, Marcin Pilipczuk, Michał Pilipczuk, and Saket Saurabh)
* ''Kernelization: Theory of Parameterized Preprocessing'' (with Daniel Lokshtanov, Saket Saurabh, and Meirav Zehavi)

Awards and honours

With his co-authors Erik Demaine, Mohammad Hajiaghayi, and Dimitrios Thilikos, he received the 2015 European Association for Theoretical Computer Science Nerode Prize for his work on bidimensionality. Together with Fabrizio Grandoni and Dieter Kratsch, he received the 2017 Nerode Prize for his work on Measure & Conquer. Fomin won the Nerode Prize a third time in 2024 for the paper "(Meta)Kernelization", coauthored with Hans L. Bodlaender, Daniel Lokshtanov, Eelko Penninkx, Saket Saurabh, and Dimitrios M. Thilikos. In 2019, Fomin was named an EATCS Fellow for "his fundamental contributions in the fields of parametrized complexity and exponential algorithms" ...
Nerode Prize
The EATCS–IPEC Nerode Prize is a theoretical computer science prize awarded for outstanding research in the area of multivariate algorithmics (parameterized complexity). It is awarded by the European Association for Theoretical Computer Science and the International Symposium on Parameterized and Exact Computation. The prize was offered for the first time in 2013.

Winners

The prize winners so far have been:
*2013: Chris Calabro, Russell Impagliazzo, Valentine Kabanets, Ramamohan Paturi, and Francis Zane, for their research formulating the exponential time hypothesis and using it to determine the exact parameterized complexity of several important variants of the Boolean satisfiability problem.
*2014: Hans L. Bodlaender, Rodney G. Downey, Michael R. Fellows, Danny Hermelin, Lance Fortnow, and Rahul Santhanam, for their work on kernelization, proving that several problems with fixed-parameter tractable algorithms do not have polynomial-size kernels unless ...
Erik Demaine
Erik D. Demaine (born February 28, 1981) is a Canadian-American professor of computer science at the Massachusetts Institute of Technology and a former child prodigy.

Early life and education

Demaine was born in Halifax, Nova Scotia, to mathematician and sculptor Martin L. Demaine and Judy Anderson. From the age of 7, he was identified as a child prodigy and spent time traveling across North America with his father. He was home-schooled during those years until entering university at the age of 12. Demaine completed his bachelor's degree at 14 years of age at Dalhousie University in Canada, and completed his PhD at the University of Waterloo by the time he was 20 years old. Demaine's PhD dissertation, a work in the field of computational origami, was completed at the University of Waterloo under the supervision of Anna Lubiw and Ian Munro. This work was awarded the Canadian Governor General's Gold Medal from the University of Waterloo and the NSERC Doctoral Prize (200 ...
Combinatorica
''Combinatorica'' is an international journal of mathematics, publishing papers in the fields of combinatorics and computer science. It started in 1981, with László Babai and László Lovász as the editors-in-chief and Paul Erdős as honorary editor-in-chief. The current editors-in-chief are Imre Bárány and József Solymosi. The advisory board consists of Ronald Graham, Gyula O. H. Katona, Miklós Simonovits, Vera Sós, and Endre Szemerédi. It is published by the János Bolyai Mathematical Society and Springer Verlag. The following members of the ''Hungarian School of Combinatorics'' have strongly contributed to the journal as authors or have served as editors: Miklós Ajtai, László Babai, József Beck, ...
Analysis Of Algorithms
In computer science, the analysis of algorithms is the process of finding the computational complexity of algorithms: the amount of time, storage, or other resources needed to execute them. Usually, this involves determining a function that relates the size of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity). An algorithm is said to be efficient when this function's values are small, or grow slowly compared to a growth in the size of the input. Different inputs of the same size may cause the algorithm to have different behavior, so best-, worst-, and average-case descriptions might all be of practical interest. When not otherwise specified, the function describing the performance of an algorithm is usually an upper bound, determined from the worst-case inputs to the algorithm. The term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important part of a broader ...
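To make the best-/worst-case distinction concrete, a short Python sketch that counts comparisons in a linear search (the function and names are illustrative):

def linear_search_steps(items, target):
    """Return (index or None, number of comparisons performed)."""
    steps = 0
    for i, x in enumerate(items):
        steps += 1
        if x == target:
            return i, steps
    return None, steps

data = list(range(1000))
print(linear_search_steps(data, 0))    # best case: 1 comparison
print(linear_search_steps(data, 999))  # worst case: 1000 comparisons
print(linear_search_steps(data, -1))   # unsuccessful search: also 1000
# Worst-case analysis reports the upper bound: at most n comparisons, O(n).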
SIAM Journal On Discrete Mathematics
''SIAM Journal on Discrete Mathematics'' is a peer-reviewed mathematics journal published quarterly by the Society for Industrial and Applied Mathematics (SIAM). The journal includes articles on pure and applied discrete mathematics. It was established in 1988, along with the ''SIAM Journal on Matrix Analysis and Applications'', to replace the ''SIAM Journal on Algebraic and Discrete Methods''. The journal is indexed by ''Mathematical Reviews'' and Zentralblatt MATH. Its 2009 MCQ was 0.57. According to the ''Journal Citation Reports'', the journal has a 2016 impact factor of 0.755. Although its official ISO abbreviation is ''SIAM J. Discrete Math.'', its publisher and contributors frequently use the shorter abbreviation ''SIDMA''.
Kernelization
In computer science, a kernelization is a technique for designing efficient algorithms that achieve their efficiency by a preprocessing stage in which inputs to the algorithm are replaced by a smaller input, called a "kernel". The result of solving the problem on the kernel should either be the same as on the original input, or it should be easy to transform the output on the kernel to the desired output for the original problem. Kernelization is often achieved by applying a set of reduction rules that cut away parts of the instance that are easy to handle. In parameterized complexity theory, it is often possible to prove that a kernel with guaranteed bounds on its size (as a function of some parameter associated with the problem) can be found in polynomial time. When this is possible, it results in a fixed-parameter tractable algorithm whose running time is the sum of the (polynomial time) kernelization step and the (non-polynomial but bounded by the parameter) time to ...
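As a concrete illustration, a hedged Python sketch of a classic kernelization, Buss's rule for Vertex Cover parameterized by solution size k (a standard textbook example, not tied to any particular result cited above); it yields a kernel with at most k^2 edges:

def vertex_cover_kernel(edges, k):
    """Apply Buss's rule: return (kernel_edges, reduced_k, forced_vertices),
    or None if (G, k) is recognizably a no-instance."""
    edges = {frozenset(e) for e in edges}
    forced = set()
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        # A vertex of degree > k must belong to every vertex cover of size <= k.
        high = next((v for v, d in degree.items() if d > k), None)
        if high is not None:
            forced.add(high)
            edges = {e for e in edges if high not in e}
            k -= 1
            changed = True
    if k < 0:
        return None
    # After reduction every vertex has degree <= k, so a yes-instance
    # has at most k * k edges; anything larger is a no-instance.
    if len(edges) > k * k:
        return None
    return edges, k, forced

print(vertex_cover_kernel([(0, i) for i in range(1, 6)], 2))
# Star K_{1,5} with k = 2: the center is forced -> (set(), 1, {0})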
Induced Matching
In graph theory, an induced matching or strong matching is a subset of the edges of an undirected graph that do not share any vertices (it is a matching) and that are the only edges connecting any two vertices that are endpoints of the matching edges (it is an induced subgraph). An induced matching can also be described as an independent set in the square of the line graph of the given graph.

Strong coloring and neighborhoods

The minimum number of induced matchings into which the edges of a graph can be partitioned is called its ''strong chromatic index'', by analogy with the chromatic index of the graph, the minimum number of matchings into which its edges can be partitioned. It equals the chromatic number of the square of the line graph. Brooks' theorem, applied to the square of the line graph, shows that the strong chromatic index is at most quadratic in the maximum degree of the given graph, but better ...
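Following the line-graph characterization just given, a small Python sketch (assuming networkx; nx.line_graph labels each node with an edge of G, and nx.power computes the square):

import networkx as nx

def is_induced_matching(G: nx.Graph, M) -> bool:
    """Check that the edge set M is an induced matching in G, i.e., an
    independent set in the square of the line graph of G."""
    L2 = nx.power(nx.line_graph(G), 2)  # square of the line graph
    wanted = {frozenset(e) for e in M}
    chosen = [n for n in L2.nodes if frozenset(n) in wanted]
    assert len(chosen) == len(wanted), "M must consist of edges of G"
    return not any(L2.has_edge(a, b)
                   for i, a in enumerate(chosen) for b in chosen[i + 1:])

P6 = nx.path_graph(6)  # path 0-1-2-3-4-5
print(is_induced_matching(P6, [(0, 1), (3, 4)]))  # True
print(is_induced_matching(P6, [(0, 1), (2, 3)]))  # False: edge 1-2 joins them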