The Algorithm Design Manual. Springer-Verlag, 1998.
Transitive Closure and Reduction

Input description: A directed graph G=(V,E).

Problem description: For transitive closure, construct a graph G'=(V,E') with edge (i,j) in E' iff there is a directed path from i to j in G. For transitive reduction, construct a graph G'=(V,E') with as few edges as possible such that there is a directed path from i to j in G' iff there is a directed path from i to j in G.

Discussion: Transitive closure can be thought of as establishing a data structure that makes it possible to answer reachability questions (can I get to x from y?) efficiently. After the preprocessing of constructing the transitive closure, all reachability queries can be answered in constant time by simply reporting a matrix entry.

Transitive closure is fundamental in propagating the consequences of modified attributes of a graph G. For example, consider the graph underlying any spreadsheet model, where the vertices are cells and there is an edge from cell i to cell j if the result of cell j depends on cell i. When the value of a given cell is modified, the values of all reachable cells must also be updated. The identity of these cells is revealed by the transitive closure of G. Many database problems reduce to computing transitive closures, for analogous reasons.

There are three basic algorithms for computing transitive closure:
● The simplest algorithm just performs a breadth-first or depth-first search from each vertex and keeps track of all vertices encountered. Doing n such traversals gives an O(n(n+m)) algorithm, which degenerates to cubic time if the graph is dense. This algorithm is easily implemented, runs well on sparse graphs, and is likely the right answer for your application.

● If the transitive closure of G will be dense, a better algorithm exploits the fact that the strongly connected components of G can be computed in linear time (see Section ). All pairs of vertices in each strongly connected component are mutually reachable. Further, if (x,y) is an edge between two vertices in different strongly connected components, every vertex in y's component is reachable from each vertex in x's component. Thus the problem reduces to finding the transitive closure on a graph of strongly connected components, which should have considerably fewer edges and vertices than G.

● Warshall's algorithm constructs transitive closures in O(n^3) with a simple, slick algorithm that is identical to Floyd's all-pairs shortest-path algorithm of Section . If we are interested only in the transitive closure, and not the length of the resulting paths, we can reduce storage by retaining only one bit for each matrix element. Thus, after the kth iteration, M[i][j] = 1 iff j is reachable from i using only vertices 1,...,k as intermediates.

Another related algorithm, discussed in the references, runs in the same time as matrix multiplication. You might conceivably win for large n by using Strassen's fast matrix multiplication algorithm, although I for one wouldn't bother trying. Since transitive closure is provably as hard as matrix multiplication, there is little hope for a significantly faster algorithm.

Transitive reduction (also known as minimum equivalent digraph) is essentially the inverse operation of transitive closure, namely reducing the number of edges while maintaining identical reachability properties.
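The bit-matrix version of Warshall's algorithm described above can be sketched as follows. This is a minimal illustrative sketch, not the book's own code; the function name and the boolean-matrix representation are my choices:

```python
def transitive_closure(adj):
    """Warshall's algorithm on a boolean adjacency matrix.

    adj: n x n list of lists; adj[i][j] is True iff edge (i, j) exists.
    Returns an n x n boolean matrix reach with reach[i][j] True iff
    there is a directed path from i to j.
    """
    n = len(adj)
    reach = [row[:] for row in adj]  # copy so the input is untouched
    for i in range(n):
        reach[i][i] = True  # every vertex trivially reaches itself
    # After iteration k, reach[i][j] is True iff j is reachable from i
    # using only vertices 0..k as intermediates.
    for k in range(n):
        for i in range(n):
            if reach[i][k]:  # skip whole row when i cannot reach k
                for j in range(n):
                    if reach[k][j]:
                        reach[i][j] = True
    return reach
```

For a path 0 -> 1 -> 2, the result has reach[0][2] set even though no edge (0,2) exists; retaining one bit per entry rather than a path length is exactly the storage saving mentioned above.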
The transitive closure of G is identical to the transitive closure of the transitive reduction of G. The primary application of transitive reduction is space minimization, by eliminating redundant edges from G that do not affect reachability. Transitive reduction also arises in graph drawing, where it is important to eliminate as many unnecessary edges as possible in order to reduce the visual clutter.

Although the transitive closure of G is uniquely defined, a graph may have many different transitive reductions, including G itself. We want the smallest such reduction, but there can be multiple formulations of the problem:

● A linear-time, quick-and-dirty transitive reduction algorithm identifies the strongly connected components of G, replaces each by a simple directed cycle, and adds these edges to those bridging the different components. Although this reduction is not provably minimal, it is likely to be pretty close on typical graphs. One catch with this heuristic is that it can add edges to the transitive reduction of G that are not in G. This may or may not be a problem for your application.

● If, in fact, all edges of the transitive reduction of G must be in G, we must abandon hope of
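The quick-and-dirty reduction heuristic in the first bullet above might be sketched as follows. This is a rough sketch under stated assumptions: it uses Kosaraju's algorithm for the strongly connected components, it keeps one representative edge per pair of adjacent components (a simplification of "the edges bridging the different components"), and all function names are my own:

```python
from collections import defaultdict

def kosaraju_sccs(n, edges):
    """Return (components, comp) where components is a list of vertex
    lists and comp[v] is the index of v's strongly connected component."""
    graph, rgraph = defaultdict(list), defaultdict(list)
    for u, v in edges:
        graph[u].append(v)
        rgraph[v].append(u)
    visited, order = [False] * n, []
    def dfs1(u):  # record vertices in order of DFS completion
        visited[u] = True
        for v in graph[u]:
            if not visited[v]:
                dfs1(v)
        order.append(u)
    for u in range(n):
        if not visited[u]:
            dfs1(u)
    comp = [-1] * n
    def dfs2(u, c):  # sweep the reversed graph in reverse finish order
        comp[u] = c
        for v in rgraph[u]:
            if comp[v] == -1:
                dfs2(v, c)
    c = 0
    for u in reversed(order):
        if comp[u] == -1:
            dfs2(u, c)
            c += 1
    components = [[] for _ in range(c)]
    for u in range(n):
        components[comp[u]].append(u)
    return components, comp

def quick_transitive_reduction(n, edges):
    """Heuristic reduction: a simple directed cycle through each
    nontrivial component, plus one bridging edge per component pair.
    Note: the cycle edges need not be edges of the original graph."""
    components, comp = kosaraju_sccs(n, edges)
    reduced = set()
    for members in components:
        if len(members) > 1:  # thread one cycle through the component
            for a, b in zip(members, members[1:] + members[:1]):
                reduced.add((a, b))
    seen_pairs = set()
    for u, v in edges:  # keep one edge between each adjacent pair
        pair = (comp[u], comp[v])
        if comp[u] != comp[v] and pair not in seen_pairs:
            seen_pairs.add(pair)
            reduced.add((u, v))
    return reduced
```

On a graph whose edges already form a cycle 0 -> 1 -> 2 -> 0 plus a bridge 2 -> 3, the heuristic returns exactly those four edges; on denser components it may substitute cycle edges that were not in G, which is the catch noted above.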