2.5. Weighted graphs

An essential generalization of the notion of a graph, going beyond topology, enables the application of graph theory to every aspect of cellular networks. One may ascribe different vertex and edge weights, wi and wij, to match essential parameters of network species and their interactions. Vertex weights might characterize the expression level of network species, as measured by mass spectrometry, microarrays, HPLC, 2-D gel electrophoresis, and other methods. The edge weights in metabolic networks might characterize enzyme expression. An edge weight in networks built of protein complexes denotes the number of proteins the two complexes share. Other applications of weighted graphs exist or might be anticipated.

An edge or vertex weight could be any nonnegative real number. (Weights having both positive and negative values have to be renormalized in order to enable the use of Eqs. (2.9)-(2.12).) Weights can also be integers, as is the case with multigraphs, in which more than one edge connects some pairs of vertices. Another example is molecular networks, in which the different chemical nature of the atoms is sometimes labeled with vertex weights showing the number of their valence electrons. The weighted adjacency matrix, WA(G), has the edge weights wij as nondiagonal elements, and the vertex weights wii as diagonal elements. All graph invariants derived from the adjacency matrix of a directed or nondirected simple graph can be redefined for a weighted graph. Included here are the weighted vertex degree, wi, and the corresponding weighted vertex degree distribution, {wmax, ..., wmin}, the weighted adjacency, WA(G), the average weighted vertex degree, <wi>, the weighted connectedness, WConn, the weighted cluster coefficient, wci, and the weighted extended connectivity of order k, kWEC:

wi = Σ_j wij ;    WA(G) = Σ_i Σ_j wij = Σ_i wi    (2.9a,b)
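Eq. (2.9a,b) can be sketched in plain Python. The matrix W below is an invented three-vertex example, not taken from the text; following the convention above, diagonal entries hold vertex weights and are excluded from the degree sums:

```python
# Sketch of Eq. (2.9a,b): weighted vertex degrees and total weighted adjacency.
# Off-diagonal entries W[i][j] are edge weights; diagonal entries are vertex
# weights (assumed convention: they are not counted in the degree sums).

W = [
    [1.0, 0.5, 2.0],
    [0.5, 2.0, 0.0],
    [2.0, 0.0, 1.5],
]

def weighted_degrees(w):
    """wi = sum over j != i of wij  (Eq. 2.9a)."""
    n = len(w)
    return [sum(w[i][j] for j in range(n) if j != i) for i in range(n)]

def total_weighted_adjacency(w):
    """WA(G) = sum_i sum_j wij = sum_i wi  (Eq. 2.9b)."""
    return sum(weighted_degrees(w))

print(weighted_degrees(W))          # [2.5, 0.5, 2.0]
print(total_weighted_adjacency(W))  # 5.0
```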


3. How to Measure Network Complexity

3.1. Careful with symmetry!

There is a long-standing controversy in the literature over whether the complexity of a structure increases with its connectivity, or rather passes through a maximum and goes down to zero for complete graphs. This is illustrated in Figure 5.5 with an example taken from Gell-Mann's book [44] "The Quark and the Jaguar". The example includes two graphs with eight vertices; the first is a totally disconnected graph, whereas the second is a totally connected (complete) graph. It is argued that the two graphs are equally complex. The arguments in favor of this conclusion are based on the binomial distribution of vertex degrees in random graphs (Figure 5.6). Additional arguments in favor of such views come from Shannon's information theory [1]. According to it, the entropy of information H(α) in describing a message of N symbols, distributed according to some equivalence criterion α into k groups of N1, N2, ..., Nk symbols, is calculated according to the formula:

H(α) = − Σ_{i=1}^{k} pi log2 pi = − Σ_{i=1}^{k} (Ni/N) log2 (Ni/N)    bits/symbol    (3.1)

where the ratio Ni/N = pi defines the probability of occurrence of the symbols of the ith group.
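As a brief sketch, Eq. (3.1) can be computed directly from the group sizes Ni; the partition of eight symbols into groups of 4, 2, and 2 is an arbitrary illustration, not an example from the text:

```python
from math import log2

def shannon_entropy(group_sizes):
    """H(alpha) = -sum pi log2 pi, with pi = Ni/N  (Eq. 3.1), in bits/symbol."""
    n = sum(group_sizes)
    return sum(-(ni / n) * log2(ni / n) for ni in group_sizes if ni > 0)

# Eight symbols split into groups of 4, 2, 2: pi = 0.5, 0.25, 0.25
print(shannon_entropy([4, 2, 2]))  # 1.5 bits/symbol
```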

In using Eq. (3.1) to characterize networks or graphs, it is the vertices that most frequently play the role of symbols or system elements. When the criterion of equivalence α is based on the orbits of the automorphism group of the graph, all vertices of the totally disconnected graph belong to a single orbit, and the same is true for the vertices in the complete graph. Eq. (3.1) then shows that the information index I(α) = 0 for both graphs. The same result is obtained when the partitioning of the graph vertices into groups is based on the equality of their vertex degrees, all of which are zero in the totally disconnected graph, and all of which are N − 1 in the complete graph.

Figure 5.5. Which graph is more complex: the totally disconnected graph a or the complete graph b?
Figure 5.6. The binomial distribution of vertex degrees in random graphs is used as an argument that the complexity of graphs passes through a maximum with the increase in connectivity.
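The zero result for both extreme graphs can be verified numerically. This sketch partitions vertices into groups of equal degree and applies Eq. (3.1); V = 8 matches the Gell-Mann example:

```python
from math import log2
from collections import Counter

def degree_partition_entropy(degrees):
    """Partition vertices by equal vertex degree and apply Eq. (3.1)."""
    n = len(degrees)
    groups = Counter(degrees).values()  # group sizes Ni
    return sum(-(g / n) * log2(g / n) for g in groups)

V = 8
disconnected = [0] * V       # all degrees 0     -> one group of 8
complete = [V - 1] * V       # all degrees V - 1 -> one group of 8
print(degree_partition_entropy(disconnected))  # 0.0
print(degree_partition_entropy(complete))      # 0.0
```

A single group gives p1 = 1 and hence zero entropy for both graphs, which is exactly the counterintuitive result discussed above.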

The logic of the above arguments seems flawless. Yet, our intuition tells us that the complete graph is more complex than the totally disconnected graph. There is a hidden weak point in the manner in which Shannon's theory is applied, namely, how symmetry is used to partition the vertices into groups. One should take into account that symmetry is a simplifying factor, not a complexifying one. A measure of structural or topological complexity must not be based on symmetry. The use of symmetry is justified only in defining compositional complexity, which is based on the equivalence and diversity of the elements of the system studied.

3.2. Can Shannon's information content measure topological complexity?

A different approach to characterizing structures by Shannon's theory was proposed in 1977 by Bonchev and Trinajstic in a study on molecular branching as a basic topological feature of molecules [15]. The approach was later generalized by constructing a finite probability scheme for a graph [16]. Let the graph be represented by some kind of elements (vertices, edges, distances, cliques, etc.); let us also assign a certain weight (value, magnitude) wi to each of the N elements. Define the probability for a randomly chosen element i to have the weight wi as pi = wi/Σwi, with Σwi = W, and Σpi = 1. The probability scheme thus constructed

Element:      1, 2, ..., N
Weight:       w1, w2, ..., wN
Probability:  p1, p2, ..., pN

enables defining a series of information indices, I(w), with Shannon's Eq. (3.1).

Considering the simplest graph elements, the vertices, and assuming the weights assigned to each vertex to be the corresponding vertex degrees, one easily distinguishes the null complexity of the totally disconnected graph from the high complexity of the complete graph. The probability for a randomly chosen vertex i in the complete graph of V vertices to have a certain degree ai is pi = ai/A = 1/V, wherefrom Eq. (3.1) yields for the Shannon entropy of the vertex degree distribution the nonzero value of log2 V.
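A short sketch of this magnitude-based variant: weighting each vertex by its degree, pi = ai/A, yields log2 V = 3 bits for the complete graph on V = 8 vertices, while the totally disconnected graph is assigned null complexity (an assumed convention here for the degenerate all-zero weight set):

```python
from math import log2

def vertex_degree_entropy(degrees):
    """Magnitude-based scheme: each vertex weighted by its degree, pi = ai/A."""
    total = sum(degrees)                  # A = sum of vertex degrees
    if total == 0:
        return 0.0                        # totally disconnected graph: null complexity
    return sum(-(a / total) * log2(a / total) for a in degrees if a > 0)

V = 8
print(vertex_degree_entropy([0] * V))      # 0.0
print(vertex_degree_entropy([V - 1] * V))  # 3.0 == log2(8)
```

Unlike the symmetry-based partition, the two extreme graphs now receive clearly different complexity values.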

Our preceding studies [17, 45-47] have shown that a better complexity measure of graphs and networks is the vertex degree magnitude-based information content, Ivd. Shannon defines information as the reduced entropy of the system relative to the maximum entropy that can exist in a system with the same number of elements:

I = Hmax − H

The Shannon entropy of a graph with a total weight W and vertex weights wi is given by a formula derived from Eq. (3.1) by substituting pi = wi/W:

H(W) = − Σ_{i=1}^{V} (wi/W) log2 (wi/W)
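These two definitions can be illustrated with a short sketch; the weight set [1, 1, 2, 4] is an arbitrary assumption for the example, with Hmax taken as log2 N for N equiprobable elements:

```python
from math import log2

def weighted_entropy(weights):
    """H = -sum (wi/W) log2 (wi/W), with W the total weight (from Eq. 3.1)."""
    W = sum(weights)
    return sum(-(w / W) * log2(w / W) for w in weights if w > 0)

def information(weights):
    """Shannon information as reduced entropy: I = Hmax - H, Hmax = log2 N."""
    return log2(len(weights)) - weighted_entropy(weights)

w = [1, 1, 2, 4]                 # assumed vertex weights, W = 8
print(weighted_entropy(w))       # 1.75 bits
print(information(w))            # 0.25 bits
```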
