Entropy Centrality

Definition
Entropy centrality measures the centrality of a node by its contribution to the entropy of the graph.
Entropy is a fundamental concept in information theory. Shannon's [SHANNON, C. E. 1948] entropy of a random variable X that can take n values is:

H(X) = -∑_{i=1}^{n} p(x_{i}) log_{2} p(x_{i})

The connectivity entropy H_{co} and the centrality entropy H_{ce} of a graph G are defined as:

H_{co}(G) = -∑_{i=1}^{M} χ(v_{i}) log_{2} χ(v_{i})
H_{ce}(G) = -∑_{i=1}^{M} ϒ(v_{i}) log_{2} ϒ(v_{i})

where the connectivity of a node v_{i} ∈ V is χ(v_{i}) = deg(v_{i}) / 2N, with deg(v_{i}) the number of edges incident to node v_{i} and N the total number of edges in the graph, and ϒ(v_{i}) = paths(v_{i}) / paths(v_{1}, v_{2}, ..., v_{M}), where paths(v_{i}) is the number of geodesic paths from node v_{i} to all the other nodes in the graph and paths(v_{1}, v_{2}, ..., v_{M}) is the total number of geodesic paths that exist across all M nodes in the graph.

The connectivity entropy measure provides information about the degree of connectivity of a node in the graph, while the centrality entropy provides information about its degree of centrality. Nodes whose removal splits the graph in two, or substantially reduces the number of paths available to reach other nodes, have the greatest impact in decreasing the total centrality entropy of the graph. See also: Redefined Entropy Centrality.

Algorithm
1: Calculate initial total entropy H_{co0}(G) and H_{ce0}(G)
2: for all nodes v_{i} ∈ graph G do
3:   Remove node v_{i}, creating a modified graph G'
4:   Recalculate H_{coi}(G') and H_{cei}(G'), store these results
5:   Restore original graph G
6: end for

Computational complexity
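The two entropy measures and the node-removal loop can be sketched in Python. This is a minimal illustration, not the authors' implementation; the graph representation (an adjacency dict), the function names, and the example graph are assumptions introduced here for clarity.

```python
import math
from collections import deque

def geodesic_path_count(adj, src):
    """Count the geodesic (shortest) paths from src to every other reachable node, via BFS."""
    dist, count = {src: 0}, {src: 1}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:                 # first visit: shortest distance found
                dist[w] = dist[u] + 1
                count[w] = count[u]
                queue.append(w)
            elif dist[w] == dist[u] + 1:      # another geodesic path reaching w
                count[w] += count[u]
    return sum(c for v, c in count.items() if v != src)

def connectivity_entropy(adj):
    """H_co(G) = -sum chi(v) log2 chi(v), with chi(v) = deg(v) / 2N."""
    two_n = sum(len(nbrs) for nbrs in adj.values())  # every edge counted twice = 2N
    return -sum((len(adj[v]) / two_n) * math.log2(len(adj[v]) / two_n)
                for v in adj if adj[v])

def centrality_entropy(adj):
    """H_ce(G) = -sum gamma(v) log2 gamma(v), with gamma(v) = paths(v) / total paths."""
    paths = {v: geodesic_path_count(adj, v) for v in adj}
    total = sum(paths.values())
    return -sum((p / total) * math.log2(p / total) for p in paths.values() if p)

def centrality_entropy_impact(adj):
    """Steps 1-6 of the algorithm: remove each node in turn, record the entropy drop."""
    h0 = centrality_entropy(adj)                       # step 1: initial total entropy
    impact = {}
    for v in adj:                                      # step 2: for all nodes
        g_prime = {u: [w for w in nbrs if w != v]      # step 3: modified graph G'
                   for u, nbrs in adj.items() if u != v}
        impact[v] = h0 - centrality_entropy(g_prime)   # step 4: recalculate and store
    return impact                                      # step 5 is implicit: adj is untouched

# Hypothetical example: two triangles joined by the bridge edge C-D.
G = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"],
     "D": ["C", "E", "F"], "E": ["D", "F"], "F": ["D", "E"]}
impact = centrality_entropy_impact(G)
# Removing either bridge endpoint (C or D) splits the graph, so those nodes
# show the largest drop in centrality entropy.
```

In this example every node has the same total geodesic-path share in the intact graph (H_ce = log2 6), yet removing C or D disconnects the two triangles, which is exactly the behaviour the measure is meant to detect.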
O(n^4)
References

SHANNON, C. E. 1948. A mathematical theory of communication. Bell System Technical Journal, 27, 379-423.