In a connected graph, the closeness centrality (or closeness) of a node is a measure of centrality in a network, calculated as the reciprocal of the sum of the lengths of the shortest paths between the node and all other nodes in the graph. Thus the more central a node is, the closer it is to all other nodes.
Closeness was defined by Bavelas (1950) as the reciprocal of the farness, that is:

$C(x) = \frac{1}{\sum_{y} d(y,x)}$
where $d(y,x)$ is the distance between vertices $x$ and $y$. When speaking of closeness centrality, people usually refer to its normalized form, which represents the average length of the shortest paths instead of their sum. It is generally given by the previous formula multiplied by $N-1$, where $N$ is the number of nodes in the graph. For large graphs this difference becomes inconsequential, so the $-1$ is dropped, resulting in:

$C(x) = \frac{N}{\sum_{y} d(y,x)}$
This adjustment allows comparisons between nodes of graphs of different sizes.
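As a quick check of the two formulas, here is a minimal sketch in Python; the three-node path graph is an assumed example, not one from the article:

```python
# Assumed example: path graph a - b - c.
# Shortest-path distances from the middle node b: d(a,b) = 1, d(c,b) = 1.
distances_from_b = [1, 1]
n = 3  # number of nodes N in the graph

raw_closeness = 1 / sum(distances_from_b)     # Bavelas: 1 / 2 = 0.5
normalized = (n - 1) / sum(distances_from_b)  # (N - 1) / 2 = 1.0

print(raw_closeness, normalized)
```

The normalized value of 1.0 reflects that b is at distance 1 from every other node, regardless of graph size, which is what makes cross-graph comparison possible.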
The distinction between taking distances from or to all other nodes is irrelevant in undirected graphs, whereas it can produce totally different results in directed graphs (e.g., a website can have high closeness centrality from outgoing links, but low closeness centrality from incoming links).
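To illustrate the directed case, the sketch below (the `hub` page and its links are assumed for illustration) runs a BFS over outgoing edges and then over the reversed edges:

```python
from collections import deque

def bfs_distances(adj, source):
    """Unweighted shortest-path distances from source, following adj's edges."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Hypothetical website graph: "hub" links out to every other page,
# but no page links back to it.
out_links = {"hub": ["a", "b", "c"], "a": ["b"], "b": ["c"], "c": ["a"]}

# Reverse the edges to measure distances along incoming links.
in_links = {u: [] for u in out_links}
for u, targets in out_links.items():
    for v in targets:
        in_links[v].append(u)

reachable_out = bfs_distances(out_links, "hub")  # hub reaches all 4 nodes
reachable_in = bfs_distances(in_links, "hub")    # only hub itself
print(len(reachable_out), len(reachable_in))
```

Outward, `hub` is at distance 1 from everything (maximal closeness); inward, nothing can reach it at all, so its incoming closeness is minimal.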
In disconnected graphs
When a graph is not strongly connected, a widespread idea is to use the sum of reciprocals of distances instead of the reciprocal of the sum of distances, with the convention $\frac{1}{\infty} = 0$:

$H(x) = \sum_{y \neq x} \frac{1}{d(y,x)}$
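A minimal sketch of this sum-of-reciprocals variant, on an assumed disconnected example graph; unreachable pairs simply contribute nothing to the sum, which is exactly the $\frac{1}{\infty} = 0$ convention:

```python
from collections import deque

def harmonic_centrality(adj):
    """H(x) = sum of 1/d(y,x) over reachable y != x; unreachable y adds 0."""
    result = {}
    for source in adj:
        # BFS shortest-path distances from `source`
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        result[source] = sum(1.0 / d for y, d in dist.items() if y != source)
    return result

# Assumed disconnected example: path a - b - c, plus an isolated node d.
graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"], "d": []}
scores = harmonic_centrality(graph)
print(scores)
```

Node b scores 1 + 1 = 2, the endpoints score 1 + 1/2 = 1.5, and the isolated node d scores 0; the sum-of-distances definition would instead be undefined (or zero) for every node here.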
The most natural modification of Bavelas’s definition of closeness is following the general principle proposed by Marchiori and Latora (2000) that in graphs with infinite distances the harmonic mean behaves better than the arithmetic mean. Indeed, Bavelas’s closeness can be described as the denormalized reciprocal of the arithmetic mean of distances, whereas harmonic centrality is the denormalized reciprocal of the harmonic mean of distances.
This idea was explicitly stated for undirected graphs under the name valued centrality by Dekker (2005) and under the name harmonic centrality by Rochat (2009), axiomatized by Garg (2009) and proposed once again later by Opsahl (2010). It was studied on general directed graphs by Boldi and Vigna (2014). This idea is also quite similar to market potential proposed in Harris (1954) which now often goes by the term market access.
Dangalchev (2006), in a work on network vulnerability, proposes a different definition for undirected graphs:

$D(x) = \sum_{y \neq x} 2^{-d(x,y)}$
This definition works effectively for disconnected graphs and allows convenient formulae for graph operations. For example, if a graph $G_1 + G_2$ is created by linking node $p$ of graph $G_1$ to node $q$ of graph $G_2$, then the combined closeness (where the closeness of a graph is the sum of the closenesses of its nodes) is:

$D(G_1 + G_2) = D(G_1) + D(G_2) + (1 + D(p))(1 + D(q))$
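The combination formula can be checked numerically. The sketch below (the example graphs are assumed) computes $D(x)$ for every node via BFS, takes the graph closeness as the sum over all nodes, links two single-edge graphs, and compares both sides of the formula:

```python
from collections import deque

def dangalchev(adj):
    """D(x) = sum of 2^(-d(x,y)) over reachable y != x, for every node x."""
    scores = {}
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        scores[source] = sum(2.0 ** -d for y, d in dist.items() if y != source)
    return scores

# Assumed example: G1 and G2 are both single-edge graphs.
g1 = {"a": ["b"], "b": ["a"]}
g2 = {"c": ["d"], "d": ["c"]}
# Linking node b of G1 to node c of G2 yields the path a - b - c - d.
combined = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}

d1, d2, dc = dangalchev(g1), dangalchev(g2), dangalchev(combined)
lhs = sum(dc.values())  # D(G1 + G2), computed directly
rhs = sum(d1.values()) + sum(d2.values()) + (1 + d1["b"]) * (1 + d2["c"])
print(lhs, rhs)  # the two sides should agree
```

Here $D(G_1) = D(G_2) = 1$ and $D(b) = D(c) = 0.5$, so both sides evaluate to $1 + 1 + 1.5 \cdot 1.5 = 4.25$.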
The information centrality of Stephenson and Zelen (1989) is another closeness measure: it computes the harmonic mean of the resistance distances towards a vertex $x$, which is smaller if $x$ has many paths of small resistance connecting it to other vertices.
In the classic definition of closeness centrality, the spread of information is modeled by the use of shortest paths. This model might not be the most realistic for all types of communication scenarios. Thus, related definitions have been discussed to measure closeness, like the random walk closeness centrality introduced by Noh and Rieger (2004). It measures the speed with which randomly walking messages reach a vertex from elsewhere in the graph — a random-walk version of closeness centrality. The hierarchical closeness of Tran and Kwon (2014) is an extended closeness centrality that deals in yet another way with the limitation of closeness in graphs that are not strongly connected: it explicitly includes information about the range of other nodes that can be affected by the given node.
Closeness centrality can be computed in Python with the networkx library: once networkx is installed, calling nx.closeness_centrality(G) returns a dictionary mapping each node of the graph G to its closeness centrality value.
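Since the original code listing is not reproduced here, the following is a minimal sketch of what `networkx.closeness_centrality` computes for a connected unweighted graph — a BFS from every node, then $(N-1)$ divided by the sum of distances. The example graph is an assumption, not the article's original:

```python
from collections import deque

def closeness_centrality(adj):
    """Normalized closeness: (N - 1) / sum of shortest-path distances,
    computed with a BFS from every node of an unweighted graph."""
    result = {}
    for source in adj:
        # BFS shortest-path distances from `source`
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total = sum(dist.values())
        result[source] = (len(dist) - 1) / total if total > 0 else 0.0
    return result

# Assumed example graph: a 4-cycle 1-2-3-4 with a pendant node 5 on node 1.
graph = {
    1: [2, 4, 5],
    2: [1, 3],
    3: [2, 4],
    4: [1, 3],
    5: [1],
}
result = closeness_centrality(graph)
print(result)
```

On this graph, node 1 scores 4/5 = 0.8 (distance sum 5) and the pendant node 5 scores 4/8 = 0.5. With networkx installed, `nx.closeness_centrality(nx.Graph([(1, 2), (2, 3), (3, 4), (1, 4), (1, 5)]))` should produce the same dictionary.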
The result is a dictionary giving the closeness centrality value of each node. The above is an extension of my article series on the centrality measures. Keep networking!!!
You can read more about the same at
- Percolation Centrality (Centrality Measure)
- Eigenvector Centrality (Centrality Measure)