The topological information content of a graph is the entropy of the partition of V into the orbits of the automorphism group Aut(G):
I(G) = -Σ_i (|V_i|/|V|) ld(|V_i|/|V|).
For example, Aut(G3) = {e, (13), (24), (13)(24)}. Orbits of Aut(G3): {1, 3}, {2, 4}. Hence,
I(G3) = -( (2/4) ld(2/4) + (2/4) ld(2/4) ) = 1
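The orbit entropy can be reproduced with a short sketch (assuming ld is the binary logarithm; `orbit_entropy` is an illustrative helper name, not notation from the slides):

```python
import math

def orbit_entropy(orbit_sizes):
    """Topological information content: entropy of the partition of the
    vertex set into orbits of the automorphism group."""
    n = sum(orbit_sizes)
    return -sum((s / n) * math.log2(s / n) for s in orbit_sizes)

# Orbits {1, 3} and {2, 4} from the example above, each of size 2:
print(orbit_entropy([2, 2]))  # 1.0
```

A vertex-transitive graph has a single orbit and therefore orbit entropy 0, the minimum; the maximum, ld(|V|), is reached when every orbit is a single vertex.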
The j-spheres of a vertex u in G are
S_1(u, G) = {v ∈ V : d(u, v) = 1},
S_2(u, G) = {v ∈ V : d(u, v) = 2},
. . .
S_ρ(u, G) = {v ∈ V : d(u, v) = ρ},
where ρ is the diameter of G. From their cardinalities an information functional is formed,
f(v) = c_1 |S_1(v, G)| + c_2 |S_2(v, G)| + . . . + c_ρ |S_ρ(v, G)|,
with fixed coefficients c_j > 0. They assigned a probability to each vertex as
p(v_i) = f(v_i) / ( f(v_1) + f(v_2) + . . . + f(v_|V|) ),
and the entropy of the graph is H_f(G) = -Σ_i p(v_i) ld p(v_i).
For each vertex v he determined the local information graph L(G, v), which is induced by the paths P_1(v), . . ., P_k(v), where j = 1, . . ., k and P_j(v) = {v, v_1, . . ., v_j}.
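The sphere-based functional f(v) = Σ_j c_j |S_j(v, G)| can be sketched as follows; the sphere sizes are just BFS levels. The path graph and coefficient choice here are illustrative, not the slides' example graph:

```python
from collections import deque

def sphere_sizes(adj, v, rho):
    """|S_j(v, G)| for j = 1..rho: the number of vertices at
    shortest-path distance exactly j from v (one BFS level each)."""
    dist = {v: 0}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return [sum(1 for d in dist.values() if d == j) for j in range(1, rho + 1)]

def f(adj, v, c):
    """Linear information functional f(v) = c_1|S_1| + ... + c_rho|S_rho|."""
    return sum(cj * sj for cj, sj in zip(c, sphere_sizes(adj, v, len(c))))

# Illustrative path graph 1-2-3-4-5 with coefficients c = (4, 3, 2, 1):
path = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(f(path, 1, [4, 3, 2, 1]))  # 4*1 + 3*1 + 2*1 + 1*1 = 10
print(f(path, 3, [4, 3, 2, 1]))  # 4*2 + 3*2 + 2*0 + 1*0 = 14
```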
Example:
ρ(G) = 4 and it is considered that: c_1 = 4, c_2 = 3, c_3 = 2, c_4 = 1. Then
f(v_1) = f(v_4) = f(v_5) = f(v_8) = 2c_1 + 2c_2 + 2c_3 + c_4 = 19
f(v_2) = f(v_7) = 2c_1 + 3c_2 + 2c_3 = 21
f(v_3) = f(v_6) = 3c_1 + 3c_2 + c_3 = 23
H_f(G) = -Σ_{i=1}^{8} p(v_i) ld p(v_i)
= -( 4 · (19/164) · ld(19/164) + 2 · (21/164) · ld(21/164) + 2 · (23/164) · ld(23/164) ) ≈ 2.995,
where 164 = 4·19 + 2·21 + 2·23 is the sum of all f-values.
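The entropy value for this example can be checked numerically with a few lines (assuming ld = log₂ and the f-values 19, 21, 23 with multiplicities 4, 2, 2 computed above):

```python
import math

def f_entropy(f_values):
    """H_f(G) = -sum_i p(v_i) ld p(v_i), with p(v_i) = f(v_i) / sum_j f(v_j)."""
    total = sum(f_values)
    return -sum((x / total) * math.log2(x / total) for x in f_values)

# f-values from the example: four vertices with 19, two with 21, two with 23.
fv = [19] * 4 + [21] * 2 + [23] * 2
print(f_entropy(fv))  # ≈ 2.995
```

The result is close to the maximum ld(8) = 3 because the eight f-values, and hence the vertex probabilities, are nearly equal.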
For the second example graph, six vertices have f-value 36 and the other two have f-value 38. Hence
H_f(G) = -Σ_{i=1}^{8} p(v_i) ld p(v_i)
= -( 6 · (36/292) · ld(36/292) + 2 · (38/292) · ld(38/292) ) ≈ 2.9996,
where 292 = 6·36 + 2·38.
The degree distribution network entropy
Fig.: Two given, connected nodes i and j are shown, displaying different degrees k_i and k_j. Since we are interested in the remaining degrees, different values need to be considered (here indicated as q_i and q_j).
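The remaining degree of a node reached along an edge is its degree minus one, and its distribution follows from the degree distribution by the standard relation q(k) = (k+1)·p(k+1)/⟨k⟩. A sketch with a hypothetical p (not the network from the slides):

```python
def remaining_degree_dist(p):
    """q(k) = (k+1) p(k+1) / <k>: the distribution of the degree of a node
    reached by following a random edge, not counting that edge itself."""
    mean_k = sum(k * pk for k, pk in enumerate(p))
    return [(k + 1) * p[k + 1] / mean_k for k in range(len(p) - 1)]

# Hypothetical degree distribution p(k) for k = 0..4, mean degree <k> = 2:
p = [0.0, 0.4, 0.3, 0.2, 0.1]
print([round(q, 3) for q in remaining_degree_dist(p)])  # [0.2, 0.3, 0.3, 0.2]
```

Note how q shifts weight toward higher degrees relative to p: an edge is more likely to lead to a well-connected node.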
Entropy as a measure of centrality in networks
Frank Tutzauer (2007) has proposed a measure of centrality for
networks characterized by path-based transfer flows.
A flow originates at some vertex and either stops there or is passed to one of that vertex's neighbors. The next node then either stops the flow or randomly passes it on to one of its own neighbors, and so on. The object thus traverses a path in the network, traveling along links and stopping when a loop is chosen. Each choice is assumed to be made with equal likelihood.
What is the probability that a flow beginning at
vertex 5 ends at vertex 2, in the network below?
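In this model a node of degree d offers d + 1 equally likely options (stop, or pass to each neighbor), so the stopping probabilities solve a small linear system. Since the slide's network figure is not reproduced here, the sketch below uses a hypothetical 5-node graph and solves the system by fixed-point iteration; the names are illustrative:

```python
def stop_probabilities(adj, start):
    """For Tutzauer's path-transfer model: at a node of degree d the flow
    stops or moves to each neighbor, each with probability 1/(d+1).
    Returns, for every node t, the probability that a flow starting at
    `start` eventually stops at t (computed by fixed-point iteration)."""
    nodes = list(adj)
    # prob[u][t]: probability that a flow currently at u stops at t
    prob = {u: {t: 0.0 for t in nodes} for u in nodes}
    for _ in range(500):
        for u in nodes:
            d = len(adj[u])
            for t in nodes:
                acc = 1.0 if u == t else 0.0          # stop here
                acc += sum(prob[w][t] for w in adj[u])  # or move on
                prob[u][t] = acc / (d + 1)
    return prob[start]

# Hypothetical 5-node network (the figure from the slides is not available):
g = {1: [2], 2: [1, 3, 4], 3: [2, 4], 4: [2, 3, 5], 5: [4]}
p = stop_probabilities(g, 5)
print(p[2])              # probability that a flow from vertex 5 stops at vertex 2
print(sum(p.values()))   # ≈ 1.0: the flow stops somewhere with certainty
```

Tutzauer's entropy centrality of a node is then the Shannon entropy of its stopping distribution, -Σ_t p_t ld p_t.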
Entropy-Closeness
Scatter plots and linear regression
Entropy-Betweenness
Scatter plots and linear regression
Entropy-Degree
Another visualization of the network:
For the graph with loops the number of edges is |E| = 36. We obtain H(G) = 3.967. To normalize H(G), we divide it by the maximum entropy, given by ld(|V|) = ld(16) = 4. So we obtain 0.991.
Entropy of the degree distribution: H(p) = 1.9544

Remaining degree k    q(k)
0                     0.075
1                     0.25
2                     0.375
3                     0.3

Entropy of the remaining degree distribution: H(q) = 1.832
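The table's entropy value can be reproduced with a one-liner (assuming ld = log₂, as elsewhere in the slides):

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a discrete distribution, skipping zeros."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Remaining-degree distribution q(k) from the table above:
q = [0.075, 0.25, 0.375, 0.3]
print(round(entropy(q), 3))  # 1.832
```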
Summary and conclusion