Soft configuration model

In applied mathematics, the soft configuration model (SCM) is a random graph model subject to the principle of maximum entropy under constraints on the expectation of the degree sequence of sampled graphs.[1] Whereas the configuration model (CM) uniformly samples random graphs of a specified degree sequence, the SCM retains the specified degree sequence only on average over all network realizations; in this sense the SCM has very relaxed constraints relative to those of the CM ("soft" rather than "sharp" constraints[2]). The SCM for graphs of size $n$ has a nonzero probability of sampling any graph of size $n$, whereas the CM is restricted to graphs having precisely the prescribed connectivity structure.

Model formulation

The SCM is a statistical ensemble of random graphs $G$ having $n$ vertices ($n=|V(G)|$) labeled $\{v_j\}_{j=1}^{n}=V(G)$, producing a probability distribution on $\mathcal{G}_n$ (the set of graphs of size $n$). Imposed on the ensemble are $n$ constraints, namely that the ensemble average of the degree $k_j$ of vertex $v_j$ is equal to a designated value $\widehat{k}_j$, for all $v_j\in V(G)$. The model is fully parameterized by its size $n$ and expected degree sequence $\{\widehat{k}_j\}_{j=1}^{n}$. These constraints are both local (one constraint associated with each vertex) and soft (constraints on the ensemble average of certain observable quantities), and thus yield a canonical ensemble with an extensive number of constraints.[2] The conditions $\langle k_j\rangle = \widehat{k}_j$ are imposed on the ensemble by the method of Lagrange multipliers (see Maximum-entropy random graph model).
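To make the soft constraints concrete, the following brute-force Python sketch (an illustration added here, not part of the cited formulation) enumerates every graph in $\mathcal{G}_n$ for $n=3$ and computes ensemble-average degrees under a placeholder probability assignment; in the SCM it is these averages, not the degrees of any single realization, that must equal the prescribed values $\widehat{k}_j$.

    # Brute-force illustration (hypothetical, tiny n): enumerate all graphs in G_n
    # and compute ensemble-average degrees under a given probability assignment.
    from itertools import combinations, product

    n = 3
    pairs = list(combinations(range(n), 2))            # possible edges {v_i, v_j}

    def degrees(edge_bits):
        """Degree sequence of the graph encoded by one 0/1 indicator per vertex pair."""
        k = [0] * n
        for (i, j), present in zip(pairs, edge_bits):
            if present:
                k[i] += 1
                k[j] += 1
        return k

    graphs = list(product([0, 1], repeat=len(pairs)))  # all 2^(n(n-1)/2) graphs in G_n

    # Any probability assignment on G_n defines ensemble-average degrees <k_j>;
    # the uniform distribution below is purely a placeholder example.
    P = {g: 1 / len(graphs) for g in graphs}

    ensemble_avg_degrees = [sum(P[g] * degrees(g)[j] for g in graphs) for j in range(n)]
    print(ensemble_avg_degrees)                        # [1.0, 1.0, 1.0] for the uniform placeholder

Under the SCM, the placeholder distribution is replaced by the maximum-entropy distribution derived in the next section, whose parameters are tuned so that these ensemble averages hit the targets $\widehat{k}_j$ exactly.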

Derivation of the probability distribution

The probability $\mathbb{P}_{\text{SCM}}(G)$ of the SCM producing a graph $G$ is determined by maximizing the Gibbs entropy $S[G]$ subject to the constraints $\langle k_j\rangle = \widehat{k}_j,\ j=1,\ldots,n$ and the normalization $\sum_{G\in\mathcal{G}_n}\mathbb{P}_{\text{SCM}}(G)=1$. This amounts to optimizing the multi-constraint Lagrange function below:

\[
\mathcal{L}\left(\alpha,\{\psi_j\}_{j=1}^{n}\right) = -\sum_{G\in\mathcal{G}_n}\mathbb{P}_{\text{SCM}}(G)\log\mathbb{P}_{\text{SCM}}(G) + \alpha\left(1-\sum_{G\in\mathcal{G}_n}\mathbb{P}_{\text{SCM}}(G)\right) + \sum_{j=1}^{n}\psi_j\left(\widehat{k}_j-\sum_{G\in\mathcal{G}_n}\mathbb{P}_{\text{SCM}}(G)\,k_j(G)\right),
\]

where $\alpha$ and $\{\psi_j\}_{j=1}^{n}$ are the $n+1$ multipliers to be fixed by the $n+1$ constraints (normalization and the expected degree sequence). Setting to zero the derivative of the above with respect to $\mathbb{P}_{\text{SCM}}(G)$ for an arbitrary $G\in\mathcal{G}_n$ yields

\[
0 = \frac{\partial\mathcal{L}\left(\alpha,\{\psi_j\}_{j=1}^{n}\right)}{\partial\mathbb{P}_{\text{SCM}}(G)} = -\log\mathbb{P}_{\text{SCM}}(G) - 1 - \alpha - \sum_{j=1}^{n}\psi_j k_j(G)
\ \Rightarrow\ 
\mathbb{P}_{\text{SCM}}(G) = \frac{1}{Z}\exp\left[-\sum_{j=1}^{n}\psi_j k_j(G)\right],
\]

the constant

\[
Z := e^{\alpha+1} = \sum_{G\in\mathcal{G}_n}\exp\left[-\sum_{j=1}^{n}\psi_j k_j(G)\right] = \prod_{1\le i<j\le n}\left(1+e^{-(\psi_i+\psi_j)}\right)
\]

being the partition function normalizing the distribution;[3] the above exponential expression applies to all $G\in\mathcal{G}_n$, and thus is the probability distribution. Hence we have an exponential family parameterized by $\{\psi_j\}_{j=1}^{n}$, which are related to the expected degree sequence $\{\widehat{k}_j\}_{j=1}^{n}$ by the following equivalent expressions:

\[
\langle k_q\rangle = \sum_{G\in\mathcal{G}_n}k_q(G)\,\mathbb{P}_{\text{SCM}}(G) = -\frac{\partial\log Z}{\partial\psi_q} = \sum_{j\neq q}\frac{1}{e^{\psi_q+\psi_j}+1} = \widehat{k}_q, \qquad q=1,\ldots,n.
\]
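The summand $1/(e^{\psi_q+\psi_j}+1)$ in the last expression is precisely the probability $p_{qj}$ that the edge $\{v_q,v_j\}$ is present. Indeed, regrouping the exponent over vertex pairs using $k_j(G)=\sum_{i\neq j}a_{ij}(G)$ (a standard manipulation, cf. [3], spelled out here as an intermediate step) shows that the distribution factorizes into independent Bernoulli edge variables:

\[
\sum_{j=1}^{n}\psi_j k_j(G) = \sum_{1\le i<j\le n}(\psi_i+\psi_j)\,a_{ij}(G)
\quad\Longrightarrow\quad
\mathbb{P}_{\text{SCM}}(G) = \prod_{1\le i<j\le n} p_{ij}^{\,a_{ij}(G)}\left(1-p_{ij}\right)^{1-a_{ij}(G)},
\qquad p_{ij} = \frac{1}{e^{\psi_i+\psi_j}+1}.
\]

In practice the multipliers $\{\psi_j\}$ are obtained numerically from the target degrees. The Python sketch below is an illustration under assumptions made here (the example targets, the damped fixed-point scheme, and the Monte Carlo check are choices of this sketch, not prescribed by the cited sources): it solves the system for a small $n$ and confirms that sampled graphs reproduce the expected degree sequence only on average.

    # Illustrative numerical sketch: solve  k_hat_q = sum_{j != q} 1/(exp(psi_q + psi_j) + 1)
    # for the multipliers psi, then sample graphs edge-by-edge and check that
    # degrees match the targets only on average ("soft" constraints).
    import numpy as np

    rng = np.random.default_rng(0)
    k_hat = np.array([3.0, 2.5, 2.0, 2.0, 1.5, 1.0])  # hypothetical expected degree targets
    n = len(k_hat)

    # Work with x_q = exp(-psi_q), so that p_ij = x_i x_j / (1 + x_i x_j).
    x = k_hat / np.sqrt(k_hat.sum())                  # rough initial guess
    for _ in range(2000):
        ratio = x[None, :] / (1.0 + x[:, None] * x[None, :])   # x_j / (1 + x_q x_j)
        np.fill_diagonal(ratio, 0.0)
        x = 0.5 * x + 0.5 * k_hat / ratio.sum(axis=1)          # damped fixed-point update

    psi = -np.log(x)
    p = 1.0 / (np.exp(psi[:, None] + psi[None, :]) + 1.0)      # connection probabilities p_ij
    np.fill_diagonal(p, 0.0)
    print("expected degrees:", p.sum(axis=1))         # should be close to k_hat

    # Monte Carlo check: independent Bernoulli edges with probabilities p_ij.
    samples, upper = 20000, np.triu(np.ones((n, n), dtype=bool), k=1)
    avg_deg = np.zeros(n)
    for _ in range(samples):
        a = np.zeros((n, n))
        a[upper] = rng.random(upper.sum()) < p[upper]
        a = a + a.T
        avg_deg += a.sum(axis=1)
    print("sampled average degrees:", avg_deg / samples)       # approx. k_hat

Individual samples generally do not have degrees equal to $\widehat{k}_j$; only the average over many realizations does, which is exactly the sense in which the SCM's constraints are soft.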

References

  1. van der Hoorn, Pim; Lippner, Gabor; Krioukov, Dmitri (2017-10-10). "Sparse Maximum-Entropy Random Graphs with a Given Power-Law Degree Distribution". arXiv:1705.10261.
  2. Garlaschelli, Diego; den Hollander, Frank; Roccaverde, Andrea (2018-01-30). "Covariance structure behind breaking of ensemble equivalence in random graphs" (PDF). Archived from the original on 2023-02-04. Retrieved 2018-09-14.
  3. Park, Juyong; Newman, M. E. J. (2004-05-25). "The statistical mechanics of networks". arXiv:cond-mat/0405566.