<?xml version="1.0" encoding="UTF-8" standalone="yes"?><oembed><version><![CDATA[1.0]]></version><provider_name><![CDATA[Azimuth]]></provider_name><provider_url><![CDATA[https://johncarlosbaez.wordpress.com]]></provider_url><author_name><![CDATA[John Baez]]></author_name><author_url><![CDATA[https://johncarlosbaez.wordpress.com/author/johncarlosbaez/]]></author_url><title><![CDATA[Network Theory (Part&nbsp;23)]]></title><type><![CDATA[link]]></type><html><![CDATA[<p>We&#8217;ve been looking at reaction networks, and we&#8217;re getting ready to find equilibrium solutions of the equations they give.  To do this, we&#8217;ll need to connect them to another kind of network we&#8217;ve studied.   A reaction network is something like this:</p>
<div align="center"><img width="200" src="https://i1.wp.com/math.ucr.edu/home/baez/networks/chemical_reaction_network_part_20_III.png" alt="" /></div>
<p>It&#8217;s a bunch of <b>complexes</b>, which are sums of basic building-blocks called <b>species</b>, together with arrows called <b>transitions</b> going between the complexes.  If we know a number  for each transition describing the rate at which it occurs, we get an equation called the &#8216;rate equation&#8217;.  This describes how the amount of each species changes with time.   We&#8217;ve been talking about this equation ever since the start of this series!   <a href="http://math.ucr.edu/home/baez/networks/networks_22.html">Last time</a>, we wrote it down in a new very compact form:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Cfrac%7Bd+x%7D%7Bd+t%7D+%3D+Y+H+x%5EY++%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;frac{d x}{d t} = Y H x^Y  } ' title='&#92;displaystyle{ &#92;frac{d x}{d t} = Y H x^Y  } ' class='latex' /></p>
<p>Here <img src='https://s0.wp.com/latex.php?latex=x&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='x' title='x' class='latex' /> is a vector whose components are the amounts of each species, while <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=Y&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='Y' title='Y' class='latex' /> are certain matrices.</p>
<p>But now suppose we forget how each complex is made of species!  Suppose we just think of them as abstract things in their own right, like numbered boxes:</p>
<div align="center"><img width="200" src="https://i0.wp.com/math.ucr.edu/home/baez/networks/markov_process_vs_reaction_network_1.png" alt="" /></div>
<p>We can use these boxes to describe <b>states</b> of some system.  The arrows still describe <b>transitions</b>, but now we think of these as ways for the system to hop from one state to another.   Say we know a number for each transition describing the probability per time at which it occurs:</p>
<div align="center"><img width="200" src="https://i2.wp.com/math.ucr.edu/home/baez/networks/markov_process_vs_reaction_network_2.png" alt="" /></div>
<p>Then we get a &#8216;Markov process&#8217;&#8212;or in other words, a random walk where our system hops from one state to another.  If <img src='https://s0.wp.com/latex.php?latex=%5Cpsi&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi' title='&#92;psi' class='latex' /> is the probability distribution saying how likely the system is to be in each state, this Markov process is described by this equation:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Cfrac%7Bd+%5Cpsi%7D%7Bd+t%7D+%3D+H+%5Cpsi++%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;frac{d &#92;psi}{d t} = H &#92;psi  } ' title='&#92;displaystyle{ &#92;frac{d &#92;psi}{d t} = H &#92;psi  } ' class='latex' /></p>
<p>This is simpler than the rate equation, because it&#8217;s linear.  But the matrix <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> is the same&mdash;we&#8217;ll see that explicitly later on today.</p>
<p>What&#8217;s the point?  Well, our ultimate goal is to prove the deficiency zero theorem, which gives equilibrium solutions of the rate equation.  That means finding <img src='https://s0.wp.com/latex.php?latex=x&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='x' title='x' class='latex' /> with </p>
<p><img src='https://s0.wp.com/latex.php?latex=Y+H+x%5EY+%3D+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='Y H x^Y = 0 ' title='Y H x^Y = 0 ' class='latex' /></p>
<p>Today we&#8217;ll find all equilibria for the Markov process, meaning all <img src='https://s0.wp.com/latex.php?latex=%5Cpsi&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi' title='&#92;psi' class='latex' /> with</p>
<p><img src='https://s0.wp.com/latex.php?latex=H+%5Cpsi+%3D+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;psi = 0 ' title='H &#92;psi = 0 ' class='latex' /></p>
<p>Then next time we&#8217;ll show some of these have the form</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cpsi+%3D+x%5EY+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi = x^Y ' title='&#92;psi = x^Y ' class='latex' /></p>
<p>So, we&#8217;ll get </p>
<p><img src='https://s0.wp.com/latex.php?latex=H+x%5EY+%3D+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H x^Y = 0 ' title='H x^Y = 0 ' class='latex' /></p>
<p>and thus</p>
<p><img src='https://s0.wp.com/latex.php?latex=Y+H+x%5EY+%3D+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='Y H x^Y = 0 ' title='Y H x^Y = 0 ' class='latex' /></p>
<p>as desired!</p>
<p>So, let&#8217;s get to work.</p>
<h3> The Markov process of a graph with rates </h3>
<p>We&#8217;ve been looking at stochastic reaction networks, which are things like this:</p>
<div align="center">
<img src="https://i1.wp.com/math.ucr.edu/home/baez/networks/reaction_network_diagram_1.png" alt="" />
</div>
<p>However, we can build a Markov process starting from just part of this information:</p>
<div align="center">
<img src="https://i0.wp.com/math.ucr.edu/home/baez/networks/markov_process_diagram_1.png" alt="" />
</div>
<p>Let&#8217;s call this thing a &#8216;graph with rates&#8217;, for lack of a better name.  We&#8217;ve been calling the things in <img src='https://s0.wp.com/latex.php?latex=K&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='K' title='K' class='latex' /> &#8216;complexes&#8217;, but now we&#8217;ll think of them as &#8216;states&#8217;.  So:</p>
<p><b>Definition.</b>  A <b>graph with rates</b> consists of:</p>
<p>&bull; a finite set of <b>states</b> <img src='https://s0.wp.com/latex.php?latex=K%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='K,' title='K,' class='latex' /></p>
<p>&bull; a finite set of <b>transitions</b> <img src='https://s0.wp.com/latex.php?latex=T%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='T,' title='T,' class='latex' /></p>
<p>&bull; a map <img src='https://s0.wp.com/latex.php?latex=r%3A+T+%5Cto+%280%2C%5Cinfty%29&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='r: T &#92;to (0,&#92;infty)' title='r: T &#92;to (0,&#92;infty)' class='latex' /> giving a <b>rate constant</b> for each transition,</p>
<p>&bull; <b>source</b> and <b>target</b> maps <img src='https://s0.wp.com/latex.php?latex=s%2Ct+%3A+T+%5Cto+K&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='s,t : T &#92;to K' title='s,t : T &#92;to K' class='latex' /> saying where each transition starts and ends.</p>
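<p>In code, this definition is just a little bundle of data. Here's a sketch in Python; the particular states, transitions and rate constants are made up for illustration, not taken from the pictures above:</p>

```python
# A "graph with rates": a finite set K of states, a finite set T of
# transitions, a rate constant r(tau) > 0 for each transition, and
# source/target maps s, t : T -> K.
states = [0, 1, 2]                       # the set K
transitions = {                          # the set T, with s, t, r for each
    "a": {"source": 0, "target": 1, "rate": 2.0},
    "b": {"source": 1, "target": 2, "rate": 1.0},
    "c": {"source": 2, "target": 0, "rate": 3.0},
}

# sanity checks: rates land in (0, infinity), sources and targets land in K
assert all(tr["rate"] > 0 for tr in transitions.values())
assert all(tr["source"] in states and tr["target"] in states
           for tr in transitions.values())
```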
<p>Starting from this, we can get a <a href="http://math.ucr.edu/home/baez/networks/networks_11.html">Markov process</a> describing how a probability distribution <img src='https://s0.wp.com/latex.php?latex=%5Cpsi&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi' title='&#92;psi' class='latex' /> on our set of states will change with time.  As usual, this Markov process is described by a master equation:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Cfrac%7Bd+%5Cpsi%7D%7Bd+t%7D+%3D+H+%5Cpsi+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;frac{d &#92;psi}{d t} = H &#92;psi } ' title='&#92;displaystyle{ &#92;frac{d &#92;psi}{d t} = H &#92;psi } ' class='latex' /></p>
<p>for some Hamiltonian:</p>
<p><img src='https://s0.wp.com/latex.php?latex=H+%3A+%5Cmathbb%7BR%7D%5EK+%5Cto+%5Cmathbb%7BR%7D%5EK+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H : &#92;mathbb{R}^K &#92;to &#92;mathbb{R}^K ' title='H : &#92;mathbb{R}^K &#92;to &#92;mathbb{R}^K ' class='latex' /></p>
<p>What is this Hamiltonian, exactly?  Let&#8217;s think of it as a matrix where <img src='https://s0.wp.com/latex.php?latex=H_%7Bi+j%7D&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H_{i j}' title='H_{i j}' class='latex' /> is the probability per time for our system to hop from the state <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> to the state <img src='https://s0.wp.com/latex.php?latex=i.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i.' title='i.' class='latex' />   This looks backwards, but don&#8217;t blame me&mdash;blame the guys who invented the usual conventions for matrix algebra.  Clearly if <img src='https://s0.wp.com/latex.php?latex=i+%5Cne+j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i &#92;ne j' title='i &#92;ne j' class='latex' /> this probability per time should be the sum of the rate constants of all transitions going from <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> to <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' />:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+i+%5Cne+j+%5Cquad+%5CRightarrow+%5Cquad+H_%7Bi+j%7D+%3D++%5Csum_%7B%5Ctau%3A+j+%5Cto+i%7D+r%28%5Ctau%29+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ i &#92;ne j &#92;quad &#92;Rightarrow &#92;quad H_{i j} =  &#92;sum_{&#92;tau: j &#92;to i} r(&#92;tau) } ' title='&#92;displaystyle{ i &#92;ne j &#92;quad &#92;Rightarrow &#92;quad H_{i j} =  &#92;sum_{&#92;tau: j &#92;to i} r(&#92;tau) } ' class='latex' /></p>
<p>where we write <img src='https://s0.wp.com/latex.php?latex=%5Ctau%3A+j+%5Cto+i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau: j &#92;to i' title='&#92;tau: j &#92;to i' class='latex' /> when <img src='https://s0.wp.com/latex.php?latex=%5Ctau&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau' title='&#92;tau' class='latex' /> is a transition with source <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> and target <img src='https://s0.wp.com/latex.php?latex=i.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i.' title='i.' class='latex' /></p>
<p>Now, we saw in <a href="http://math.ucr.edu/home/baez/networks/networks_11.html">Part 11</a> that for a probability distribution to remain a probability distribution as it evolves in time according to the master equation, we need <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> to be <b>infinitesimal stochastic</b>: its off-diagonal entries must be nonnegative, and the sum of the entries in each column must be zero.  </p>
<p>The first condition holds already, and the second one tells us what the diagonal entries must be.  So, we&#8217;re basically done describing <img src='https://s0.wp.com/latex.php?latex=H.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H.' title='H.' class='latex' />  But we can summarize it this way:</p>
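<p>Here's a little sketch of that recipe in code: off-diagonal entries come from summing rate constants, and the diagonal entries are whatever makes each column sum to zero. The example graph (a 3-cycle with unit rates) is my own choice, just to have something to check:</p>

```python
def hamiltonian(states, transitions):
    """Build the infinitesimal stochastic matrix H of a graph with rates.

    H[i][j] (row i, column j) is the probability per unit time of hopping
    from state j to state i: the sum of the rate constants of transitions
    j -> i.  Each diagonal entry is then fixed so column j sums to zero.
    """
    n = len(states)
    idx = {k: a for a, k in enumerate(states)}
    H = [[0.0] * n for _ in range(n)]
    for tr in transitions:
        j, i, r = idx[tr["source"]], idx[tr["target"]], tr["rate"]
        H[i][j] += r        # off-diagonal: a transition j -> i
        H[j][j] -= r        # diagonal: keep column j summing to zero
    return H

# hypothetical example: the 3-cycle 0 -> 1 -> 2 -> 0 with unit rates
cycle = [{"source": s, "target": (s + 1) % 3, "rate": 1.0} for s in range(3)]
H = hamiltonian([0, 1, 2], cycle)

# infinitesimal stochastic: off-diagonals nonnegative, columns sum to zero
assert all(H[i][j] >= 0 for i in range(3) for j in range(3) if i != j)
assert all(abs(sum(H[i][j] for i in range(3))) < 1e-12 for j in range(3))
```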
<p><b>Puzzle 1.</b>  Think of <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^K' title='&#92;mathbb{R}^K' class='latex' /> as the vector space consisting of finite linear combinations of elements <img src='https://s0.wp.com/latex.php?latex=%5Ckappa+%5Cin+K.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;kappa &#92;in K.' title='&#92;kappa &#92;in K.' class='latex' />  Then show</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B++H+%5Ckappa+%3D+%5Csum_%7Bs%28%5Ctau%29+%3D+%5Ckappa%7D+r%28%5Ctau%29+%28t%28%5Ctau%29+-+s%28%5Ctau%29%29+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{  H &#92;kappa = &#92;sum_{s(&#92;tau) = &#92;kappa} r(&#92;tau) (t(&#92;tau) - s(&#92;tau)) } ' title='&#92;displaystyle{  H &#92;kappa = &#92;sum_{s(&#92;tau) = &#92;kappa} r(&#92;tau) (t(&#92;tau) - s(&#92;tau)) } ' class='latex' /> </p>
<h3>  Equilibrium solutions of the master equation </h3>
<p>Now we&#8217;ll classify <b>equilibrium solutions</b> of the master equation, meaning <img src='https://s0.wp.com/latex.php?latex=%5Cpsi+%5Cin+%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi &#92;in &#92;mathbb{R}^K' title='&#92;psi &#92;in &#92;mathbb{R}^K' class='latex' /> with </p>
<p><img src='https://s0.wp.com/latex.php?latex=H+%5Cpsi+%3D+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;psi = 0 ' title='H &#92;psi = 0 ' class='latex' /></p>
<p>We&#8217;ll only do this when our graph with rates is &#8216;weakly reversible&#8217;.  This concept doesn&#8217;t actually depend on the rates, so let&#8217;s be general and say:</p>
<p><b>Definition.</b> A graph is <b>weakly reversible</b> if for every edge <img src='https://s0.wp.com/latex.php?latex=%5Ctau+%3A+i+%5Cto+j%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau : i &#92;to j,' title='&#92;tau : i &#92;to j,' class='latex' /> there is a <b>directed path</b> going back from <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> to <img src='https://s0.wp.com/latex.php?latex=i%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i,' title='i,' class='latex' /> meaning that we have edges</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Ctau_1+%3A+j+%5Cto+j_1+%2C+%5Cquad+%5Ctau_2+%3A+j_1+%5Cto+j_2+%2C+%5Cquad+%5Cdots%2C+%5Cquad+%5Ctau_n%3A+j_%7Bn-1%7D+%5Cto+i+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau_1 : j &#92;to j_1 , &#92;quad &#92;tau_2 : j_1 &#92;to j_2 , &#92;quad &#92;dots, &#92;quad &#92;tau_n: j_{n-1} &#92;to i ' title='&#92;tau_1 : j &#92;to j_1 , &#92;quad &#92;tau_2 : j_1 &#92;to j_2 , &#92;quad &#92;dots, &#92;quad &#92;tau_n: j_{n-1} &#92;to i ' class='latex' /></p>
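<p>Since this condition ignores the rates, it's easy to check mechanically: for each edge, search for a directed path back. Here's a sketch (the breadth-first search and the little test graphs are mine, not from the post):</p>

```python
def is_weakly_reversible(edges):
    """Weak reversibility: for every edge i -> j there is a directed path
    back from j to i.  `edges` is a list of (source, target) pairs; the
    rates are irrelevant to this condition."""
    succ = {}
    for i, j in edges:
        succ.setdefault(i, set()).add(j)

    def reaches(a, b):
        # search along directed edges starting from a, looking for b
        seen, frontier = {a}, [a]
        while frontier:
            v = frontier.pop()
            if v == b:
                return True
            for w in succ.get(v, ()):
                if w not in seen:
                    seen.add(w)
                    frontier.append(w)
        return False

    return all(reaches(j, i) for i, j in edges)

assert is_weakly_reversible([(0, 1), (1, 2), (2, 0)])   # a directed 3-cycle
assert not is_weakly_reversible([(0, 1), (1, 2)])       # a one-way chain
```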
<p>This graph with rates is <i>not</i> weakly reversible:</p>
<div align="center"><img width="200" src="https://i2.wp.com/math.ucr.edu/home/baez/networks/markov_process_vs_reaction_network_2.png" alt="" /></div>
<p>but this one is:</p>
<div align="center"><img width="200" src="https://i1.wp.com/math.ucr.edu/home/baez/networks/markov_process_vs_reaction_network_3.png" alt="" /></div>
<p>The good thing about the weakly reversible case is that we get one equilibrium solution of the master equation for each component of our graph, and all equilibrium solutions are linear combinations of these.   This is <i>not</i> true in general!  For example, this guy is not weakly reversible:</p>
<div align="center"><img width="240" src="https://i1.wp.com/math.ucr.edu/home/baez/networks/markov_process_not_weakly_reversible.png" alt="" /></div>
<p>It has only one component, but the master equation has two linearly independent equilibrium solutions: one that vanishes except at the state 0, and one that vanishes except at the state 2.  </p>
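<p>We can check this numerically. I don't know the exact rates in the picture, so here's a minimal graph with the same qualitative behavior: one state in the middle that can hop to either of two absorbing states, with unit rates of my own choosing:</p>

```python
# States 0, 1, 2 with transitions 1 -> 0 and 1 -> 2, both at unit rate:
# states 0 and 2 absorb probability, state 1 leaks it.  Each column of H
# sums to zero; column 1 records the two ways of leaving state 1.
H = [
    [0.0,  1.0, 0.0],   # rate of hopping into state 0 (from state 1)
    [0.0, -2.0, 0.0],   # state 1 loses probability at total rate 2
    [0.0,  1.0, 0.0],   # rate of hopping into state 2 (from state 1)
]

def apply(H, psi):
    n = len(psi)
    return [sum(H[i][j] * psi[j] for j in range(n)) for i in range(n)]

# delta distributions at the absorbing states 0 and 2 are both equilibria...
assert apply(H, [1.0, 0.0, 0.0]) == [0.0, 0.0, 0.0]
assert apply(H, [0.0, 0.0, 1.0]) == [0.0, 0.0, 0.0]
# ...but the delta at the middle state 1 is not
assert apply(H, [0.0, 1.0, 0.0]) != [0.0, 0.0, 0.0]
```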
<p>The idea of a &#8216;component&#8217; is supposed to be fairly intuitive&#8212;our graph falls apart into pieces called components&#8212;but we should make it precise.  As explained in <a href="http://math.ucr.edu/home/baez/networks/networks_21.html">Part 21</a>, the graphs we&#8217;re using here are directed multigraphs, meaning things like</p>
<p><img src='https://s0.wp.com/latex.php?latex=s%2C+t+%3A+E+%5Cto+V++&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='s, t : E &#92;to V  ' title='s, t : E &#92;to V  ' class='latex' /></p>
<p>where <img src='https://s0.wp.com/latex.php?latex=E&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='E' title='E' class='latex' /> is the set of <b>edges</b> (our transitions) and <img src='https://s0.wp.com/latex.php?latex=V&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='V' title='V' class='latex' /> is the set of <b>vertices</b> (our states).    There are actually two famous concepts of &#8216;component&#8217; for graphs of this sort: &#8216;strongly connected&#8217; components and &#8216;connected&#8217; components.   We only need connected components, but let me explain both concepts, in a futile attempt to slake your insatiable thirst for knowledge.</p>
<p>Two vertices <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> of a graph lie in the same <b>strongly connected component</b> iff you can find a directed path of edges from <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' /> to <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> and also one from <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> back to <img src='https://s0.wp.com/latex.php?latex=i.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i.' title='i.' class='latex' />  </p>
<p>Remember, a directed path from <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' /> to <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> looks like this:</p>
<p><img src='https://s0.wp.com/latex.php?latex=i+%5Cto+a+%5Cto+b+%5Cto+c+%5Cto+j+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i &#92;to a &#92;to b &#92;to c &#92;to j ' title='i &#92;to a &#92;to b &#92;to c &#92;to j ' class='latex' /></p>
<p>Here&#8217;s a path from <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' /> to <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> that is not directed:</p>
<p><img src='https://s0.wp.com/latex.php?latex=i+%5Cto+a+%5Cleftarrow+b+%5Cto+c+%5Cto+j+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i &#92;to a &#92;leftarrow b &#92;to c &#92;to j ' title='i &#92;to a &#92;leftarrow b &#92;to c &#92;to j ' class='latex' /></p>
<p>and I hope you can write down the obvious but tedious definition of an &#8216;undirected path&#8217;, meaning a path made of edges that don&#8217;t necessarily point in the correct direction.   Given that, we say two vertices <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> lie in the same <b>connected component</b> iff you can find an <i>undirected</i> path going from <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' /> to <img src='https://s0.wp.com/latex.php?latex=j.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j.' title='j.' class='latex' />  In this case, there will automatically also be an undirected path going from <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> to <img src='https://s0.wp.com/latex.php?latex=i.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i.' title='i.' class='latex' /></p>
<p>For example, <img src='https://s0.wp.com/latex.php?latex=i&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i' title='i' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=j&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='j' title='j' class='latex' /> lie in the same connected component here, but not the same strongly connected component:</p>
<p><img src='https://s0.wp.com/latex.php?latex=i+%5Cto+a+%5Cleftarrow+b+%5Cto+c+%5Cto+j+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='i &#92;to a &#92;leftarrow b &#92;to c &#92;to j ' title='i &#92;to a &#92;leftarrow b &#92;to c &#92;to j ' class='latex' /></p>
<p>Here&#8217;s a graph with one connected component and 3 strongly connected components, which are marked in blue:</p>
<div align="center"><a href="http://en.wikipedia.org/wiki/Strongly_connected_component"><img src="https://i1.wp.com/math.ucr.edu/home/baez/networks/strongly_connected_component.png" /></a></div>
<p>For the theory we&#8217;re looking at now, <i>we only care about connected components, not strongly connected components!</i>   However:</p>
<p><b>Puzzle 2.</b>  Show that for a weakly reversible graph, the connected components are the same as the strongly connected components.  </p>
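<p>If you'd rather see the two concepts of component side by side in code, here's a sketch: connected components forget edge directions (a union&ndash;find does the job), while strongly connected components demand mutual directed reachability. The quadratic reachability test is fine at this toy scale:</p>

```python
def connected_components(vertices, edges):
    """Connected components: treat every edge as undirected (union-find)."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v
    for i, j in edges:
        parent[find(i)] = find(j)
    comps = {}
    for v in vertices:
        comps.setdefault(find(v), set()).add(v)
    return sorted(map(frozenset, comps.values()), key=min)

def strongly_connected_components(vertices, edges):
    """Strong components: v ~ w iff each reaches the other along directed
    paths.  Quadratic, but clear."""
    succ = {v: set() for v in vertices}
    for i, j in edges:
        succ[i].add(j)
    def reach(a):
        seen, stack = {a}, [a]
        while stack:
            for w in succ[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return seen
    r = {v: reach(v) for v in vertices}
    comps = {v: frozenset(w for w in vertices if w in r[v] and v in r[w])
             for v in vertices}
    return sorted(set(comps.values()), key=min)

# the path i -> a <- b -> c -> j from above: one connected component, but
# (having no directed cycles) five singleton strongly connected components
V = ["a", "b", "c", "i", "j"]
E = [("i", "a"), ("b", "a"), ("b", "c"), ("c", "j")]
assert len(connected_components(V, E)) == 1
assert len(strongly_connected_components(V, E)) == 5
```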
<p>With these definitions out of the way, we can state today&#8217;s big theorem:</p>
<p><b>Theorem.</b>  Suppose <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> is the Hamiltonian of a weakly reversible graph with rates:</p>
<div align="center">
<img src="https://i0.wp.com/math.ucr.edu/home/baez/networks/markov_process_diagram_1.png" alt="" />
</div>
<p>Then for each connected component <img src='https://s0.wp.com/latex.php?latex=C+%5Csubseteq+K%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='C &#92;subseteq K,' title='C &#92;subseteq K,' class='latex' /> there exists a unique probability distribution <img src='https://s0.wp.com/latex.php?latex=%5Cpsi_C+%5Cin+%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi_C &#92;in &#92;mathbb{R}^K' title='&#92;psi_C &#92;in &#92;mathbb{R}^K' class='latex' /> that is positive on that component, zero elsewhere, and is an equilibrium solution of the master equation:</p>
<p><img src='https://s0.wp.com/latex.php?latex=H+%5Cpsi_C+%3D+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;psi_C = 0 ' title='H &#92;psi_C = 0 ' class='latex' /></p>
<p>Moreover, these probability distributions <img src='https://s0.wp.com/latex.php?latex=%5Cpsi_C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi_C' title='&#92;psi_C' class='latex' /> form a basis for the space of equilibrium solutions of the master equation.  So, the dimension of this space is the number of components of <img src='https://s0.wp.com/latex.php?latex=K.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='K.' title='K.' class='latex' /></p>
<p><i>Proof.</i>   We start by assuming our graph has one connected component.  We use the Perron&ndash;Frobenius theorem, as explained in <a href="http://math.ucr.edu/home/baez/networks/networks_20.html">Part 20</a>.  This applies to &#8216;nonnegative&#8217; matrices, meaning those whose entries are all nonnegative.  That is not true of <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> itself, but only its diagonal entries can be negative, so if we choose a large enough number <img src='https://s0.wp.com/latex.php?latex=c+%3E+0%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='c &gt; 0,' title='c &gt; 0,' class='latex' /> <img src='https://s0.wp.com/latex.php?latex=H+%2B+c+I&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H + c I' title='H + c I' class='latex' /> will be nonnegative.  </p>
<p>Since our graph is weakly reversible and has one connected component, it follows straight from the definitions that the operator <img src='https://s0.wp.com/latex.php?latex=H+%2B+c+I&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H + c I' title='H + c I' class='latex' /> will also be &#8216;irreducible&#8217; in the sense of <a href="http://math.ucr.edu/home/baez/networks/networks_20.html">Part 20</a>.  The Perron&ndash;Frobenius theorem then swings into action, and we instantly conclude several things.</p>
<p>First, <img src='https://s0.wp.com/latex.php?latex=H+%2B+c+I&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H + c I' title='H + c I' class='latex' /> has a positive real eigenvalue <img src='https://s0.wp.com/latex.php?latex=r&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='r' title='r' class='latex' /> such that any other eigenvalue, possibly complex, has absolute value <img src='https://s0.wp.com/latex.php?latex=%5Cle+r.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;le r.' title='&#92;le r.' class='latex' />  Second, there is an eigenvector <img src='https://s0.wp.com/latex.php?latex=%5Cpsi&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi' title='&#92;psi' class='latex' /> with eigenvalue <img src='https://s0.wp.com/latex.php?latex=r&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='r' title='r' class='latex' /> and all positive components.  Third, any other eigenvector with eigenvalue <img src='https://s0.wp.com/latex.php?latex=r&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='r' title='r' class='latex' /> is a scalar multiple of <img src='https://s0.wp.com/latex.php?latex=%5Cpsi.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi.' title='&#92;psi.' class='latex' />  </p>
<p>Subtracting <img src='https://s0.wp.com/latex.php?latex=c%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='c,' title='c,' class='latex' /> it follows that <img src='https://s0.wp.com/latex.php?latex=%5Clambda+%3D+r+-+c&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;lambda = r - c' title='&#92;lambda = r - c' class='latex' /> is the eigenvalue of <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> with the largest real part.  We have <img src='https://s0.wp.com/latex.php?latex=H+%5Cpsi+%3D+%5Clambda+%5Cpsi%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;psi = &#92;lambda &#92;psi,' title='H &#92;psi = &#92;lambda &#92;psi,' class='latex' /> and any other vector with this property is a scalar multiple of <img src='https://s0.wp.com/latex.php?latex=%5Cpsi.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi.' title='&#92;psi.' class='latex' />   </p>
<p>We can show that in fact <img src='https://s0.wp.com/latex.php?latex=%5Clambda+%3D+0.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;lambda = 0.' title='&#92;lambda = 0.' class='latex' />  To do this we copy an argument from <a href="http://math.ucr.edu/home/baez/networks/networks_20.html">Part 20</a>.   First, since <img src='https://s0.wp.com/latex.php?latex=%5Cpsi&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi' title='&#92;psi' class='latex' /> is positive we can normalize it to be a probability distribution:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Csum_%7Bi+%5Cin+K%7D+%5Cpsi_i+%3D+1+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;sum_{i &#92;in K} &#92;psi_i = 1 } ' title='&#92;displaystyle{ &#92;sum_{i &#92;in K} &#92;psi_i = 1 } ' class='latex' /></p>
<p>Since <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> is infinitesimal stochastic, <img src='https://s0.wp.com/latex.php?latex=%5Cexp%28t+H%29&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;exp(t H)' title='&#92;exp(t H)' class='latex' /> sends probability distributions to probability distributions:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Csum_%7Bi+%5Cin+K%7D+%28%5Cexp%28t+H%29+%5Cpsi%29_i+%3D+1+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;sum_{i &#92;in K} (&#92;exp(t H) &#92;psi)_i = 1 } ' title='&#92;displaystyle{ &#92;sum_{i &#92;in K} (&#92;exp(t H) &#92;psi)_i = 1 } ' class='latex' /></p>
<p>for all <img src='https://s0.wp.com/latex.php?latex=t+%5Cge+0.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='t &#92;ge 0.' title='t &#92;ge 0.' class='latex' />  On the other hand,</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Csum_%7Bi+%5Cin+K%7D+%28%5Cexp%28t+H%29%5Cpsi%29_i+%3D+%5Csum_%7Bi+%5Cin+K%7D+e%5E%7Bt+%5Clambda%7D+%5Cpsi_i+%3D+e%5E%7Bt+%5Clambda%7D+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;sum_{i &#92;in K} (&#92;exp(t H)&#92;psi)_i = &#92;sum_{i &#92;in K} e^{t &#92;lambda} &#92;psi_i = e^{t &#92;lambda} } ' title='&#92;displaystyle{ &#92;sum_{i &#92;in K} (&#92;exp(t H)&#92;psi)_i = &#92;sum_{i &#92;in K} e^{t &#92;lambda} &#92;psi_i = e^{t &#92;lambda} } ' class='latex' /></p>
<p>so we must have <img src='https://s0.wp.com/latex.php?latex=%5Clambda+%3D+0.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;lambda = 0.' title='&#92;lambda = 0.' class='latex' />  </p>
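<p>As a numerical sanity check of the conservation step, we can crudely Euler-integrate the master equation and watch the total probability sit at 1 while the distribution relaxes. The Hamiltonian here is the 3-cycle with unit rates, my own choice of example:</p>

```python
# Euler-integrate d(psi)/dt = H psi for an infinitesimal stochastic H.
# Since each column of H sums to zero, each Euler step changes the total
# probability by dt * (sum of H psi) = 0, up to floating-point noise.
H = [[-1.0,  0.0,  1.0],
     [ 1.0, -1.0,  0.0],
     [ 0.0,  1.0, -1.0]]
psi = [1.0, 0.0, 0.0]            # start certain of being in state 0
dt = 0.001
for _ in range(10_000):          # integrate up to t = 10
    flow = [sum(H[i][j] * psi[j] for j in range(3)) for i in range(3)]
    psi = [psi[i] + dt * flow[i] for i in range(3)]

assert abs(sum(psi) - 1.0) < 1e-9             # probability is conserved
assert all(abs(p - 1/3) < 1e-3 for p in psi)  # and we approach equilibrium
```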
<p>We conclude that when our graph has one connected component, there is a probability distribution <img src='https://s0.wp.com/latex.php?latex=%5Cpsi+%5Cin+%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi &#92;in &#92;mathbb{R}^K' title='&#92;psi &#92;in &#92;mathbb{R}^K' class='latex' /> that is positive everywhere and has <img src='https://s0.wp.com/latex.php?latex=H+%5Cpsi+%3D+0.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;psi = 0.' title='H &#92;psi = 0.' class='latex' />  Moreover, any <img src='https://s0.wp.com/latex.php?latex=%5Cphi+%5Cin+%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;phi &#92;in &#92;mathbb{R}^K' title='&#92;phi &#92;in &#92;mathbb{R}^K' class='latex' /> with <img src='https://s0.wp.com/latex.php?latex=H+%5Cphi+%3D+0&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;phi = 0' title='H &#92;phi = 0' class='latex' /> is a scalar multiple of <img src='https://s0.wp.com/latex.php?latex=%5Cpsi.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi.' title='&#92;psi.' class='latex' /></p>
<p>When <img src='https://s0.wp.com/latex.php?latex=K&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='K' title='K' class='latex' /> has several components, the matrix <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> is block diagonal, with one block for each component.  So, we can run the above argument on each component <img src='https://s0.wp.com/latex.php?latex=C+%5Csubseteq+K&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='C &#92;subseteq K' title='C &#92;subseteq K' class='latex' /> and get a probability distribution <img src='https://s0.wp.com/latex.php?latex=%5Cpsi_C+%5Cin+%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi_C &#92;in &#92;mathbb{R}^K' title='&#92;psi_C &#92;in &#92;mathbb{R}^K' class='latex' /> that is positive on <img src='https://s0.wp.com/latex.php?latex=C.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='C.' title='C.' class='latex' />  We can then check that <img src='https://s0.wp.com/latex.php?latex=H+%5Cpsi_C+%3D+0&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;psi_C = 0' title='H &#92;psi_C = 0' class='latex' /> and that every <img src='https://s0.wp.com/latex.php?latex=%5Cphi+%5Cin+%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;phi &#92;in &#92;mathbb{R}^K' title='&#92;phi &#92;in &#92;mathbb{R}^K' class='latex' /> with <img src='https://s0.wp.com/latex.php?latex=H+%5Cphi+%3D+0&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H &#92;phi = 0' title='H &#92;phi = 0' class='latex' /> can be expressed as a linear combination of these probability distributions <img src='https://s0.wp.com/latex.php?latex=%5Cpsi_C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi_C' title='&#92;psi_C' class='latex' /> in a unique way.   &nbsp;  &#9608;</p>
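<p>Here's the block-diagonal picture in miniature: two disjoint 2-state cycles, so two connected components, and sure enough one independent equilibrium supported on each. The rate constants are my own invented example; for a 2-cycle with rates <i>a</i>: 0 &rarr; 1 and <i>b</i>: 1 &rarr; 0, the equilibrium is proportional to (<i>b</i>, <i>a</i>):</p>

```python
# Two disjoint 2-state cycles: states {0, 1} with rates a = 2 (0 -> 1)
# and b = 1 (1 -> 0), and states {2, 3} with unit rates both ways.
# H is block diagonal, one block per connected component.
H = [[-2.0,  1.0,  0.0,  0.0],
     [ 2.0, -1.0,  0.0,  0.0],
     [ 0.0,  0.0, -1.0,  1.0],
     [ 0.0,  0.0,  1.0, -1.0]]

def apply(H, psi):
    return [sum(H[i][j] * psi[j] for j in range(4)) for i in range(4)]

# one equilibrium probability distribution per component:
psi_C1 = [1/3, 2/3, 0.0, 0.0]   # (b, a)/(a + b) on the first component
psi_C2 = [0.0, 0.0, 0.5, 0.5]   # uniform on the second component
assert all(abs(x) < 1e-12 for x in apply(H, psi_C1))
assert all(abs(x) < 1e-12 for x in apply(H, psi_C2))
```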
<p>This result must be absurdly familiar to people who study Markov processes, but I haven&#8217;t bothered to look up a reference yet.  Do you happen to know a good one?  I&#8217;d like to see one that generalizes this theorem to graphs that aren&#8217;t weakly reversible.  I think I see how it goes.  We don&#8217;t need that generalization right now, but it would be good to have around.</p>
<h3> The Hamiltonian, revisited </h3>
<p>One last small piece of business: <a href="http://math.ucr.edu/home/baez/networks/networks_22.html">last time</a> I showed you a very slick formula for the Hamiltonian <img src='https://s0.wp.com/latex.php?latex=H.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H.' title='H.' class='latex' />  I&#8217;d like to prove it agrees with the formula I gave this time.</p>
<p>We start with any graph with rates:</p>
<div align="center">
<img src="https://i0.wp.com/math.ucr.edu/home/baez/networks/markov_process_diagram_1.png" alt="" />
</div>
<p>We extend <img src='https://s0.wp.com/latex.php?latex=s&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='s' title='s' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=t&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='t' title='t' class='latex' /> to linear maps between vector spaces:</p>
<div align="center">
<img src="https://i0.wp.com/math.ucr.edu/home/baez/networks/markov_process_diagram_4.png" alt="" /></div>
<p>We define the <b>boundary operator</b> just as we did last time:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cpartial+%3D+t+-+s+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;partial = t - s ' title='&#92;partial = t - s ' class='latex' /></p>
<p>Then we put an inner product on the vector spaces <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5ET&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^T' title='&#92;mathbb{R}^T' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5EK.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^K.' title='&#92;mathbb{R}^K.' class='latex' />  So, for <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5EK&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^K' title='&#92;mathbb{R}^K' class='latex' /> we let the elements of <img src='https://s0.wp.com/latex.php?latex=K&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='K' title='K' class='latex' /> be an orthonormal basis, but for <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5ET&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^T' title='&#92;mathbb{R}^T' class='latex' /> we define the inner product in a more clever way involving the rate constants:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Clangle+%5Ctau%2C+%5Ctau%27+%5Crangle+%3D+%5Cfrac%7B1%7D%7Br%28%5Ctau%29%7D+%5Cdelta_%7B%5Ctau%2C+%5Ctau%27%7D+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;langle &#92;tau, &#92;tau&#039; &#92;rangle = &#92;frac{1}{r(&#92;tau)} &#92;delta_{&#92;tau, &#92;tau&#039;} } ' title='&#92;displaystyle{ &#92;langle &#92;tau, &#92;tau&#039; &#92;rangle = &#92;frac{1}{r(&#92;tau)} &#92;delta_{&#92;tau, &#92;tau&#039;} } ' class='latex' /></p>
<p>where <img src='https://s0.wp.com/latex.php?latex=%5Ctau%2C+%5Ctau%27+%5Cin+T.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau, &#92;tau&#039; &#92;in T.' title='&#92;tau, &#92;tau&#039; &#92;in T.' class='latex' />  This lets us define adjoints of the maps <img src='https://s0.wp.com/latex.php?latex=s%2C+t&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='s, t' title='s, t' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=%5Cpartial%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;partial,' title='&#92;partial,' class='latex' /> via formulas like this:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Clangle+s%5E%5Cdagger+%5Cphi%2C+%5Cpsi+%5Crangle+%3D+%5Clangle+%5Cphi%2C+s+%5Cpsi+%5Crangle+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;langle s^&#92;dagger &#92;phi, &#92;psi &#92;rangle = &#92;langle &#92;phi, s &#92;psi &#92;rangle ' title='&#92;langle s^&#92;dagger &#92;phi, &#92;psi &#92;rangle = &#92;langle &#92;phi, s &#92;psi &#92;rangle ' class='latex' /></p>
<p>Then:</p>
<p><b>Theorem.</b>  The Hamiltonian for a graph with rates is given by </p>
<p><img src='https://s0.wp.com/latex.php?latex=H+%3D+%5Cpartial+s%5E%5Cdagger+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H = &#92;partial s^&#92;dagger ' title='H = &#92;partial s^&#92;dagger ' class='latex' /></p>
<p><i>Proof.</i>  It suffices to check that this formula agrees with the formula for <img src='https://s0.wp.com/latex.php?latex=H&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='H' title='H' class='latex' /> given in Puzzle 1:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+++H+%5Ckappa+%3D+%5Csum_%7Bs%28%5Ctau%29+%3D+%5Ckappa%7D+r%28%5Ctau%29+%28t%28%5Ctau%29+-+s%28%5Ctau%29%29+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{   H &#92;kappa = &#92;sum_{s(&#92;tau) = &#92;kappa} r(&#92;tau) (t(&#92;tau) - s(&#92;tau)) } ' title='&#92;displaystyle{   H &#92;kappa = &#92;sum_{s(&#92;tau) = &#92;kappa} r(&#92;tau) (t(&#92;tau) - s(&#92;tau)) } ' class='latex' /> </p>
<p>Here we are using the complex <img src='https://s0.wp.com/latex.php?latex=%5Ckappa+%5Cin+K&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;kappa &#92;in K' title='&#92;kappa &#92;in K' class='latex' /> as a name for one of the standard basis vectors of <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5EK.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^K.' title='&#92;mathbb{R}^K.' class='latex' />   Similarly, we shall write things like <img src='https://s0.wp.com/latex.php?latex=%5Ctau&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau' title='&#92;tau' class='latex' /> or <img src='https://s0.wp.com/latex.php?latex=%5Ctau%27&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau&#039;' title='&#92;tau&#039;' class='latex' /> for basis vectors of <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5ET.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^T.' title='&#92;mathbb{R}^T.' class='latex' /></p>
<p>First, we claim that</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+s%5E%5Cdagger+%5Ckappa+%3D+%5Csum_%7B%5Ctau%3A+%5C%3B+s%28%5Ctau%29+%3D+%5Ckappa%7D+r%28%5Ctau%29+%5C%2C+%5Ctau+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ s^&#92;dagger &#92;kappa = &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;tau } ' title='&#92;displaystyle{ s^&#92;dagger &#92;kappa = &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;tau } ' class='latex' /></p>
<p>To prove this, it&#8217;s enough to check that taking the inner product of each side with an arbitrary basis vector <img src='https://s0.wp.com/latex.php?latex=%5Ctau%27%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;tau&#039;,' title='&#92;tau&#039;,' class='latex' /> we get results that agree.  On the one hand:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cbegin%7Barray%7D%7Bccl%7D++%5Clangle+%5Ctau%27+%2C+s%5E%5Cdagger+%5Ckappa+%5Crangle+%26%3D%26+++%5Clangle+s+%5Ctau%27%2C+%5Ckappa+%5Crangle+%5C%5C++%5C%5C++%26%3D%26+%5Cdelta_%7Bs%28%5Ctau%27%29%2C+%5Ckappa%7D++++%5Cend%7Barray%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;begin{array}{ccl}  &#92;langle &#92;tau&#039; , s^&#92;dagger &#92;kappa &#92;rangle &amp;=&amp;   &#92;langle s &#92;tau&#039;, &#92;kappa &#92;rangle &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;delta_{s(&#92;tau&#039;), &#92;kappa}    &#92;end{array} ' title='&#92;begin{array}{ccl}  &#92;langle &#92;tau&#039; , s^&#92;dagger &#92;kappa &#92;rangle &amp;=&amp;   &#92;langle s &#92;tau&#039;, &#92;kappa &#92;rangle &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;delta_{s(&#92;tau&#039;), &#92;kappa}    &#92;end{array} ' class='latex' /></p>
<p>On the other hand:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cbegin%7Barray%7D%7Bccl%7D+%5Cdisplaystyle%7B+%5Clangle+%5Ctau%27%2C+%5Csum_%7B%5Ctau%3A+%5C%3B+s%28%5Ctau%29+%3D+%5Ckappa%7D+r%28%5Ctau%29+%5C%2C+%5Ctau+%5Crangle+%7D+%26%3D%26++%5Csum_%7B%5Ctau%3A+%5C%3B+s%28%5Ctau%29+%3D+%5Ckappa%7D+r%28%5Ctau%29+%5C%2C+%5Clangle+%5Ctau%27%2C+%5Ctau+%5Crangle+++%5C%5C++%5C%5C++%26%3D%26+%5Cdisplaystyle%7B+%5Csum_%7B%5Ctau%3A+%5C%3B+s%28%5Ctau%29+%3D+%5Ckappa%7D+%5Cdelta_%7B%5Ctau%27%2C+%5Ctau%7D+%7D+++%5C%5C++%5C%5C++%26%3D%26+++%5Cdelta_%7Bs%28%5Ctau%27%29%2C+%5Ckappa%7D+++%5Cend%7Barray%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;begin{array}{ccl} &#92;displaystyle{ &#92;langle &#92;tau&#039;, &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;tau &#92;rangle } &amp;=&amp;  &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;langle &#92;tau&#039;, &#92;tau &#92;rangle   &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;displaystyle{ &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} &#92;delta_{&#92;tau&#039;, &#92;tau} }   &#92;&#92;  &#92;&#92;  &amp;=&amp;   &#92;delta_{s(&#92;tau&#039;), &#92;kappa}   &#92;end{array} ' title='&#92;begin{array}{ccl} &#92;displaystyle{ &#92;langle &#92;tau&#039;, &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;tau &#92;rangle } &amp;=&amp;  &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;langle &#92;tau&#039;, &#92;tau &#92;rangle   &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;displaystyle{ &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} &#92;delta_{&#92;tau&#039;, &#92;tau} }   &#92;&#92;  &#92;&#92;  &amp;=&amp;   &#92;delta_{s(&#92;tau&#039;), &#92;kappa}   &#92;end{array} ' class='latex' /></p>
<p>where the factor of <img src='https://s0.wp.com/latex.php?latex=1%2Fr%28%5Ctau%29&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='1/r(&#92;tau)' title='1/r(&#92;tau)' class='latex' /> in the inner product on <img src='https://s0.wp.com/latex.php?latex=%5Cmathbb%7BR%7D%5ET&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;mathbb{R}^T' title='&#92;mathbb{R}^T' class='latex' /> cancels the visible factor of <img src='https://s0.wp.com/latex.php?latex=r%28%5Ctau%29.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='r(&#92;tau).' title='r(&#92;tau).' class='latex' />    So indeed the results match.</p>
<p>Using this formula for <img src='https://s0.wp.com/latex.php?latex=s%5E%5Cdagger+%5Ckappa&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='s^&#92;dagger &#92;kappa' title='s^&#92;dagger &#92;kappa' class='latex' />  we now see that</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cbegin%7Barray%7D%7Bccl%7D++H+%5Ckappa+%26%3D%26+%5Cpartial+s%5E%5Cdagger+%5Ckappa+++%5C%5C++%5C%5C++%26%3D%26+%5Cpartial+%5Cdisplaystyle%7B+%5Csum_%7B%5Ctau%3A+%5C%3B+s%28%5Ctau%29+%3D+%5Ckappa%7D+r%28%5Ctau%29+%5C%2C+%5Ctau+%7D++++%5C%5C++%5C%5C++%26%3D%26+%5Cdisplaystyle%7B+%5Csum_%7B%5Ctau%3A+%5C%3B+s%28%5Ctau%29+%3D+%5Ckappa%7D+r%28%5Ctau%29+%5C%2C+%28t%28%5Ctau%29+-+s%28%5Ctau%29%29+%7D++%5Cend%7Barray%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;begin{array}{ccl}  H &#92;kappa &amp;=&amp; &#92;partial s^&#92;dagger &#92;kappa   &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;partial &#92;displaystyle{ &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;tau }    &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;displaystyle{ &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, (t(&#92;tau) - s(&#92;tau)) }  &#92;end{array} ' title='&#92;begin{array}{ccl}  H &#92;kappa &amp;=&amp; &#92;partial s^&#92;dagger &#92;kappa   &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;partial &#92;displaystyle{ &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, &#92;tau }    &#92;&#92;  &#92;&#92;  &amp;=&amp; &#92;displaystyle{ &#92;sum_{&#92;tau: &#92;; s(&#92;tau) = &#92;kappa} r(&#92;tau) &#92;, (t(&#92;tau) - s(&#92;tau)) }  &#92;end{array} ' class='latex' /></p>
<p>which is precisely what we want.  &nbsp;  &#9608;</p>
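<p>If you like to check such things by computer, here is a small numerical sketch of the theorem, assuming NumPy; the 3-state graph and its rates are invented for illustration.  With the standard inner product on the span of complexes and the inner product weighted by <code>1/r</code> on the span of transitions, the adjoint works out to <code>s_dagger = diag(r) @ S.T</code>, and the slick formula <code>H = (T - S) @ s_dagger</code> agrees with the direct sum-over-transitions formula:</p>

```python
import numpy as np

# Hypothetical graph with rates: complexes K = {0,1,2}, transitions
# given as (source, target, rate).  The numbers are made up.
transitions = [(0, 1, 2.0), (1, 2, 1.0), (2, 0, 3.0), (1, 0, 0.5)]
nK, nT = 3, len(transitions)

# s and t as linear maps R^T -> R^K (matrices with nK rows, nT columns)
S = np.zeros((nK, nT))
T = np.zeros((nK, nT))
r = np.array([rate for (_, _, rate) in transitions])
for j, (src, tgt, _) in enumerate(transitions):
    S[src, j] = 1.0
    T[tgt, j] = 1.0

# With <tau, tau'> = delta_{tau,tau'} / r(tau) on R^T and the standard
# inner product on R^K, the adjoint of s is s^dagger = diag(r) S^T,
# matching s^dagger kappa = sum_{s(tau)=kappa} r(tau) tau.
s_dagger = np.diag(r) @ S.T

# The slick formula: H = (t - s) s^dagger
H = (T - S) @ s_dagger

# The direct formula: H kappa = sum_{s(tau)=kappa} r(tau) (t(tau) - s(tau))
H_direct = np.zeros((nK, nK))
for src, tgt, rate in transitions:
    H_direct[tgt, src] += rate
    H_direct[src, src] -= rate

assert np.allclose(H, H_direct)
```

<p>The weight <code>1/r</code> in the inner product is exactly what makes the rate constant reappear in <code>s_dagger</code>, just as in the proof above.</p>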
<p>I hope you see through the formulas to their intuitive meaning.  As usual, the formulas are just a way of precisely saying something that makes plenty of sense.  If <img src='https://s0.wp.com/latex.php?latex=%5Ckappa&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;kappa' title='&#92;kappa' class='latex' /> is some state of our Markov process, <img src='https://s0.wp.com/latex.php?latex=s%5E%5Cdagger+%5Ckappa&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='s^&#92;dagger &#92;kappa' title='s^&#92;dagger &#92;kappa' class='latex' /> is the sum of all transitions starting at this state, weighted by their rates.  Applying <img src='https://s0.wp.com/latex.php?latex=%5Cpartial&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;partial' title='&#92;partial' class='latex' /> to a transition tells us what change in state it causes.  So <img src='https://s0.wp.com/latex.php?latex=%5Cpartial+s%5E%5Cdagger+%5Ckappa&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;partial s^&#92;dagger &#92;kappa' title='&#92;partial s^&#92;dagger &#92;kappa' class='latex' /> tells us the rate at which things change when we start in the state <img src='https://s0.wp.com/latex.php?latex=%5Ckappa.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;kappa.' title='&#92;kappa.' class='latex' /> That&#8217;s why <img src='https://s0.wp.com/latex.php?latex=%5Cpartial+s%5E%5Cdagger&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;partial s^&#92;dagger' title='&#92;partial s^&#92;dagger' class='latex' /> is the Hamiltonian for our Markov process.  After all, the Hamiltonian tells us how things change:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Cfrac%7Bd+%5Cpsi%7D%7Bd+t%7D+%3D+H+%5Cpsi+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;frac{d &#92;psi}{d t} = H &#92;psi } ' title='&#92;displaystyle{ &#92;frac{d &#92;psi}{d t} = H &#92;psi } ' class='latex' /></p>
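<p>To see the master equation in action, here is a minimal numerical sketch, assuming NumPy; the 2-state process and its rates are chosen purely for illustration.  Because the columns of <code>H</code> sum to zero, total probability is conserved, and <img src='https://s0.wp.com/latex.php?latex=%5Cpsi&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;psi' title='&#92;psi' class='latex' /> flows toward an equilibrium with <code>H @ psi = 0</code>:</p>

```python
import numpy as np

# Master equation d psi / dt = H psi for a hypothetical 2-state process:
# state 0 -> 1 at rate 2, state 1 -> 0 at rate 1.
H = np.array([[-2.0,  1.0],
              [ 2.0, -1.0]])

psi = np.array([1.0, 0.0])   # start with all probability on state 0
dt, steps = 0.001, 10_000    # integrate to t = 10 by forward Euler
for _ in range(steps):
    psi = psi + dt * (H @ psi)

# Total probability is conserved, since each column of H sums to zero
print(psi.sum())   # ~1.0

# psi approaches the equilibrium distribution (1/3, 2/3), which is
# the (normalized) kernel of H: -2*(1/3) + 1*(2/3) = 0
print(psi)
```

<p>Forward Euler conserves the total probability exactly here, because the update adds <code>dt * H @ psi</code>, whose entries sum to zero.</p>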
<p>Okay, we&#8217;ve got all the machinery in place.  Next time we&#8217;ll prove the deficiency zero theorem!</p>
]]></html><thumbnail_url><![CDATA[https://i1.wp.com/math.ucr.edu/home/baez/networks/chemical_reaction_network_part_20_III.png?fit=440%2C330]]></thumbnail_url><thumbnail_height><![CDATA[248]]></thumbnail_height><thumbnail_width><![CDATA[242]]></thumbnail_width></oembed>