<?xml version="1.0" encoding="UTF-8" standalone="yes"?><oembed><version><![CDATA[1.0]]></version><provider_name><![CDATA[Azimuth]]></provider_name><provider_url><![CDATA[https://johncarlosbaez.wordpress.com]]></provider_url><author_name><![CDATA[John Baez]]></author_name><author_url><![CDATA[https://johncarlosbaez.wordpress.com/author/johncarlosbaez/]]></author_url><title><![CDATA[An Entropy Challenge]]></title><type><![CDATA[link]]></type><html><![CDATA[<p>If you like computer calculations, here&#8217;s a little challenge for you.  <a href="http://www.quantumlah.org/people/Dahlsten">Oscar Dahlsten</a> may have solved it, but we&#8217;d love for you to check his work.   It&#8217;s pretty important for the foundations of thermodynamics, but you don&#8217;t need to know any physics or even anything beyond a little algebra to tackle it!  First I&#8217;ll explain it in really simple terms, then I&#8217;ll remind you a bit of why it matters.</p>
<p>We&#8217;re looking for two lists of nonnegative numbers, of the same length, listed in decreasing order:</p>
<p><img src='https://s0.wp.com/latex.php?latex=p_1+%5Cge+p_2+%5Cge+%5Ccdots+%5Cge+p_n+%5Cge+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p_1 &#92;ge p_2 &#92;ge &#92;cdots &#92;ge p_n &#92;ge 0 ' title='p_1 &#92;ge p_2 &#92;ge &#92;cdots &#92;ge p_n &#92;ge 0 ' class='latex' /></p>
<p><img src='https://s0.wp.com/latex.php?latex=q_1+%5Cge+q_2+%5Cge+%5Ccdots+%5Cge+q_n+%5Cge+0+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q_1 &#92;ge q_2 &#92;ge &#92;cdots &#92;ge q_n &#92;ge 0 ' title='q_1 &#92;ge q_2 &#92;ge &#92;cdots &#92;ge q_n &#92;ge 0 ' class='latex' /></p>
<p>that sum to 1:</p>
<p><img src='https://s0.wp.com/latex.php?latex=p_1+%2B+%5Ccdots+%2B+p_n+%3D+1&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p_1 + &#92;cdots + p_n = 1' title='p_1 + &#92;cdots + p_n = 1' class='latex' /></p>
<p><img src='https://s0.wp.com/latex.php?latex=q_1+%2B+%5Ccdots+%2B+q_n+%3D+1&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q_1 + &#92;cdots + q_n = 1' title='q_1 + &#92;cdots + q_n = 1' class='latex' /></p>
<p>and that obey this inequality:</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+%5Cfrac%7B1%7D%7B1+-+%5Cbeta%7D+%5Cln+%5Csum_%7Bi%3D1%7D%5En+p_i%5E%5Cbeta++%5Cle++%5Cfrac%7B1%7D%7B1+-+%5Cbeta%7D+%5Cln+%5Csum_%7Bi%3D1%7D%5En+q_i%5E%5Cbeta+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ &#92;frac{1}{1 - &#92;beta} &#92;ln &#92;sum_{i=1}^n p_i^&#92;beta  &#92;le  &#92;frac{1}{1 - &#92;beta} &#92;ln &#92;sum_{i=1}^n q_i^&#92;beta } ' title='&#92;displaystyle{ &#92;frac{1}{1 - &#92;beta} &#92;ln &#92;sum_{i=1}^n p_i^&#92;beta  &#92;le  &#92;frac{1}{1 - &#92;beta} &#92;ln &#92;sum_{i=1}^n q_i^&#92;beta } ' class='latex' /></p>
<p>for all <img src='https://s0.wp.com/latex.php?latex=0+%3C+%5Cbeta+%3C+%5Cinfty&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='0 &lt; &#92;beta &lt; &#92;infty' title='0 &lt; &#92;beta &lt; &#92;infty' class='latex' /> (ignoring <img src='https://s0.wp.com/latex.php?latex=%5Cbeta+%3D+1&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;beta = 1' title='&#92;beta = 1' class='latex' />), yet do <i>not</i> obey these inequalities:</p>
<p><img src='https://s0.wp.com/latex.php?latex=p_1+%2B+%5Ccdots+%2B+p_k+%5Cge+q_1+%2B+%5Ccdots+%2B+q_k+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p_1 + &#92;cdots + p_k &#92;ge q_1 + &#92;cdots + q_k ' title='p_1 + &#92;cdots + p_k &#92;ge q_1 + &#92;cdots + q_k ' class='latex' /></p>
<p>for all <img src='https://s0.wp.com/latex.php?latex=1+%5Cle+k+%5Cle+n.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='1 &#92;le k &#92;le n.' title='1 &#92;le k &#92;le n.' class='latex' /></p>
<p>Oscar&#8217;s proposed solution is this:</p>
<p><img src='https://s0.wp.com/latex.php?latex=p+%3D+%280.4%2C+0.29%2C+0.29%2C+0.02%29&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p = (0.4, 0.29, 0.29, 0.02)' title='p = (0.4, 0.29, 0.29, 0.02)' class='latex' /> </p>
<p><img src='https://s0.wp.com/latex.php?latex=q+%3D+%280.39%2C+0.31%2C+0.2%2C+0.1%29&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q = (0.39, 0.31, 0.2, 0.1)' title='q = (0.39, 0.31, 0.2, 0.1)' class='latex' /></p>
<p>Can you see if this works?  Is there a simpler example, like one with lists of just 3 numbers?</p>
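Since the challenge is explicitly computational, here is one way to script the check. This is a sketch of my own, not part of the original post: it tests whether the partial-sum (majorization) inequalities fail for Oscar's lists while the Rényi-entropy inequality holds on a grid of sampled β values (skipping the excluded value β = 1). A grid check is of course only evidence, not a proof for all β.

```python
import math

# Oscar's proposed lists, already in decreasing order and summing to 1
p = [0.4, 0.29, 0.29, 0.02]
q = [0.39, 0.31, 0.2, 0.1]

def renyi(dist, beta):
    """Renyi entropy H_beta = (1/(1-beta)) ln sum_i dist_i^beta, for beta != 1."""
    return math.log(sum(x ** beta for x in dist)) / (1 - beta)

def majorizes(a, b):
    """True if a majorizes b: every partial sum of a (sorted in decreasing
    order) is >= the corresponding partial sum of b."""
    a, b = sorted(a, reverse=True), sorted(b, reverse=True)
    sa = sb = 0.0
    for x, y in zip(a, b):
        sa += x
        sb += y
        if sa < sb - 1e-12:  # small tolerance for floating-point error
            return False
    return True

# p should NOT majorize q: at k = 2 the partial sums are 0.69 < 0.70.
print(majorizes(p, q))  # False

# Yet H_beta(p) <= H_beta(q) should hold at every sampled beta.
betas = [k / 100 for k in range(1, 1001) if k != 100]  # 0.01 .. 10, skip beta = 1
print(all(renyi(p, b) <= renyi(q, b) + 1e-12 for b in betas))
```

If the second line prints `True`, Oscar's example passes at least this sampled test; a full answer would still need an argument covering all β &gt; 0.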
<p>This question came up near the end of my post <a href="https://johncarlosbaez.wordpress.com/2012/08/24/more-second-laws-of-thermodynamics/">More Second Laws of Thermodynamics</a>.  I phrased the question with a bit more jargon, and said a lot more about its significance.  Suppose we have two probability distributions on a finite set, say <img src='https://s0.wp.com/latex.php?latex=p&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p' title='p' class='latex' /> and <img src='https://s0.wp.com/latex.php?latex=q.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q.' title='q.' class='latex' />  We say <img src='https://s0.wp.com/latex.php?latex=p&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p' title='p' class='latex' /> <b>majorizes</b> <img src='https://s0.wp.com/latex.php?latex=q&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q' title='q' class='latex' /> if</p>
<p><img src='https://s0.wp.com/latex.php?latex=p_1+%2B+%5Ccdots+%2B+p_k+%5Cge+q_1+%2B+%5Ccdots+%2B+q_k+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p_1 + &#92;cdots + p_k &#92;ge q_1 + &#92;cdots + q_k ' title='p_1 + &#92;cdots + p_k &#92;ge q_1 + &#92;cdots + q_k ' class='latex' /></p>
<p>for all <img src='https://s0.wp.com/latex.php?latex=1+%5Cle+k+%5Cle+n%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='1 &#92;le k &#92;le n,' title='1 &#92;le k &#92;le n,' class='latex' /> when we write both lists of numbers in decreasing order.  This means <img src='https://s0.wp.com/latex.php?latex=p&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p' title='p' class='latex' /> is &#8216;less flat&#8217; than <img src='https://s0.wp.com/latex.php?latex=q&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q' title='q' class='latex' />, so it should have less entropy.  And indeed it does: not just for ordinary entropy, but also for R&eacute;nyi entropy!  The <b>R&eacute;nyi entropy</b> of <img src='https://s0.wp.com/latex.php?latex=p&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p' title='p' class='latex' /> is defined by</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+H_%5Cbeta%28p%29+%3D+%5Cfrac%7B1%7D%7B1+-+%5Cbeta%7D+%5Cln+%5Csum_%7Bi%3D1%7D%5En+p_i%5E%5Cbeta++%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ H_&#92;beta(p) = &#92;frac{1}{1 - &#92;beta} &#92;ln &#92;sum_{i=1}^n p_i^&#92;beta  } ' title='&#92;displaystyle{ H_&#92;beta(p) = &#92;frac{1}{1 - &#92;beta} &#92;ln &#92;sum_{i=1}^n p_i^&#92;beta  } ' class='latex' /></p>
<p>where <img src='https://s0.wp.com/latex.php?latex=0+%3C+%5Cbeta+%3C+1&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='0 &lt; &#92;beta &lt; 1' title='0 &lt; &#92;beta &lt; 1' class='latex' /> or <img src='https://s0.wp.com/latex.php?latex=1+%3C+%5Cbeta+%3C+%5Cinfty&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='1 &lt; &#92;beta &lt; &#92;infty' title='1 &lt; &#92;beta &lt; &#92;infty' class='latex' />.  We can also define R&eacute;nyi entropy for <img src='https://s0.wp.com/latex.php?latex=%5Cbeta+%3D+0%2C+1%2C+%5Cinfty&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;beta = 0, 1, &#92;infty' title='&#92;beta = 0, 1, &#92;infty' class='latex' /> by taking a limit, and at <img src='https://s0.wp.com/latex.php?latex=%5Cbeta+%3D+1&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;beta = 1' title='&#92;beta = 1' class='latex' /> we get the ordinary entropy</p>
<p><img src='https://s0.wp.com/latex.php?latex=%5Cdisplaystyle%7B+H_1%28p%29+%3D+-+%5Csum_%7Bi+%3D+1%7D%5En+p_i+%5Cln+%28p_i%29+%7D+&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='&#92;displaystyle{ H_1(p) = - &#92;sum_{i = 1}^n p_i &#92;ln (p_i) } ' title='&#92;displaystyle{ H_1(p) = - &#92;sum_{i = 1}^n p_i &#92;ln (p_i) } ' class='latex' /></p>
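As a quick sanity check on that limit (my own illustration, not from the post), one can evaluate the Rényi formula at β values approaching 1 and watch it converge to the ordinary entropy; here I reuse Oscar's list p just as a convenient example distribution:

```python
import math

p = [0.4, 0.29, 0.29, 0.02]  # Oscar's proposed distribution, used for illustration

def renyi(dist, beta):
    # H_beta(p) = (1/(1-beta)) ln sum_i p_i^beta, defined for beta != 1
    return math.log(sum(x ** beta for x in dist)) / (1 - beta)

def shannon(dist):
    # H_1(p) = -sum_i p_i ln p_i, the ordinary entropy
    return -sum(x * math.log(x) for x in dist if x > 0)

# The Renyi entropy at beta near 1 approaches the Shannon entropy.
for beta in (0.9, 0.99, 0.999, 1.001, 1.01, 1.1):
    print(beta, renyi(p, beta))
print("Shannon:", shannon(p))
```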
<p>The question is whether majorization is more powerful than R&eacute;nyi entropy as a tool to tell when one probability distribution is less flat than another.  I know that if <img src='https://s0.wp.com/latex.php?latex=p&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='p' title='p' class='latex' /> majorizes <img src='https://s0.wp.com/latex.php?latex=q%2C&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q,' title='q,' class='latex' /> its R&eacute;nyi entropy is less than that of <img src='https://s0.wp.com/latex.php?latex=q&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='q' title='q' class='latex' /> for all <img src='https://s0.wp.com/latex.php?latex=0+%5Cle+%5Cbeta+%5Cle+%5Cinfty.&#038;bg=ffffff&#038;fg=000&#038;s=0' alt='0 &#92;le &#92;beta &#92;le &#92;infty.' title='0 &#92;le &#92;beta &#92;le &#92;infty.' class='latex' />  Your mission, should you choose to accept it, is to show the converse is not true.  </p>
]]></html></oembed>