Analysis of Entropy on Two Distributions: Proving $H(X) \leq H(X')$

Let $P = \{p_1, p_2, p_3, \dots, p_n\}$ and $P' = \left\{ \dfrac{p_1 + p_2}{2}, \dfrac{p_1 + p_2}{2}, p_3, \dots, p_n \right\}$ be distributions on the same random variable $X$.



1. Show $H(X) \leq H(X')$, where $H(X)$ is Shannon's entropy: $$H(X) = -\sum_{i=1}^n p_i \log_2 p_i$$



This makes sense since $P'$ is closer to the uniform distribution, which maximizes entropy, but I'm not sure how to go about proving it. I believe the expectation is that we use the fact that entropy can be decomposed into its binary components (with normalization along the way).



2. Define $P'' = \left\{ p_1, \dots, p_{i-1}, \dfrac{p_i + p_j}{2}, p_{i+1}, \dots, p_{j-1}, \dfrac{p_i + p_j}{2}, p_{j+1}, \dots, p_n \right\}$. Use the "permutation principle" and part (1) to show $H(X) \leq H(X'')$.

Tags: information-theory entropy

Asked 2 days ago by user57753 (a new contributor); edited 2 days ago.
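Both claims are easy to sanity-check numerically before proving them. Below is a minimal sketch (plain NumPy; the helper name `entropy_bits` and the Dirichlet sampling are illustrative assumptions, not from the thread) that draws random distributions, forms $P'$ and $P''$, and confirms the entropy never decreases:

    import numpy as np

    def entropy_bits(p):
        """Shannon entropy -sum p_i log2 p_i, with the convention 0 log 0 = 0."""
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    rng = np.random.default_rng(0)
    for _ in range(1000):
        n = int(rng.integers(3, 8))
        p = rng.dirichlet(np.ones(n))            # a random distribution P

        # P': replace the first two masses by their average
        p_prime = p.copy()
        p_prime[0] = p_prime[1] = (p[0] + p[1]) / 2
        assert entropy_bits(p) <= entropy_bits(p_prime) + 1e-12

        # P'': average two arbitrary positions i < j
        i, j = sorted(rng.choice(n, size=2, replace=False))
        p_dprime = p.copy()
        p_dprime[i] = p_dprime[j] = (p[i] + p[j]) / 2
        assert entropy_bits(p) <= entropy_bits(p_dprime) + 1e-12

    print("H never decreased on any trial.")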







Comments:

• stochasticboy321 (2 days ago): What did you try? There's a direct approach, which is just to expand $H(X') - H(X)$, cancel terms, and argue via convexity of $-\log$. A more operational approach is to note that $X'$ is the output of pushing $X$ through a channel that randomises the first two symbols, and then using the data processing inequality.
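To make the channel view concrete, here is a small simulation sketch (my own construction for illustration, not from the thread): samples of $X$ are passed through a channel that, whenever the symbol is one of the first two, replaces it by a fair coin flip between them; the empirical output law matches $P'$.

    import numpy as np

    rng = np.random.default_rng(1)
    p = np.array([0.5, 0.1, 0.3, 0.1])            # P, symbols 0..3

    x = rng.choice(len(p), size=200_000, p=p)     # samples of X
    mask = x < 2                                  # the first two symbols
    x_prime = x.copy()
    x_prime[mask] = rng.integers(0, 2, size=mask.sum())   # re-randomise uniformly

    # Empirical law of X' is close to ((p1+p2)/2, (p1+p2)/2, p3, p4)
    print(np.bincount(x_prime, minlength=len(p)) / len(x_prime))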











• user57753 (yesterday): I believe we ought to use convexity in our argument, but I cannot get the inequality into the proper form to argue it directly. I currently have: it suffices to show $p_1 \log_2 p_1 + p_2 \log_2 p_2 \geq (p_1 + p_2) \log_2 \frac{p_1+p_2}{2}$, but the coefficients on the right side do not align correctly if I want to argue the convexity of $f(x) = x \log_2 x$. Note that the right side comes from $H(X) - H(X') = -p_1 \log_2 p_1 - p_2 \log_2 p_2 + \frac{p_1+p_2}{2} \log_2 \frac{p_1+p_2}{2} + \frac{p_1+p_2}{2} \log_2 \frac{p_1+p_2}{2}$. How can I go about this?
• stochasticboy321 (8 hours ago): Right, so, by convexity of $x \log x$, you have $$ \frac{p_1 + p_2}{2} \log \frac{p_1 + p_2}{2} \le \frac{1}{2} \left( p_1 \log p_1 + p_2 \log p_2 \right),$$ which finishes the job. That said, I think leonbloy's suggestion is a whole lot cleaner and thus better.
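For the record, combining the last two comments yields the full direct proof of part 1 (every term with index $\geq 3$ cancels in the difference): $$H(X') - H(X) = p_1 \log_2 p_1 + p_2 \log_2 p_2 - (p_1 + p_2) \log_2 \frac{p_1+p_2}{2} \geq 0,$$ where the inequality is the convexity bound above, multiplied by $2$ and rearranged.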















1 Answer
There are several ways to attack this, some of them pointed out in stochasticboy321's comments.

Another way, which looks elegant to me, requires you to know this (useful, and not hard to prove) property of entropy: $H(p)$ is a concave function of the distribution $p$.

Granted this, consider the two distributions $p_A = (p_1, p_2, p_3, \dots, p_n)$ and $p_B = (p_2, p_1, p_3, \dots, p_n)$, and let $p_C = (p_A + p_B)/2$. Clearly, $H(p_A) = H(p_B)$.

Hence, by concavity, $$H(X') = H(p_C) \ge \frac{H(p_A) + H(p_B)}{2} = H(p_A) = H(X).$$
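A quick numerical illustration of this argument (a sketch reusing the hypothetical `entropy_bits` helper from above): swapping the first two masses leaves the entropy unchanged, and the midpoint distribution, which is exactly $P'$, has entropy at least the average of the two.

    import numpy as np

    def entropy_bits(p):
        # Shannon entropy in bits, with the convention 0 log 0 = 0
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log2(p[nz]))

    p_A = np.array([0.5, 0.1, 0.3, 0.1])
    p_B = p_A[[1, 0, 2, 3]]                # swap the first two masses
    p_C = (p_A + p_B) / 2                  # this is exactly P'

    assert np.isclose(entropy_bits(p_A), entropy_bits(p_B))   # permutations preserve H
    assert entropy_bits(p_C) >= (entropy_bits(p_A) + entropy_bits(p_B)) / 2 - 1e-12
    print(entropy_bits(p_A), entropy_bits(p_C))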






answered 2 days ago by leonbloy



















