Analysis of Entropy on Two Distributions: Proving $H(X) < H(X')$

Let $P=\{p_1, p_2, p_3, \ldots, p_n\}$ and $P'= \left\{ \dfrac{p_1 + p_2}{2}, \dfrac{p_1 + p_2}{2}, p_3, \ldots, p_n\right\}$ be distributions on the same random variable $X$.



$1$. Show $H(X) \leq H(X')$, where $H(X)$ is Shannon's entropy formula: $$H(X) = -\sum_{i=1}^n p_i \log_2 p_i.$$



This makes sense, since $P'$ is closer to the uniform distribution, which maximizes entropy, but I'm not sure how to go about proving it. I believe we are expected to use the fact that entropy can be decomposed into its binary components (with normalization along the way).



$2$. Define $P''=\left\{ p_1, \ldots, p_{i-1}, \dfrac{p_i + p_j}{2}, p_{i+1}, \ldots, p_{j-1}, \dfrac{p_i + p_j}{2}, p_{j+1}, \ldots, p_n \right\}$. Use the "permutation principle" and $(a)$ to show $H(X) \leq H(X'')$.
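For a concrete illustration of part $1$ (an editorial addition, not part of the original question; the probabilities below are arbitrary), a small Python check shows the averaging step raising the entropy:

```python
import math

def H(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

P       = [0.5, 0.1, 0.4]   # an arbitrary distribution P
P_prime = [0.3, 0.3, 0.4]   # P': first two masses replaced by their average

print(round(H(P), 3))        # 1.361
print(round(H(P_prime), 3))  # 1.571 -- averaging the two masses increased the entropy
```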










information-theory entropy






asked 2 days ago by user57753 (new contributor)




  • What did you try? There's a direct approach, which is just to expand $H(X') - H(X)$, cancel terms, and argue via convexity of $-\log$. A more operational approach is to note that $X'$ is the output of pushing $X$ through a channel that randomises the first two symbols, and then using the data processing inequality. – stochasticboy321, 2 days ago











  • I believe we ought to use convexity in our argument, but I cannot get the inequality into the proper form to argue it directly. I currently have: "It suffices to show $p_1\log_2 p_1 + p_2\log_2 p_2 \geq (p_1 + p_2) \log_2\frac{p_1+p_2}{2}$," but the coefficient on the right side does not align correctly if I want to argue via the convexity of $f(x) = x\log_2 x$. Note that the right side comes from $H(X) - H(X') = -p_1\log_2 p_1 - p_2\log_2 p_2 + \frac{p_1+p_2}{2} \log_2\frac{p_1+p_2}{2} + \frac{p_1+p_2}{2} \log_2\frac{p_1+p_2}{2}$. How can I go about this? – user57753, yesterday











  • Right, so, by convexity of $x\log x$, you have $$\frac{p_1 + p_2}{2} \log \frac{p_1 + p_2}{2} \le \frac{1}{2} \left( p_1 \log p_1 + p_2 \log p_2\right),$$ which finishes the job. That said, I think leonbloy's suggestion is a whole lot cleaner and thus better. – stochasticboy321, 8 hours ago
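  Putting the two comments together (an editorial summary, not part of the thread): the terms $p_3,\ldots,p_n$ cancel, so $$H(X') - H(X) = p_1\log_2 p_1 + p_2\log_2 p_2 - (p_1+p_2)\log_2\frac{p_1+p_2}{2} \ge 0,$$ where the inequality is exactly the convexity bound above multiplied by $2$.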












1 Answer
There are several ways to attack this, some of them pointed out in stochasticboy321's comment.



Another way, which looks elegant to me but requires you to know the following (useful, and not hard to prove) property of entropy: $H(p)$ is a concave function of the distribution $p$.



Granted this, consider the two distributions $p_A=(p_1, p_2, p_3, \ldots, p_n)$ and $p_B=(p_2, p_1, p_3, \ldots, p_n)$, and let $p_C = (p_A + p_B)/2$. Clearly, $H(p_A)=H(p_B)$.



Hence, by concavity, $$H(X')=H(p_C) \ge \frac{H(p_A)+H(p_B)}{2} = H(p_A)=H(X).$$






answered 2 days ago by leonbloy
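As a quick numerical sanity check of this averaging argument (an editorial sketch, not part of the original answer; the helper function and random trials are ours), one can verify $H(P') \ge H(P)$ on randomly drawn distributions:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

rng = np.random.default_rng(0)
for _ in range(1000):
    n = rng.integers(2, 8)               # support size between 2 and 7
    p = rng.dirichlet(np.ones(n))        # a random distribution P
    p_avg = p.copy()
    p_avg[0] = p_avg[1] = (p[0] + p[1]) / 2   # P': first two masses averaged
    assert entropy_bits(p_avg) >= entropy_bits(p) - 1e-12
print("H(X') >= H(X) held in all random trials.")
```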



