Logistic Regression Explanation
I have two questions regarding logistic regression.
1) I understand that fitting a logistic regression model yields a table of coefficients together with a $p$-value for each variable. This $p$-value indicates whether the variable at hand is important for predicting the outcome variable: under the null hypothesis that the independent variable is not important for predicting the outcome, if $p < 0.05$ we conclude that the variable is indeed important.
My question: why do we have $p$-values for logistic regression and not for linear regression? Wouldn't a coefficient close to $0$ be enough to conclude that the independent variable at hand is not an important predictor?
2) I'm having a bit of trouble understanding the relationship between the logistic regression equation $p = 1/(1+e^{-y})$ and its logit counterpart $\log(p/(1-p)) = MX + B$. I understand that these two equations are equivalent, correct? Is it safe to describe their relationship (and their relevance to the model) by saying that we are mainly interested in $p = 1/(1+e^{-y})$, since it gives the probability that each observation produces the desired outcome, while the logit equation matters only for solving for the coefficients?
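For concreteness, here is a minimal sketch of the kind of coefficient table I am describing, using Python's statsmodels on synthetic data (the variables and numbers are purely illustrative):

```python
# Minimal illustration on made-up data: fit a logistic regression and
# inspect the table of coefficients with one p-value per variable.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                     # two explanatory variables
lin = 0.5 + 1.5 * X[:, 0]                         # true model ignores X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-lin)))       # binary outcome

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(fit.summary())   # coefficients, standard errors, z-statistics, p-values
```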
statistics statistical-inference linear-regression logistic-regression
asked Mar 22 at 15:43 by CS Student, edited Mar 22 at 15:50
1 Answer
1) Certainly you do get $p$-values for the regression coefficients ($\beta$) in simple linear regression as well; they test the hypothesis $\beta = 0$. This extends to multiple linear regression, where there are multiple regression coefficients. For logistic regression, how accurate and reliable the $p$-values for the regression coefficients are is a different question.
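For example, a minimal sketch (synthetic data, using Python's statsmodels; not part of the original answer) showing that an ordinary least-squares fit reports a $p$-value per coefficient in exactly the same way:

```python
# Sketch: linear regression also produces a p-value for each coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(size=100)    # linear relationship plus noise

ols = sm.OLS(y, sm.add_constant(x)).fit()
print(ols.pvalues)    # one p-value per coefficient, each testing beta = 0
```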
2) In this case
$$
\log\left(\frac{p}{1-p}\right)=MX+B \\
\Rightarrow \frac{p}{1-p}=e^{MX+B} \\
\Rightarrow p=\frac{e^{MX+B}}{1+e^{MX+B}} \\
\Rightarrow p = \frac{1}{1+e^{-(MX+B)}}
$$
In other words, $y=MX+B$ in your notation.
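A quick numerical sanity check of that equivalence (a sketch with arbitrary values chosen for $M$ and $B$):

```python
# Sketch: verify that e^(MX+B)/(1+e^(MX+B)) equals 1/(1+e^-(MX+B)).
import numpy as np

M, B = 1.3, -0.7
X = np.linspace(-5.0, 5.0, 11)
y = M * X + B                               # the linear predictor

p_ratio   = np.exp(y) / (1 + np.exp(y))     # e^(MX+B) / (1 + e^(MX+B))
p_sigmoid = 1 / (1 + np.exp(-y))            # 1 / (1 + e^-(MX+B))
print(np.allclose(p_ratio, p_sigmoid))      # True: same function
```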
answered Mar 22 at 16:11 by PM., edited Mar 22 at 16:19
What do you mean when you say that for logistic regression the accuracy and reliability of the $p$-values differ from linear regression?
– CS Student
Mar 22 at 16:27
And how do you go from $p/(1-p)=e^{MX+B}$ to $p=e^{MX+B}/(1+e^{MX+B})$?
– CS Student
Mar 22 at 16:33
@CSStudent In the normal (Gaussian) case, the theory leading to $p$-values based on the $t$ and $F$ distributions is exact. I don't believe any such exact theory exists for binary regression.
– PM.
Mar 22 at 16:33
@CSStudent As for your second comment ("how do you go..."), please have a go at the rearrangement yourself. Hint: multiply both sides by $(1-p)$ and continue.
– PM.
Mar 22 at 16:36