Kernel of polynomial of matrix
I am given that $f(x)$, $g(x)$ and $h(x)$ are polynomials such that $f(x)=g(x)h(x)$.
I am then asked to show, using induction on the degree of $f$, that $f(A)$ is an $n \times n$ matrix whenever $A$ is an $n \times n$ matrix. I am told that any constant term $c$ of $f(x)$ is replaced by $cI$ in the matrix polynomial $f(A)$, where $I$ is the $n \times n$ identity matrix, the same size as $A$.
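For concreteness, a small illustrative example of this convention (the specific polynomial here is mine, not from the exercise): taking $f(x) = x^2 + 2x + 3$ gives
$$f(A) = A^2 + 2A + 3I.$$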
Wouldn’t this be trivial? If $I$ is the $n \times n$ identity matrix, then $A$ must be the same size in order to be added to $I$, and so any linear combination of powers of $A$ is again the same size. I don’t know what induction has to do with this.
I am also asked to show that if $f(x)=g(x)h(x)$ and $f(A)=0$, then for any vector $v$ and any polynomial $b$, we have $b(A)h(A)v \in \ker(g(A))$ and, similarly, $b(A)g(A)v \in \ker(h(A))$. I don’t know how to start, because I think that $b(A)h(A)v$ and $b(A)g(A)v$ are both vectors, and we can’t insert vectors into $g(A)$ or $h(A)$ because you can’t take powers of vectors.
Any hints or help would be greatly appreciated!
linear-algebra matrices polynomials
asked Mar 16 at 10:02 by Andrew (edited Mar 16 at 10:32)
1 Answer
To show that $f(A)$ is an $n\times n$ matrix, you could prove by induction that $A^k$ is an $n\times n$ matrix for all $k\in\Bbb N$, and then note that any linear combination of $n\times n$ matrices is again an $n\times n$ matrix. I agree that it’s a rather trivial thing to prove, though.
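For instance, the inductive step could go as follows: if $A^k$ is an $n\times n$ matrix, then $A^{k+1} = A^k \cdot A$ is a product of two $n\times n$ matrices, hence again an $n\times n$ matrix.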
To show that $b(A)h(A)v\in\ker g(A)$ for any polynomial $b$ and any vector $v$, it suffices to show that $h(A)v\in\ker g(A)$. You claim that
...we can’t insert vectors into $g(A)$ or $h(A)$ because you can’t take powers of vectors.
Indeed you can’t take powers of vectors, but that’s not what is being claimed.
You have just shown (by induction) that $g(A)$ is an $n\times n$ matrix, so you can multiply it by an $n$-vector. Because $h(A)$ is also an $n\times n$ matrix, $h(A)v$ is an $n$-vector, so you can multiply $g(A)$ by $h(A)v$. To show that $h(A)v\in\ker g(A)$, it suffices to show that this product is $0$.
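A minimal sketch of the key computation (using that polynomials in the same matrix commute, so $b(A)g(A)h(A)=b(A)f(A)$):
$$g(A)\bigl(b(A)h(A)v\bigr) = b(A)\,g(A)h(A)\,v = b(A)\,f(A)\,v = 0,$$
since $f(A)=0$.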
answered Mar 16 at 10:23 by Servaes
Thank you. I can’t use the null factor law, though, can I? That is, if $f(A)=g(A)h(A)=0$, this does not imply that either $g(A)$ or $h(A)$ is equal to zero, because two nonzero matrices can multiply to zero. Sorry, in the original question I forgot to add that $f(A)=0$ for this second part.
– Andrew
Mar 16 at 10:31
@Andrew I assumed that $f(A)=0$ for the second part, as otherwise it doesn't make sense. And indeed, $g(A)h(A)=0$ does not imply that $g(A)=0$ or $h(A)=0$. But you do know that $g(A)\cdot h(A)v=0$ for all $v$...
– Servaes
Mar 16 at 10:33
Of course! Thank you very much.
– Andrew
Mar 16 at 10:38
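A quick numerical sanity check of the discussion above, with an illustrative choice of $f$, $A$ and $v$ (none of these come from the original exercise); it verifies that $f(A)=0$ and that $h(A)v$ lands in $\ker g(A)$:

import numpy as np

# Illustrative example: f(x) = (x - 1)(x - 2), so g(x) = x - 1 and h(x) = x - 2.
# A is chosen with eigenvalues 1 and 2, so that f(A) = 0.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
I = np.eye(2)

g_A = A - I        # g(A)
h_A = A - 2 * I    # h(A)
f_A = g_A @ h_A    # f(A) = g(A) h(A)

v = np.array([3.0, 5.0])   # any vector

print(np.allclose(f_A, 0))              # True: f(A) is the zero matrix
print(np.allclose(g_A @ (h_A @ v), 0))  # True: h(A) v lies in ker g(A)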