Finding an error in proof for an axiom of a vector space
Recently, I asked a question about axioms of vector spaces and counterexamples. The question can be found at: A Counter Examples in Linear Algebra (Vector Space)

However, a few days before asking that question, I tried to prove that the axiom $1 \cdot v = v$ in the definition of a vector space is redundant. For the proof, I used the fact that every vector space has a basis (which is proved using Zorn's Lemma). However, I know the conclusion cannot be true, because there is a structure in which every other axiom is satisfied but $1 \cdot v \neq v$, so that structure fails to be a vector space. I would like to know exactly where in the proof I made a mistake. I have been stuck on this for a few weeks and, no matter what I do, I cannot find the point where things start to go wrong. All help, suggestions and comments are appreciated. The proof I tried is as follows.
We know that $V$ must have a basis (provable using Zorn's Lemma), say $B$. Let $v \in V$. Then there exist $v_1, v_2, \dots, v_n \in B$ and $\alpha_1, \alpha_2, \dots, \alpha_n \in \mathbb{F}$ such that $v = \alpha_1 \cdot v_1 + \alpha_2 \cdot v_2 + \dots + \alpha_n \cdot v_n = \sum_{i=1}^{n} \alpha_i \cdot v_i$. Now, let $1 \cdot v = w$, where $1 \in \mathbb{F}$ is the unity of the field and $w \in V$. Again, since $B$ is a basis, there exist $w_1, w_2, \dots, w_m \in B$ and $\beta_1, \beta_2, \dots, \beta_m \in \mathbb{F}$ such that $w = \beta_1 \cdot w_1 + \beta_2 \cdot w_2 + \dots + \beta_m \cdot w_m = \sum_{i=1}^{m} \beta_i \cdot w_i$. Therefore, we have
\begin{align*}
1 \cdot \sum_{i=1}^{n} \alpha_i \cdot v_i &= \sum_{i=1}^{m} \beta_i \cdot w_i \\
\therefore \sum_{i=1}^{n} 1 \cdot \left( \alpha_i \cdot v_i \right) &= \sum_{i=1}^{m} \beta_i \cdot w_i \\
\therefore \sum_{i=1}^{n} \left( 1 \alpha_i \right) \cdot v_i &= \sum_{i=1}^{m} \beta_i \cdot w_i \\
\therefore \sum_{i=1}^{n} \alpha_i \cdot v_i &= \sum_{i=1}^{m} \beta_i \cdot w_i
\end{align*}
Consider the two sets $S_1 = \{ v_1, v_2, \dots, v_n \}$ and $S_2 = \{ w_1, w_2, \dots, w_m \}$. Clearly, $S_1 \subseteq B$ and $S_2 \subseteq B$, hence $S_1$ and $S_2$ are linearly independent. The set $S_1 \cup S_2 = \{ v_1, v_2, \dots, v_n, w_1, w_2, \dots, w_m \} \subseteq B$ is also linearly independent. First, suppose $S_1 \cap S_2 = \emptyset$, where $\emptyset$ denotes the empty set. This means that for all $i \in \{1, 2, \dots, n\}$ and all $j \in \{1, 2, \dots, m\}$, $v_i \neq w_j$.
Using that the additive inverse of $a \cdot v$ is $\left( -a \right) \cdot v$, we add the additive inverse of each vector $\beta_i \cdot w_i$ to the last equation and obtain
$$\sum_{i=1}^{n} \alpha_i \cdot v_i + \sum_{i=1}^{m} \left( -\beta_i \right) \cdot w_i = \mathbf{0}$$
Since $S_1 \cup S_2$ is linearly independent, $\alpha_i = 0$ for all $i \in \{1, 2, \dots, n\}$ and $-\beta_i = 0$ for all $i \in \{1, 2, \dots, m\}$, which in turn gives $\beta_i = 0$. Therefore, $v = 0 \cdot v_1 + 0 \cdot v_2 + \dots + 0 \cdot v_n = \mathbf{0}$ and $w = 0 \cdot w_1 + 0 \cdot w_2 + \dots + 0 \cdot w_m = \mathbf{0}$. Hence, $1 \cdot \mathbf{0} = \mathbf{0}$.
Now, let us consider $S_1 \cap S_2 \neq \emptyset$. Let there be $r$ common vectors of $S_1$ and $S_2$, where $1 \leq r \leq \min \{m, n\}$, and name them $v_i$ for $i \in \{1, 2, \dots, r\}$. Thus, our sets look like $S_1 = \{ v_1, v_2, \dots, v_r, v_{r+1}, \dots, v_n \}$ and $S_2 = \{ v_1, v_2, \dots, v_r, w_{r+1}, \dots, w_m \}$. Now, $v = \sum_{i=1}^{r} \alpha_i \cdot v_i + \sum_{i=r+1}^{n} \alpha_i \cdot v_i$ and $w = \sum_{i=1}^{r} \beta_i \cdot v_i + \sum_{i=r+1}^{m} \beta_i \cdot w_i$. Again,
\begin{align*}
1 \cdot \left( \sum_{i=1}^{r} \alpha_i \cdot v_i + \sum_{i=r+1}^{n} \alpha_i \cdot v_i \right) &= \sum_{i=1}^{r} \beta_i \cdot v_i + \sum_{i=r+1}^{m} \beta_i \cdot w_i \\
\therefore \sum_{i=1}^{r} 1 \cdot \left( \alpha_i \cdot v_i \right) + \sum_{i=r+1}^{n} 1 \cdot \left( \alpha_i \cdot v_i \right) &= \sum_{i=1}^{r} \beta_i \cdot v_i + \sum_{i=r+1}^{m} \beta_i \cdot w_i \\
\therefore \sum_{i=1}^{r} \left( 1 \alpha_i \right) \cdot v_i + \sum_{i=r+1}^{n} \left( 1 \alpha_i \right) \cdot v_i &= \sum_{i=1}^{r} \beta_i \cdot v_i + \sum_{i=r+1}^{m} \beta_i \cdot w_i \\
\therefore \sum_{i=1}^{r} \alpha_i \cdot v_i + \sum_{i=r+1}^{n} \alpha_i \cdot v_i &= \sum_{i=1}^{r} \beta_i \cdot v_i + \sum_{i=r+1}^{m} \beta_i \cdot w_i
\end{align*}
Adding the additive inverse of each vector on the right-hand side to both sides, and using the vector space axioms several times, we get
$$\sum_{i=1}^{r} \left( \alpha_i - \beta_i \right) \cdot v_i + \sum_{i=r+1}^{n} \alpha_i \cdot v_i + \sum_{i=r+1}^{m} \left( -\beta_i \right) \cdot w_i = \mathbf{0}$$
Since $S_1 \cup S_2$ is linearly independent, $\alpha_i - \beta_i = 0$, i.e. $\alpha_i = \beta_i$, for all $i \in \{1, 2, \dots, r\}$. Also, $\alpha_i = 0$ for all $i \in \{r+1, r+2, \dots, n\}$ and $\beta_i = 0$ for all $i \in \{r+1, r+2, \dots, m\}$. This tells us that $v = \sum_{i=1}^{r} \alpha_i \cdot v_i$ and $w = 1 \cdot v = \sum_{i=1}^{r} \alpha_i \cdot v_i$. Hence, $1 \cdot v = v$.
Note that we can prove $-\left( a \cdot v \right) = \left( -a \right) \cdot v$ as follows:
\begin{align}
\left( -a \right) \cdot v + a \cdot v = \left( -a + a \right) \cdot v = 0 \cdot v = \mathbf{0}
\end{align}
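As a side note (an editor's sketch, not part of the original post): the lemma above uses only distributivity and $0 \cdot v = \mathbf{0}$, never the unital axiom itself, so it can be traced on any concrete vector space. Here vectors are modelled as pairs of floats, and `smul`/`vadd` are made-up helper names:

```python
# Check the lemma (-a)v + av = (-a + a)v = 0v = 0 on a concrete space,
# here R^2 with the usual componentwise operations.
def smul(a, v):
    """Scalar multiplication on R^2 (componentwise)."""
    return tuple(a * x for x in v)

def vadd(v, w):
    """Vector addition on R^2 (componentwise)."""
    return tuple(x + y for x, y in zip(v, w))

a, v = 3.0, (1.0, -2.0)
lhs = vadd(smul(-a, v), smul(a, v))     # (-a)v + a v
assert lhs == smul(-a + a, v) == (0.0, 0.0)
```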
linear-algebra proof-verification vector-spaces
"We know that $V$ must have a basis" - this depends on the definition of vector space, so perhaps on $1v=v$?
– Hagen von Eitzen
Mar 14 at 7:12
asked Mar 14 at 7:06
Aniruddha Deshmukh
1 Answer
Let $\Bbb F = \Bbb R$, $V = \Bbb Z$, and for $\alpha \in \Bbb F$, $v \in \Bbb Z$, let $\alpha v = 0$. Then $V$ is an abelian group and $\cdot$ is an action of the ring-without-$1$ (sometimes called a rng) $\Bbb F$ on $V$. Hence this is almost a vector space - only the fact that $1$ acts as the identity is missing.

This $V$ does not have a basis. Indeed, any linear combination $\alpha_1 v_1 + \alpha_2 v_2 + \ldots + \alpha_n v_n$ will always be $0$. So the very first sentence in your argument is invalid.
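A quick sanity check of this counterexample (an editor's sketch, not part of the answer): with the degenerate action $\alpha v = 0$, every scalar-multiplication axiom can be spot-checked numerically, and only the unital one fails.

```python
import random

# The degenerate scalar action from the answer: a*v = 0 for all a in R, v in Z.
def smul(a, v):
    return 0

random.seed(0)
for _ in range(100):
    a, b = random.uniform(-5, 5), random.uniform(-5, 5)
    v, w = random.randint(-10, 10), random.randint(-10, 10)
    assert smul(a, v + w) == smul(a, v) + smul(a, w)   # a(v + w) = av + aw
    assert smul(a + b, v) == smul(a, v) + smul(b, v)   # (a + b)v = av + bv
    assert smul(a * b, v) == smul(a, smul(b, v))       # (ab)v = a(bv)

assert smul(1, 3) != 3   # but 1*v = v fails for every nonzero v
```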
@DanielWainfleet I wrote rng on purpose
– Hagen von Eitzen
Mar 14 at 7:27
Does the proof of the existence of a basis use the axiom $1 \cdot v = v$ anywhere?
– Aniruddha Deshmukh
Mar 14 at 7:40
+1. BTW, the "work" in the Q applies the axiom that $a(bv) = (ab)v$ for $a, b \in \Bbb F$ and $v \in V$. So if $w, v \in V$ with $w = 1v$ then $1w = 1(1v) = (1 \cdot 1)v = 1v = w$. So for all $w \in \{1v : v \in V\}$ we have $1w = w$. So the axiom $\forall v \in V\,(1v = v)$ is implied, in the presence of the other axioms, by $V = \{1v : v \in V\}$, but, as your example shows, is independent of the other axioms.
– DanielWainfleet
Mar 14 at 7:44
@AniruddhaDeshmukh. Let $W = \{1v : v \in V\}$. Then by my previous comment, $\forall w \in W\;(1w = w)$, so we can easily verify that $W$ is a vector space over $\Bbb F$ (although, as in the answer, it might be the trivial space $\{0\}$). Now suppose $v \in V$ and $1v \ne v$. Then $v \notin W$. But now, for any $b \in B \subset V$ and any $f \in \Bbb F$ we have $fb = (f \cdot 1)b = f(1b) \in W$, so the linear span of $B$ is a subset of $W$, and $W \ne V$, so $B$ is not a basis for $V$.
– DanielWainfleet
Mar 14 at 8:22
@AniruddhaDeshmukh Yes, it does. Using Zorn, let $B$ be a maximal linearly independent family and assume $v$ is not in the span of $B$. Then how are you going to conclude that $B \cup \{v\}$ is a linearly independent family (or even that $v \notin B$)?
– Hagen von Eitzen
Mar 14 at 13:34
Your Answer
StackExchange.ifUsing("editor", function ()
return StackExchange.using("mathjaxEditing", function ()
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
);
);
, "mathjax-editing");
StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "69"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);
else
createEditor();
);
function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);
);
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3147657%2ffinding-an-error-in-proof-for-an-axiom-of-a-vector-space%23new-answer', 'question_page');
);
Post as a guest
Required, but never shown
1 Answer
1
active
oldest
votes
1 Answer
1
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
Let $Bbb F=Bbb R$, $V=Bbb Z$ and for $alphainBbb F$, $vinBbb Z$, let $alpha v=0$. Then $v$ is an abelian group and $cdot$ is an action of the ring-without-$1$ (sometimes called rng) $Bbb F$ on $V$. Hence this is almost a vector space - only the fact that $1$ acts as identity is missing.
The $V$ does not have a basis. Indeed, any linear combination $alpha_1v_1+alpha_2v_2+ldots +a_nv_n$ will always be $=0$. So the very first sentence in your argument is invalid.
$endgroup$
2
$begingroup$
@DanielWainfleet I wrote rng on purpose
$endgroup$
– Hagen von Eitzen
Mar 14 at 7:27
1
$begingroup$
Does the proof for existence of basis uses the axiom $1 cdot v = v$ anywhere?
$endgroup$
– Aniruddha Deshmukh
Mar 14 at 7:40
$begingroup$
+1. BTW the "work" in the Q applies the axiom that $a (bv)=(ab )v$ for $a ,b in Bbb F$ and $vin V.$ So if $w,v in V$ with$ w=1v$ then $1w=1(1v)=(1cdot 1)v=1v=w.$ So for all $win 1v:vin V$ we have $1w=w.$ So the axiom $forall vin V,(1v=v)$ is implied, in the presence of the other axioms, by $V=1v:vin V,$ but, as your example shows, is independent of the other axioms.
$endgroup$
– DanielWainfleet
Mar 14 at 7:44
@AniruddhaDeshmukh Let $W=\{1v : v\in V\}$. Then by my previous comment, $\forall w\in W\;(1w=w)$, so we can easily verify that $W$ is a vector space over $\Bbb F$. (Although, as in the answer, it might be the trivial space $\{0\}$.) Now suppose $v\in V$ and $1v\ne v$. Then $v\notin W$. But now, for any $b\in B\subset V$ and any $f\in\Bbb F$ we have $fb=(f\cdot 1)b=f(1b)\in W$, so the linear span of $B$ is a subset of $W$, and $W\ne V$, so $B$ is not a basis for $V$.
– DanielWainfleet, Mar 14 at 8:22
@AniruddhaDeshmukh Yes, it does. Using Zorn, let $B$ be a maximal independent family and assume $v$ is not in the span of $B$. Then how are you going to conclude that $B\cup\{v\}$ is a linearly independent family (or even that $v\notin B$)?
– Hagen von Eitzen, Mar 14 at 13:34
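The step being alluded to can be made explicit (a sketch of the standard argument): if $B\cup\{v\}$ were dependent, there would be a relation $\alpha v+\alpha_1 b_1+\dots+\alpha_n b_n=0$ with $\alpha\ne 0$, and concluding $v\in\operatorname{span}(B)$ uses the axiom $1v=v$ in its very first equality:

```latex
v = 1v = (\alpha^{-1}\alpha)v = \alpha^{-1}(\alpha v)
  = \alpha^{-1}\bigl(-\alpha_1 b_1-\dots-\alpha_n b_n\bigr)\in\operatorname{span}(B).
```

Without $1v=v$ the chain cannot even start, which is why maximal independent families need not span in the rng setting of the answer.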
answered Mar 14 at 7:17, edited Mar 14 at 7:27
Hagen von Eitzen
"We know that $V$ must have a basis" — this depends on the definition of vector space, so perhaps on $1v=v$?
– Hagen von Eitzen, Mar 14 at 7:12