30 Cards in this Set
- Front
- Back
- 3rd side (hint)
The ten axioms to verify a vector space? (u, v, w are vectors in V; k, m are scalars)
|
1.) u + v is in V
2.) u + v = v + u
3.) u + (v + w) = (u + v) + w
4.) There is a zero vector 0 so that u + 0 = 0 + u = u
5.) There is a -u so that u + (-u) = 0
6.) ku is in V
7.) k(u + v) = ku + kv
8.) (k + m)u = ku + mu
9.) k(mu) = (km)u
10.) 1u = u |
|
|
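A minimal numeric sketch (not part of the deck; it assumes Python with numpy): spot-checking several of the ten axioms in R3 with random vectors. Passing checks do not prove the axioms; they only catch violations for the sampled values.

```python
# Hedged sketch: numerically spot-check a few vector-space axioms in R^3.
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))  # three random vectors in R^3
k, m = 2.0, -3.5                       # arbitrary scalars

assert np.allclose(u + v, v + u)                # axiom 2: commutativity
assert np.allclose(u + (v + w), (u + v) + w)    # axiom 3: associativity
assert np.allclose(u + (-u), np.zeros(3))       # axiom 5: additive inverse
assert np.allclose(k * (u + v), k * u + k * v)  # axiom 7: distributivity
assert np.allclose(k * (m * u), (k * m) * u)    # axiom 9: scalar associativity
assert np.allclose(1 * u, u)                    # axiom 10: identity scalar
```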
Theorem 4.1.1 - properties of Zero Vector
|
a. 0u = 0
b. k0 = 0
c. (-1)u = -u
d. If ku = 0, then k = 0 or u = 0 |
|
|
Subspace
|
W is a subspace of V if W is a vector space itself under the same rules of addition and scalar multiplication that apply in V
|
|
|
Theorem 4.2.1 - subspace check
|
W is a subspace of V if, for all vectors u and v in W and every scalar k:
a.) u + v is in W
b.) ku is in W |
|
|
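A hedged sketch of Theorem 4.2.1's two closure checks (assumes numpy; the plane W and the membership test in_W are illustrative choices, not from the deck):

```python
# Check closure conditions (a) and (b) for W = {(x, y, z) : x + y + z = 0} in R^3.
import numpy as np

def in_W(x):
    return np.isclose(x.sum(), 0.0)  # membership test for this particular W

u = np.array([1.0, -2.0, 1.0])
v = np.array([3.0, 0.0, -3.0])
k = -4.0

assert in_W(u) and in_W(v)  # u and v are in W
assert in_W(u + v)          # (a) closed under addition
assert in_W(k * u)          # (b) closed under scalar multiplication
```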
Span of S
|
The subspace of V formed from all possible linear combinations of the vectors in a nonempty set S is called the span of S and is denoted span{w1, ..., wr} or span(S)
|
|
|
Theorem 4.2.4 - homogeneous solution sets
|
The solution set of a homogeneous linear system Ax = 0 in n unknowns is a subspace of Rn
|
|
|
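A sketch of the theorem in action (assumes numpy; the matrix A is an illustrative example): find a basis for the null space of A via the SVD, then confirm that a linear combination of null-space vectors still solves Ax = 0, i.e. the solution set is closed.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1, so the null space is 2-dimensional

_, s, Vt = np.linalg.svd(A)
null_basis = Vt[np.sum(s > 1e-10):]  # rows of Vt for (near-)zero singular values

x = 2.0 * null_basis[0] - 5.0 * null_basis[1]  # arbitrary linear combination
assert np.allclose(A @ x, 0.0)                 # still a solution: closure holds
```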
Theorem 4.2.5 - equivalent Spans
|
If S and T are nonempty sets of vectors in a vector space V, then span(S) = span(T) if and only if every vector in S is a linear combination of the vectors in T and vice versa.
|
|
|
Linear Independence
|
If S = {v1, v2, ..., vr} is a nonempty set of vectors in a vector space V, S is said to be linearly independent if the equation k1v1 + k2v2 + ... + krvr = 0 has only the trivial solution k1 = k2 = ... = kr = 0
|
|
|
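A minimal sketch of the definition as a computation (assumes numpy): the equation k1v1 + k2v2 + k3v3 = 0 has only the trivial solution exactly when the matrix with v1, v2, v3 as columns has full column rank.

```python
import numpy as np

V = np.column_stack([[1, 0, 1], [0, 1, 1], [1, 1, 0]]).astype(float)
independent = np.linalg.matrix_rank(V) == V.shape[1]
print(independent)  # True: only the trivial solution exists
```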
Theorem 4.3.1 - Linear independence with 2 or more vectors
|
A set S with two or more vectors is:
a.) Linearly independent if and only if no vector in S is expressible as a linear combination of the other vectors in S.
b.) Linearly dependent if and only if at least one vector in S is expressible as a linear combination of the other vectors in S. |
|
|
Theorem 4.3.2 - Special sets and linear independence
|
a.) Any finite set that contains the zero vector is linearly dependent.
b.) A set with exactly one vector is linearly independent if and only if that vector is not 0.
c.) A set with exactly two vectors is linearly independent if and only if neither vector is a scalar multiple of the other. |
|
|
Theorem 4.3.3 - Dependence in Rn
|
Let S = {v1, v2, ..., vr} be a set of vectors in Rn. If r > n, then S is linearly dependent.
|
|
|
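An illustration of the theorem (assumes numpy): four random vectors in R3 (r = 4 > n = 3) cannot have full column rank, so they are linearly dependent.

```python
import numpy as np

S = np.random.default_rng(1).standard_normal((3, 4))  # 4 vectors in R^3 as columns
assert np.linalg.matrix_rank(S) < 4                   # dependent, as the theorem guarantees
```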
Theorem 4.3.4 - Wronskian explained
|
If the functions f1, f2, ..., fn have n - 1 continuous derivatives on the interval (-∞, ∞), their Wronskian exists. If the Wronskian is not identically zero on that interval, then the set of functions is linearly independent.
|
|
|
Wronskian
|
W(x) = det
[ f1        f2        ...  fn        ]
[ f1'       f2'       ...  fn'       ]
[ ...       ...       ...  ...       ]
[ f1^(n-1)  f2^(n-1)  ...  fn^(n-1)  ] |
|
|
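A sketch of the Wronskian test (assumes Python with sympy, a tool choice the deck never names): for f1 = sin x, f2 = cos x the Wronskian is -1, never zero, so the set is linearly independent by Theorem 4.3.4.

```python
import sympy as sp

x = sp.symbols('x')
f = [sp.sin(x), sp.cos(x)]
# Row k holds the k-th derivatives of f1, ..., fn, as in the card above.
W = sp.Matrix([[sp.diff(fi, x, k) for fi in f] for k in range(len(f))]).det()
print(sp.simplify(W))  # -1, so {sin x, cos x} is linearly independent
```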
Basis of a Vector Space
|
S is a basis of V if:
a.) S is linearly independent
b.) S spans V |
|
|
Checking if a set is a Basis
|
Given n candidate vectors in Rn, consider the systems Ax = 0 and Ax = b, where A is the matrix with the candidate vectors as columns (the coefficient matrix is the same for both). Theorem 2.3.8 says that if det(A) ≠ 0, then Ax = 0 has only the trivial solution (linear independence) and Ax = b is consistent for every b (spanning), so the set is a basis.
|
|
|
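A minimal sketch of this determinant check (assumes numpy; the three vectors are an illustrative example):

```python
import numpy as np

# Candidate basis vectors of R^3 as the columns of A.
A = np.column_stack([[1, 2, 1], [2, 9, 0], [3, 3, 4]]).astype(float)
print(np.linalg.det(A))  # -1.0, nonzero, so the set is a basis of R^3
```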
Coordinate Vector
|
Let S = {v1, v2, ..., vn} be a basis for a vector space V. If v = c1v1 + c2v2 + ... + cnvn is the expression for a vector v, then [v]S = (c1, c2, ..., cn) is called the coordinate vector of v relative to S.
|
|
|
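A sketch of computing a coordinate vector (assumes numpy, reusing the example basis from the previous sketch): solving Ac = v, where A has the basis vectors as columns, yields the weights c1, ..., cn.

```python
import numpy as np

A = np.column_stack([[1, 2, 1], [2, 9, 0], [3, 3, 4]]).astype(float)  # basis S
v = np.array([5.0, -1.0, 9.0])
c = np.linalg.solve(A, v)     # coordinate vector [v]_S
assert np.allclose(A @ c, v)  # the combination c1*v1 + c2*v2 + c3*v3 reproduces v
print(c)                      # [ 1. -1.  2.]
```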
Theorem 4.4.1 - unique basis representation
|
If S = {v1, v2, ..., vn} is a basis for a vector space V, then every vector v in V can be expressed as a linear combination of the vectors in S in exactly one way.
|
|
|
Theorem 4.5.1 - dimension of basis
|
All bases for a finite-dimensional vector space have the same number of vectors.
|
|
|
Theorem 4.5.2 - specs on dimension of basis
|
Let V be a finite-dimensional vector space, and let {v1, v2, ..., vn} be any basis:
a.) If a set has more than n vectors, then it is linearly dependent.
b.) If a set has fewer than n vectors, then it doesn't span V. |
|
|
Dimension
|
The dimension of a finite-dimensional vector space V is denoted dim(V) and is the number of vectors in a basis for that vector space.
|
|
|
Theorem 4.5.3 - the plus minus theorem
|
Let S be a nonempty set of vectors in a vector space V:
a.) If S is a linearly independent set, and if v is a vector in V that is outside span(S), then the set S ∪ {v} that results from inserting v into S is still linearly independent.
b.) If v is a vector in S that is expressible as a linear combination of the other vectors in S, and if S - {v} denotes the set obtained by removing v from S, then S and S - {v} span the same space. |
|
|
Theorem 4.5.4 - Rn basis
|
Let V be an n-dimensional vector space, and let S be a set in V with exactly n vectors. Then S is a basis for V if and only if S spans V or S is linearly independent
|
You only have to check one of them instead of both
|
|
Theorem 4.5.5
|
Let S be a finite set of vectors in a finite-dimensional vector space V:
a.) If S spans V but is not a basis for V, then S can be reduced to a basis for V by removing dependent vectors from S.
b.) If S is a linearly independent set that is not already a basis for V, then S can be enlarged to a basis for V by inserting linearly independent vectors into S. |
|
|
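A sketch of part a.) (assumes sympy; the spanning set is illustrative): row reduction identifies the pivot columns, and keeping only those columns pares a spanning set down to a basis for its span.

```python
import sympy as sp

S = sp.Matrix([[1, 0, 1, 2],
               [0, 1, 1, 3],
               [0, 0, 0, 0]])       # 4 vectors in R^3 as columns, only 2 independent
_, pivots = S.rref()                # pivot column indices
basis = [S.col(j) for j in pivots]  # these columns form a basis for the span
print(pivots)                       # (0, 1)
```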
Transition matrix
|
The matrix P that, when multiplied on the right by a coordinate vector relative to a basis B, returns the coordinate vector relative to a basis S: P[v]B = [v]S.
|
|
|
How to find the transition matrix
|
The transition matrix P from B to B′ is found by forming the matrix [B′ | B] and applying Gauss-Jordan elimination until you end up with [I | P].
|
[new basis|old basis]
|
|
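A sketch of the recipe (assumes numpy; the two bases are illustrative): reducing [B′ | B] to [I | P] is equivalent to computing P = (B′)⁻¹B, which is what solve does below.

```python
import numpy as np

B  = np.column_stack([[1, 0], [0, 1]]).astype(float)  # old basis (standard, for simplicity)
Bp = np.column_stack([[2, 1], [3, 4]]).astype(float)  # new basis B'
P = np.linalg.solve(Bp, B)  # transition matrix P from B to B'

v_B = np.array([7.0, 2.0])  # coordinates of some v relative to B
v_Bp = P @ v_B              # coordinates of the same v relative to B'
assert np.allclose(Bp @ v_Bp, B @ v_B)  # both coordinate vectors describe the same v
```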
Theorem 4.6.2
|
Let B′ = {u1, ..., un} be any basis for Rn and S = {e1, ..., en} be the standard basis. Then the transition matrix from B′ to S is simply the vectors u1, ..., un written as columns: [u1 | u2 | ... | un]
|
|
|
Dot product
|
u · v = |u||v| cos(θ), where θ is the angle between u and v
|
|
|
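A sketch recovering the angle from the formula (assumes numpy):

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])
cos_theta = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
print(np.degrees(np.arccos(cos_theta)))  # 45.0 degrees between u and v
```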
Cross Product
|
|u × v| = |u||v| sin(θ), where θ is the angle between u and v
|
Only defined in R3; the cross product is orthogonal to both original vectors.
|
|
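A sketch checking both facts on the card (assumes numpy): the magnitude formula and the orthogonality of u × v to u and v.

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])
w = np.cross(u, v)  # (0, 0, 1)

theta = np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
assert np.isclose(np.linalg.norm(w),
                  np.linalg.norm(u) * np.linalg.norm(v) * np.sin(theta))
assert np.isclose(w @ u, 0.0) and np.isclose(w @ v, 0.0)  # orthogonal to both
```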
Orthogonality
|
Two vectors are orthogonal if the result of their dot product is zero.
|
|
|
Triple Scalar Product
|
v · (u × w)
|
The absolute value of the scalar triple product is the volume of the parallelepiped created by the three vectors. Also, if that quantity is zero, the three vectors all lie in the same plane.
|
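A sketch of the triple product as a volume (assumes numpy; the vectors span an axis-aligned 1 x 2 x 3 box so the answer is easy to verify):

```python
import numpy as np

v = np.array([0.0, 2.0, 0.0])
u = np.array([1.0, 0.0, 0.0])
w = np.array([0.0, 0.0, 3.0])

triple = v @ np.cross(u, w)  # v . (u x w), the order used on the card
assert np.isclose(triple, np.linalg.det(np.array([v, u, w])))
print(abs(triple))           # 6.0: the volume of the parallelepiped (here a box)
```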