Catullus16
 
field axioms:
+,* commutativity
+,* associativity
+,* unitality (identities)
+,* invertibility (inverses; multiplicative inverses only for nonzero elements)
distributivity

power series for exp(x), sin(x), cos(x)
fundamental theorem of algebra

vectorspace operations:
+: VxV->V
*: FxV->V
elements of V are vectors
elements of F are scalars

vectorspace axioms:
+ commutativity
+ associativity
+ identity
+ inverse
* identity
* associativity (compatibility with field multiplication: (ab)v = a(bv))
right distributivity
left distributivity

a linear combination of a set of vectors is a finite sum of the form λ1v1 + ... + λnvn, with each scalar λi in the field F

a set <S> is a generating system of V if every vector of V can be written as a linear combination of elements of <S>

a basis of V is a linearly independent generating set, where any element in V can be written in a unique way as a linear combination of the elements of the basis (a linear combination is always a finite sum)

a map f:V->W between vectorspaces is linear if f(v+v')=f(v)+f(v') and f(cv)=cf(v) for all v,v' in V and c in F
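A quick numeric sanity check of the two linearity conditions, using a hypothetical map f(x, y) = (2x + y, x - 3y) on R^2 (the map is an illustrative assumption, not from the notes):

```python
# Hypothetical linear map f: R^2 -> R^2, used only for illustration.
def f(v):
    x, y = v
    return (2 * x + y, x - 3 * y)

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(c, v):
    return tuple(c * a for a in v)

v, w, c = (1, 2), (3, -1), 5
assert f(add(v, w)) == add(f(v), f(w))   # f(v + v') = f(v) + f(v')
assert f(scale(c, v)) == scale(c, f(v))  # f(cv) = c f(v)
```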

linear transformations:
f is uniquely determined by the images of the basis vectors, and every f(bi) can be written as a linear combination of the vectors in Bw, so there are finitely many coefficients, which form a matrix of numbers Bw[f]Bv = ([f(b1)]Bw, ..., [f(bn)]Bw)
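A sketch of reading the matrix off the basis images, for a hypothetical map f(x, y, z) = (x + 2y, 3z) with the standard bases on both sides (for other bases the images would first have to be expressed in Bw-coordinates):

```python
# Hypothetical map R^3 -> R^2; with standard bases, [f(bi)] is just f(bi).
def f(v):
    x, y, z = v
    return (x + 2 * y, 3 * z)

basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
columns = [f(b) for b in basis]          # the ith column is [f(bi)]
matrix = [list(row) for row in zip(*columns)]  # transpose columns into rows
print(matrix)  # [[1, 2, 0], [0, 0, 3]]
```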

compositions of linear transformations:
f:V->W, g:W->U
g∘f: V -> U, (g∘f)(x) = g(f(x))
g∘f is represented by a matrix whose ith column consists by definition of the coefficients of g(f(bi)) with respect to Bu
so the product of the matrix Bu[g]Bw and the ith column of Bw[f]Bv gives the ith column of Bu[g∘f]Bv
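A numeric check that the matrix product represents the composition, with two arbitrary example matrices (f: R^2 -> R^3 and g: R^3 -> R^2, both assumptions for illustration):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def matvec(A, v):
    return tuple(sum(A[i][j] * v[j] for j in range(len(v)))
                 for i in range(len(A)))

F = [[1, 2], [0, 1], [3, 0]]   # matrix of f: V = R^2 -> W = R^3
G = [[1, 0, 1], [2, 1, 0]]     # matrix of g: W = R^3 -> U = R^2

GF = matmul(G, F)              # matrix of g∘f: R^2 -> R^2
v = (4, -1)
assert matvec(GF, v) == matvec(G, matvec(F, v))  # same result either way
```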

changing bases:
EROs do not change the status of a system of vectors <S>: whether it is a basis, linearly independent, and/or a generating system
(swapping, scaling, adding scaled member vector)

left multiplication with an elementary matrix corresponds to a row operation
right multiplication with an elementary matrix corresponds to a column operation
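A small 2x2 demonstration of both facts, using a swap matrix and a scaling matrix (the example matrix A is arbitrary):

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2],
     [3, 4]]
E_swap = [[0, 1],
          [1, 0]]   # identity with its two rows swapped
E_scale = [[1, 0],
           [0, 5]]  # identity with row 2 scaled by 5

assert matmul(E_swap, A) == [[3, 4], [1, 2]]     # left mult: rows swapped
assert matmul(A, E_swap) == [[2, 1], [4, 3]]     # right mult: columns swapped
assert matmul(E_scale, A) == [[1, 2], [15, 20]]  # left mult: row 2 scaled
assert matmul(A, E_scale) == [[1, 10], [3, 20]]  # right mult: column 2 scaled
```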

any linearly independent set can be extended to a basis
the dimension of a vectorspace V is the number of elements of a basis of V

subvectorspace is a subset containing the zero vector that is closed under addition and scalar multiplication
if f:V->W is linear, then R(f) and N(f) are subvectorspaces
R(f) is range/image of f
N(f) is null/kernel of f
images and preimages of subvectorspaces are again subvectorspaces
rank of a map is the dimension of its range
nullity of a map is the dimension of its kernel
row-rank is the dimension of the span of the rows (as elements of F^n)
column-rank is the dimension of the span of the columns (as elements of F^m)
for above, row-rank equals column-rank equals rank of f
rank-nullity theorem: dim(V) = rank(f) + nullity(f)
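A sketch verifying rank-nullity for a map given by a 3x4 matrix (the matrix is an arbitrary example whose third row is the sum of the first two, so the rank is 2; rank is read off as the number of pivots in a Gauss-Jordan elimination over the rationals):

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form via Gauss-Jordan; returns (matrix, rank)."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(A[0])):
        pivot = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        r += 1
    return A, r

A = [[1, 0, -1, -1],
     [0, 1, -1, 0],
     [1, 1, -2, -1]]   # row 3 = row 1 + row 2
_, rank = rref(A)
nullity = 4 - rank      # rank-nullity: dim V = rank + nullity, dim V = 4
assert rank == 2 and nullity == 2
```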
 
zorn's lemma
order theory: posets with largest, maximal, smallest, and minimal elements

symmetric and alternating maps
find matrix of linear map

find range of linear map:
0) change basis of source, not basis of target
1) find reduced column echelon form of map
2) columns with pivot elements form a basis for the range of f
3) basis vectors should match dimension of target
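The steps above can be sketched in code: the reduced column echelon form is computed here as the transpose of the rref of the transpose (an equivalent route), and the nonzero columns form a basis of the range. The 3x3 example matrix of rank 2 is an assumption for illustration:

```python
from fractions import Fraction

def rref(M):
    """Reduced row echelon form via Gauss-Jordan over the rationals."""
    A = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(A[0])):
        pivot = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        r += 1
    return A

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[1, 0, -1],
     [2, 1, 0],
     [0, 1, 2]]  # hypothetical map R^3 -> R^3 of rank 2

# reduced COLUMN echelon form = transpose of the rref of the transpose
rcef = transpose(rref(transpose(A)))
range_basis = [col for col in transpose(rcef) if any(x != 0 for x in col)]
assert len(range_basis) == 2                  # rank of f
assert all(len(b) == 3 for b in range_basis)  # vectors lie in the target R^3
```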

find nullspace of linear map:
0) change basis of target, not basis of source
1) find reduced row echelon form of map
2) dimension of kernel is the number of columns that do not contain a pivot element
3) basis vectors should match dimension of source, with each one lying in the nullspace (i.e. rref(A)*(b)=0)
4) easiest way: let the entries of each basis vector corresponding to the non-pivot columns form an identity matrix, and let the remaining entries be the negatives of the entries of those non-pivot columns.
e.g. if rref(A) = [ 1 0 -1 -1
                    0 1 -1  0
                    0 0  0  0 ]
then columns 3 and 4 are non-pivot columns
and a basis is
(1   (1   <--- negatives of row 1 of columns 3 and 4
 1    0   <--- negatives of row 2 of columns 3 and 4
 1    0   <--- identity part for column 3
 0),  1)  <--- identity part for column 4
note that rref(A)*b = 0 for each basis vector b
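The recipe above, sketched in code on the same example matrix: identity entries in the free positions, negated free-column entries in the pivot positions, then a check that each basis vector really lies in the nullspace:

```python
R = [[1, 0, -1, -1],
     [0, 1, -1, 0],
     [0, 0, 0, 0]]   # rref(A) from the example above
n = 4

pivots = []
for row in R:
    for j, entry in enumerate(row):
        if entry != 0:
            pivots.append(j)
            break
free = [j for j in range(n) if j not in pivots]  # non-pivot columns: 2 and 3

basis = []
for j in free:
    v = [0] * n
    v[j] = 1                      # identity part in the free positions
    for i, p in enumerate(pivots):
        v[p] = -R[i][j]           # negatives of the free-column entries
    basis.append(v)

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

for b in basis:
    assert matvec(R, b) == [0, 0, 0]  # rref(A)*b = 0
print(basis)  # [[1, 1, 1, 0], [1, 0, 0, 1]]
```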
 
 
a map f:V1xV2->W is bilinear if it is linear in each coordinate:
f(v1+v1',v2) = f(v1,v2)+f(v1',v2)
f(v1,v2+v2') = f(v1,v2)+f(v1,v2')
f(lv1,v2) = lf(v1,v2)
f(v1,lv2) = lf(v1,v2)
a map f:V1x...xVn->W is multilinear (or n-linear) if it is linear in each coordinate
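A numeric check of linearity in each slot, using the standard dot product on R^3 as a familiar bilinear map (the sample vectors are arbitrary):

```python
# The standard dot product is bilinear: linear in each coordinate.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(c, v):
    return tuple(c * a for a in v)

u, u2, v, c = (1, 2, 3), (0, -1, 4), (5, 1, -2), 7
assert dot(add(u, u2), v) == dot(u, v) + dot(u2, v)  # additive in slot 1
assert dot(u, add(u2, v)) == dot(u, u2) + dot(u, v)  # additive in slot 2
assert dot(scale(c, u), v) == c * dot(u, v)          # homogeneous in slot 1
assert dot(u, scale(c, v)) == c * dot(u, v)          # homogeneous in slot 2
```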

dimension of the vectorspace of all multilinear maps V1x...xVn->W is dim(V1)*...*dim(Vn)*dim(W)
symmetric = swapping two entries of the input does not affect the function value
alternating = swapping two coordinates results in multiplying the output by -1 (equivalently, the map vanishes whenever some entry appears twice)

if V is n-dimensional, then the vectorspace of all alternating n-linear maps Vx...xV->F (n copies of V) is one-dimensional

det(GoF)=det(G)det(F)
det(GoF)=det(FoG)
a matrix is invertible iff det =/= 0
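A small numeric check of these determinant identities on 2x2 matrices (the two matrices are arbitrary examples):

```python
def det2(M):
    """Determinant of a 2x2 matrix: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

G = [[1, 2], [3, 4]]
F = [[0, 1], [1, 5]]
assert det2(matmul(G, F)) == det2(G) * det2(F)   # det(G∘F) = det(G)det(F)
assert det2(matmul(G, F)) == det2(matmul(F, G))  # det(G∘F) = det(F∘G)
assert det2(G) != 0   # so G is invertible
```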

eigenvectors
characteristic polynomial

effects of ERO/CRO on det???
review map calculations
finding matrix, finding map, finding range/null
 

