>Scalars, vectors, matrices
>Addition & subtraction (vector, matrix)
>Multiplication (scalar, vector, matrix)
>Division (scalar, matrix)
>Signal processing, others...
Scalars & Vectors
• Scalar: fancy word for a number:
>e.g., 3.14159, or -42.
>A scalar can be an integer, a real, or a complex number.
• Vector: a collection of numbers:
>[age, height, weight] or [x, y, z, roll, pitch, yaw]
• Compact notation: p = [x, y, z, roll, pitch, yaw]
> So pi is the ith element; e.g., in the above, p2 = y.
• Ease of manipulation (examples later).
Visualize a vector
• Row vector: the elements written in a row, e.g., x = [x1, x2, x3]
• Column vector: the same elements stacked in a column.
• The transpose operator swaps the two:
>If x is a row vector, then x^T is the corresponding column vector (and vice versa).
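A quick sketch of these shapes (NumPy is used here purely as an illustration; the slides themselves are library-agnostic):

```python
import numpy as np

x = np.array([[1.0, 2.0, 3.0]])  # row vector: shape (1, 3)
x_col = x.T                      # transpose: column vector, shape (3, 1)

print(x.shape)       # (1, 3)
print(x_col.shape)   # (3, 1)
# transposing twice recovers the original: (x^T)^T = x
print(np.array_equal(x_col.T, x))  # True
```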
Operations: Scalar multiplication
• Vector multiplied by scalar:
>Multiply each vector element by the scalar.
• Scalar multiplication scales the length of
the vector by the magnitude of the scalar.
• If the scalar is negative, it also
reverses the direction of the vector.
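Both effects, scaled length and reversed direction, can be checked numerically (a NumPy sketch, not part of the original slides):

```python
import numpy as np

v = np.array([3.0, 4.0])   # ||v|| = 5
a = -2.0

av = a * v                 # each element multiplied by the scalar
# length scales by |a|: ||a v|| = |a| ||v|| = 2 * 5
print(np.linalg.norm(av))  # 10.0
# a negative scalar reverses direction: av points opposite to v
print(np.dot(av, v) < 0)   # True
```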
Addition of vectors
>Add two like-shape vectors (row & row, or column & column) element by element.
• Linear combinations of vectors:
>Example: u = c1v1 + c2v2
• Set of all linear combinations of a
set of vectors is the linear space
spanned by the set.
• A set of vectors is linearly
independent if none of the vectors
in the set can be written as a linear
combination of the other members
of the set.
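The span/independence idea can be demonstrated with a rank check (a NumPy illustration; the vectors below are made up for the example):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])

# a linear combination u = c1*v1 + c2*v2 lies in the span of {v1, v2}
u = 2.0 * v1 + 3.0 * v2

# {v1, v2, u} is linearly dependent: u is a combination of the others,
# so stacking the three vectors gives a matrix of rank 2, not 3
rank = np.linalg.matrix_rank(np.stack([v1, v2, u]))
print(rank)  # 2
```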
Inner product of vectors
• Also called the “dot product”:
> x•y = x1y1 + x2y2 + … + xnyn
• Maps two vectors to a scalar.
• Length (also called norm):
> x•x = x1x1 + x2x2 + … + xnxn
>x•x = ||x||^2
> ||x|| is the length of the vector.
> Note ||ax|| = |a| ||x||
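These identities are easy to verify numerically (NumPy sketch; the sample vector is arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 0.0, 1.0])

dot = np.dot(x, y)               # x1*y1 + x2*y2 + x3*y3 = 2 + 0 + 2
norm_sq = np.dot(x, x)           # x•x = ||x||^2 = 1 + 4 + 4
print(dot)                       # 4.0
print(norm_sq)                   # 9.0
print(np.linalg.norm(x))         # 3.0, the length ||x||
print(np.linalg.norm(-2.0 * x))  # 6.0, since ||ax|| = |a| ||x||
```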
Angle between two vectors
• Defined using the inner product:
> cos θ = (x•y) / (||x|| ||y||)
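For example, the angle between the x-axis and the diagonal direction (a NumPy sketch):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# cos(theta) = (u • v) / (||u|| ||v||)
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.degrees(np.arccos(cos_theta))
print(theta)  # approximately 45.0
```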
Example of compact notation
• Set of data: x = [x1,…,xn], y = [y1,…,yn]
• Correlation coefficient: subtract the means,
then r is the cosine of the angle between
the centered vectors:
> r = (x − x̄)•(y − ȳ) / (||x − x̄|| ||y − ȳ||)
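This “correlation as a cosine” view can be checked against NumPy's built-in routine (an illustration with made-up data, not from the slides):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])  # y = 2x: perfectly correlated

# center the data, then take the cosine of the angle between the vectors
xc = x - x.mean()
yc = y - y.mean()
r = np.dot(xc, yc) / (np.linalg.norm(xc) * np.linalg.norm(yc))
print(r)  # approximately 1.0

# agrees with NumPy's built-in correlation coefficient
print(np.corrcoef(x, y)[0, 1])
```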
• A basis for V is a set of vectors B in V that
span V and are linearly independent, so any
v in V can be written uniquely as a linear
combination of vectors in B.
• Orthogonal bases -> simple combinations.
> Orthogonal if u•v = 0, as cos θ = 0.
• The usual basis (of infinitely many) for R^3 is:
>B = { [1,0,0]^T, [0,1,0]^T, [0,0,1]^T }
> The coefficients of v using B are its coordinates.
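With an orthonormal basis like this one, the coordinates of v are just dot products (a NumPy sketch using the standard basis of R^3):

```python
import numpy as np

# the usual (orthonormal) basis for R^3
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

# pairwise orthogonal: all dot products are zero
print(np.dot(e1, e2), np.dot(e2, e3), np.dot(e1, e3))

# coordinates of v in this basis are its dot products with each basis vector
v = np.array([2.0, -1.0, 3.0])
coords = np.array([np.dot(v, e) for e in (e1, e2, e3)])
print(coords)  # for the standard basis, the coordinates equal the elements
```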
Projections of vectors
• Projection of v onto w:
> proj_w(v) = ((v•w) / (w•w)) w
>In 2 dimensions the scalar projection is x = ||v|| cos θ; in higher dimensions it is (v•w) / ||w||.
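The projection formula in code (NumPy illustration; the residual check confirms the leftover part is orthogonal to w):

```python
import numpy as np

v = np.array([3.0, 4.0])
w = np.array([1.0, 0.0])

# projection of v onto w: ((v • w) / (w • w)) w
proj = (np.dot(v, w) / np.dot(w, w)) * w
print(proj)  # the component of v along w

# the residual v - proj is orthogonal to w
print(np.dot(v - proj, w))  # 0.0
```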
Projection onto basis
• Projecting v onto each vector of an orthogonal basis gives the coordinates of v in that basis.
Matrices
• Matrices are arrays of numbers.
>Convenient, compact notation.
• Are operators, mapping between vector
spaces (e.g., change of basis or rotation).
• Some special names:
>Square has the same number of rows and columns.
>Diagonal has non-zero elements only on the main diagonal.
• If those elements are all = 1, call it the identity matrix I,
mapping a vector space to itself.
>Symmetric is square with M = M^T (i.e., mij = mji).
• If one considers a matrix as a set of row (or
column) vectors:
>Can multiply a matrix by a scalar.
>Can add conforming matrices (same number
of rows and columns) element by element.
• Multiply a vector by a matrix: u = Wv, with ui = Σj wij vj.
• Can concatenate mappings:
> If we have, u = Wv and v = Mz, then
u = W(Mz) = Pz
• W must have the same number of columns as
M has rows, so multiplying an r×s
matrix by an s×t matrix gives an r×t matrix.
>Associative and distributive, but not commutative.
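The concatenated mapping and the dimension rule in code (a NumPy sketch with small made-up matrices):

```python
import numpy as np

W = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])  # 2x3: maps R^3 -> R^2
M = np.array([[1.0, 1.0],
              [0.0, 2.0],
              [1.0, 0.0]])       # 3x2: maps R^2 -> R^3

z = np.array([1.0, 2.0])

# concatenated mappings: u = W(Mz) equals (WM)z by associativity
u1 = W @ (M @ z)
P = W @ M                 # a 2x3 times a 3x2 gives a 2x2 matrix
u2 = P @ z
print(P.shape)            # (2, 2)
print(np.allclose(u1, u2))  # True
```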
Inner and outer product
• Inner product: Maps two vectors to a scalar: x^T y.
• Outer product: Maps two vectors to a matrix: x y^T.
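NumPy has both operations built in (shown here as an illustration):

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

inner = np.inner(x, y)  # scalar: x^T y = 1*3 + 2*4
outer = np.outer(x, y)  # matrix: x y^T, with entries xi*yj
print(inner)            # 11.0
print(outer)            # [[3. 4.], [6. 8.]]
```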
• The inverse of M is that matrix which,
when multiplied by M, gives the identity
> M M^-1 = M^-1 M = I
• Use the inverse for the solution to n
simultaneous linear equations:
>If y = Ax, then x = A^-1 y.
• M^-1 may not exist; in that case M is called singular.
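Solving simultaneous equations this way, and spotting a singular matrix, looks like this in NumPy (an illustrative sketch):

```python
import numpy as np

# the system 2x1 + x2 = 3, x1 + 3x2 = 5, i.e., y = Ax
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([3.0, 5.0])

x = np.linalg.inv(A) @ y  # x = A^-1 y
print(x)                  # the solution of the two equations

# in practice np.linalg.solve is preferred (no explicit inverse)
print(np.allclose(np.linalg.solve(A, y), x))  # True

# a singular matrix has no inverse: its determinant is zero
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # second row is twice the first
print(np.isclose(np.linalg.det(S), 0.0))  # True
```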
Eigenvectors & eigenvalues
• The matrix mapping u = Wv takes the domain V to a range (the column space of W).
• Generally, elements in V are changed in
length and direction.
• Elements that are changed in length only are
called eigenvectors of W.
>If Wv = λv, then v is an eigenvector of W with eigenvalue λ.
• W can have more than one eigenvector; if an
n×n W has n linearly independent eigenvectors,
they form a basis for the space W acts on.
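The defining property Wv = λv can be verified for each eigenpair (a NumPy sketch using a small diagonal matrix, where the eigenvalues are simply the diagonal entries):

```python
import numpy as np

W = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# each eigenvector is only scaled by W: W v = lambda v
eigvals, eigvecs = np.linalg.eig(W)
print(eigvals)  # the eigenvalues 2 and 3
for lam, v in zip(eigvals, eigvecs.T):  # eigenvectors are the columns
    print(np.allclose(W @ v, lam * v))  # True for each eigenpair
```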