# Vector Norms and Orthogonality


## Vector Spaces

A *vector space* is a set V of vectors together with two operations, + and **·**, called *vector addition* and *scalar multiplication*.

The operation + (**vector addition**) must satisfy the following conditions, where **u**, **v**, **w** are vectors and c, d are scalars:

- *Commutative law*: **u** + **v** = **v** + **u**
- *Associative law*: **u** + (**v** + **w**) = (**u** + **v**) + **w**
- *Additive identity*: **0** + **v** = **v** and **v** + **0** = **v**
- *Additive inverses*: **u** + (−**u**) = **0**

The operation **·** (**scalar multiplication**) must satisfy the following conditions:

- *Distributive law over vector sum*: c **·** (**u** + **v**) = c **·** **u** + c **·** **v**
- *Distributive law over scalar sum*: (c + d) **·** **v** = c **·** **v** + d **·** **v**
- *Associative law of scalar product*: c **·** (d **·** **v**) = (cd) **·** **v**
- *Unitary law*: 1 **·** **v** = **v**
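As a quick sanity check (illustrative, not a proof), NumPy arrays with the usual + and * satisfy all eight axioms. A minimal sketch with arbitrary example values:

```python
import numpy as np

u, v, w = np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.5, 4.0])
c, d = 2.0, -3.0

assert np.allclose(u + v, v + u)                # commutative law
assert np.allclose(u + (v + w), (u + v) + w)    # associative law
assert np.allclose(np.zeros(2) + v, v)          # additive identity
assert np.allclose(u + (-u), np.zeros(2))       # additive inverses
assert np.allclose(c * (u + v), c * u + c * v)  # distributive over vector sum
assert np.allclose((c + d) * v, c * v + d * v)  # distributive over scalar sum
assert np.allclose(c * (d * v), (c * d) * v)    # associative scalar product
assert np.allclose(1 * v, v)                    # unitary law
```

`np.allclose` is used instead of `==` to tolerate floating-point rounding.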

## Vector Norm

A vector norm is a non-negative number that describes the length, or magnitude, of a vector in space. There are several ways to calculate vector norms; the formulas below use the vector

$X =\begin{bmatrix}x_1 \\ x_2 \\ x_3 \end{bmatrix}$

### L1 Norm

The L1 norm is the sum of the absolute values of the vector's components:

$L^1 = |x_1| + |x_2| + |x_3|$

It is the Manhattan distance of the vector from the origin.

### L2 Norm

The L2 norm is the square root of the sum of the squared components:

$L^2 = \sqrt{x_1^2 + x_2^2 + x_3^2}$

It is the Euclidean distance of the vector from the origin, and is the norm most commonly used in machine learning.

### Max Norm

The max norm (also called the L∞ norm) is the largest absolute value among the vector's components:

$L^\infty = \max(|x_1|, |x_2|, |x_3|)$

## Calculating Norms Using NumPy
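All three norms are available through `np.linalg.norm` by varying its `ord` argument. A minimal sketch with an example vector:

```python
import numpy as np

x = np.array([1, -2, 3])

# L1 norm: sum of absolute values (Manhattan distance from the origin)
l1 = np.linalg.norm(x, 1)         # |1| + |-2| + |3| = 6.0

# L2 norm: square root of the sum of squares (Euclidean distance);
# ord=2 is the default
l2 = np.linalg.norm(x)            # sqrt(1 + 4 + 9) = sqrt(14)

# Max (L-infinity) norm: largest absolute component
linf = np.linalg.norm(x, np.inf)  # 3.0
```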

## Orthogonality

**Orthogonal**: Two vectors are **orthogonal** if they are perpendicular to each other, which is equivalent to their dot product being zero: **u · v** = 0.

Why: **u · v** = |**u**| |**v**| cos θ (the geometric definition of the dot product), and cos 90° = 0, so perpendicular vectors have a zero dot product.

**Orthonormal**: Two vectors are **orthonormal** if their dot product is zero and the norm (length) of each vector is 1:
**u · v** = 0 and |**u**| = 1, |**v**| = 1

In 3-D Euclidean space, using the **L2 norm** to measure length, the unit-length condition for **orthonormal** vectors **u** and **v** becomes:

$\sqrt{u_1^2 + u_2^2 + u_3^2} = 1 \quad \text{and} \quad \sqrt{v_1^2 + v_2^2 + v_3^2} = 1$

**Normal**: A vector is said to be normal to a surface or curve if it is perpendicular to it.

## Finding Orthogonality Using NumPy
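The definitions above translate directly into NumPy: check the dot product for orthogonality, then additionally check each L2 norm for orthonormality. A minimal sketch with two standard basis vectors as the example:

```python
import numpy as np

u = np.array([1, 0, 0])
v = np.array([0, 1, 0])

# Orthogonal: the dot product is zero
is_orthogonal = np.isclose(np.dot(u, v), 0.0)

# Orthonormal: orthogonal AND each vector has unit L2 norm
is_orthonormal = (is_orthogonal
                  and np.isclose(np.linalg.norm(u), 1.0)
                  and np.isclose(np.linalg.norm(v), 1.0))
```

`np.isclose` guards against floating-point rounding when the vectors are not exact integers.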

Links:

- What is a Vector Space? (Abstract Algebra)
- Gentle Introduction to Vector Norms in Machine Learning
- From Norm to Orthogonality
