Vector Basics

Scalars

A scalar is a single number. Example: x ∈ R denotes that the scalar x is a member of the set of real numbers R.
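
As a quick illustration (a small added sketch, not from the original page), a scalar in Python is simply a single numeric value:

# A scalar is one numeric value; np.isscalar confirms this.
import numpy as np

x = 3.5                        # x ∈ R
print(np.isscalar(x))          # True
print(np.isscalar([2, 5, 1]))  # False: a list of numbers is not a scalar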

Vector

In machine learning, a vector is an ordered collection/array of numbers, where each element corresponds to a feature.

Example: [2, 5, 1] might describe an apple, where the first, second, and third values represent features such as size, color, and number of seeds, respectively.

In Python, a vector can be represented as a NumPy array.

from numpy import array
v = array([2, 5, 1])
print(v)
## Output
[2 5 1]

Vector Arithmetic

Two vectors of equal length can be added, subtracted, divided, or multiplied element-wise, producing a new vector of the same length.

If a = [a1, a2, a3] and b = [b1, b2, b3], the operations yield:

a) Addition: c = [a1+b1, a2+b2, a3+b3]

b) Subtraction: c = [a1-b1, a2-b2, a3-b3]

c) Division: c = [a1/b1, a2/b2, a3/b3]

d) Multiplication: c = [a1*b1, a2*b2, a3*b3]

# Vector Arithmetic
a = array([10, 20, 30])
b = array([5, 10, 15])

addition = a + b
subtraction = a - b
division = a / b
multiplication = a * b

# print results
print(addition)
print(subtraction)
print(division)
print(multiplication)
## Output
[15 30 45]
[ 5 10 15]
[2. 2. 2.]
[ 50 200 450]
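
The element-wise operations above assume both vectors have the same length. As a minimal added sketch (not from the original page), NumPy raises an error when the lengths differ and cannot be broadcast:

# Sketch: element-wise arithmetic requires compatible lengths.
from numpy import array

a = array([10, 20, 30])
b = array([5, 10])   # different length

try:
    print(a + b)
except ValueError as err:
    print(err)       # e.g. "operands could not be broadcast together ..."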

Vector Scalar Multiplication

A vector can be multiplied by a scalar value, which scales the magnitude of the vector.

If a = [a1, a2, a3] and s is a scalar, then

Vector Scalar Multiplication: c = s * a = [s*a1, s*a2, s*a3]

# Vector Scalar Multiplication
a = array([10, 20, 30])
s = 0.5

vsmulti = s * a
print(vsmulti)
## Output
[ 5. 10. 15.]

Vector Dot Product

The vector dot product is a single number obtained by summing the element-wise products of two vectors of the same length. It is named after the dot (period) operator used to denote it.

If a = [a1, a2, a3] and b = [b1, b2, b3] then

Vector Dot Product: c = a . b = (a1*b1 + a2*b2 + a3*b3)

Note: The dot product is an important tool for calculating vector projections, checking orthogonality, and more (see the short sketch at the end of this page).

# Vector Dot Product
a = array([1, 2, 3])
b = array([1, 2, 3])

dotProduct = a.dot(b)
print(dotProduct)
## Output
14

Alternate method: Vector Dot Product

import numpy as np
np.vdot(a, b)
## Output
14
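
As a follow-up to the note above, here is a minimal added sketch (vectors chosen arbitrarily) of the dot product used to check orthogonality: two vectors are orthogonal exactly when their dot product is zero.

# Sketch: orthogonality check via the dot product.
from numpy import array

u = array([1, 0])
w = array([0, 1])
print(u.dot(w))   # 0  -> u and w are orthogonal

p = array([2, 1])
q = array([1, 3])
print(p.dot(q))   # 5  -> non-zero, so p and q are not orthogonal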