Mathematics for ML & DS Specialization
Whether you’re a novice or an experienced professional, this curated specialization provides a comprehensive, hands-on grounding in the mathematics behind Machine Learning and Data Science.
Skills You Will Gain
Vectors and Matrices
Matrix product
Linear Transformations
Rank, Basis, and Span
Eigenvectors and Eigenvalues
Derivatives
Gradients
Optimization
Gradient Descent
Gradient Descent in Neural Networks
Newton’s Method
Probability
Random Variables
Bayes' Theorem
Gaussian Distribution
Variance and Covariance
Sampling and Point Estimates
Maximum Likelihood Estimation
This course includes
- 1:1 Sessions
- 100% Placement Assistance
- 12 Weeks
- Real-Time Project Training
Syllabus Overview
Mathematics for ML & DS Specialization
Mastering the Mathematical Foundations: A Comprehensive Course in Mathematics for Machine Learning and Data Science
Linear Algebra for Machine Learning and Data Science:
Week 1: Systems of Linear Equations
Lesson 1: Systems of Linear Equations: two variables
- Machine learning motivation
- Systems of sentences
- Systems of equations
- Systems of equations as lines
- A geometric notion of singularity
- Singular vs nonsingular matrices
- Linear dependence and independence
- The determinant
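To make the Lesson 1 ideas concrete, here is a minimal sketch (illustrative only, assuming NumPy; not part of the official course materials) that computes a 2×2 determinant, checks singularity, and solves a non-singular system:

```python
import numpy as np

# A 2x2 system: 2x + 3y = 8 and 4x - y = 2
A = np.array([[2.0, 3.0],
              [4.0, -1.0]])
b = np.array([8.0, 2.0])

det = np.linalg.det(A)          # zero determinant means the matrix is singular
if abs(det) > 1e-12:
    x = np.linalg.solve(A, b)   # a non-singular system has a unique solution
    print("non-singular, solution:", x)   # [1. 2.]
else:
    print("singular: the rows are linearly dependent, no unique solution")
```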
Lesson 2: Systems of Linear Equations: three variables
- Systems of equations (3×3)
- Singular vs non-singular (3×3)
- Systems of equations as planes (3×3)
- Linear dependence and independence (3×3)
- The determinant (3×3)
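The same checks carry over to the 3×3 case; the short sketch below (assuming NumPy, with made-up numbers) detects linear dependence among the three coefficient rows via the determinant and the rank:

```python
import numpy as np

# The third row is the sum of the first two, so the rows are linearly dependent.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [1.0, 3.0, 7.0]])

print(np.linalg.det(A))          # ~0: the matrix is singular
print(np.linalg.matrix_rank(A))  # 2 (< 3): the three planes have no unique intersection point
```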
Week 2: Solving Systems of Linear Equations
Lesson 1: Solving Systems of Linear Equations: Elimination
- Machine learning motivation
- Solving non-singular systems of linear equations
- Solving singular systems of linear equations
- Solving systems of equations with more variables
- Matrix row-reduction
- Row operations that preserve singularity
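As a rough illustration of the elimination ideas in this lesson, here is a simplified sketch of forward elimination with partial pivoting (an assumed NumPy implementation, not the course's reference code); it reduces an augmented matrix [A | b] to row echelon form using only row operations that preserve singularity:

```python
import numpy as np

def forward_eliminate(aug):
    """Reduce an augmented matrix [A | b] to row echelon form."""
    aug = aug.astype(float).copy()
    rows, cols = aug.shape
    pivot_row = 0
    for col in range(cols - 1):                 # the last column holds b
        # Partial pivoting: pick the largest entry in this column.
        pivot = pivot_row + np.argmax(np.abs(aug[pivot_row:, col]))
        if np.isclose(aug[pivot, col], 0.0):
            continue                            # no pivot in this column (singular direction)
        aug[[pivot_row, pivot]] = aug[[pivot, pivot_row]]   # swap rows
        for r in range(pivot_row + 1, rows):
            aug[r] -= (aug[r, col] / aug[pivot_row, col]) * aug[pivot_row]
        pivot_row += 1
        if pivot_row == rows:
            break
    return aug

system = np.array([[ 2.0,  1.0, -1.0,   8.0],
                   [-3.0, -1.0,  2.0, -11.0],
                   [-2.0,  1.0,  2.0,  -3.0]])
print(forward_eliminate(system))
```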
Lesson 2: Solving Systems of Linear Equations: Row Echelon Form and Rank
- The rank of a matrix
- The rank of a matrix in general
- Row echelon form
- Row echelon form in general
- Reduced row echelon form
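A brief sketch of rank and reduced row echelon form in code (assuming NumPy and SymPy; the matrix is an arbitrary example):

```python
import numpy as np
from sympy import Matrix

rows = [[1, 2, 3],
        [2, 4, 6],
        [1, 0, 1]]

print(np.linalg.matrix_rank(np.array(rows)))   # 2: the second row is twice the first

rref, pivot_columns = Matrix(rows).rref()      # reduced row echelon form
print(rref)                                    # Matrix([[1, 0, 1], [0, 1, 1], [0, 0, 0]])
print(pivot_columns)                           # (0, 1)
```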
Week 3: Vectors and Linear Transformations
Lesson 1: Vectors
- Norm of a vector
- Sum and difference of vectors
- Distance between vectors
- Multiplying a vector by a scalar
- The dot product
- Geometric Dot Product
- Multiplying a matrix by a vector
- Lab: Vector Operations: Scalar Multiplication, Sum and Dot Product of Vectors
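The lab above maps directly onto NumPy; a minimal, illustrative version of those vector operations might look like this:

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
w = np.array([3.0, 0.0, 4.0])

print(np.linalg.norm(v))       # norm of a vector: sqrt(1 + 4 + 4) = 3.0
print(v + w, v - w)            # sum and difference of vectors
print(np.linalg.norm(v - w))   # distance between vectors
print(3 * v)                   # multiplying a vector by a scalar
print(np.dot(v, w))            # dot product: 1*3 + 2*0 + 2*4 = 11.0

M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])
print(M @ v)                   # multiplying a matrix by a vector: [5. 4.]
```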
Lesson 2: Linear transformations
- Matrices as linear transformations
- Linear transformations as matrices
- Matrix multiplication
- The identity matrix
- Matrix inverse
- Which matrices have an inverse?
- Neural networks and matrices
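A small sketch (illustrative, assuming NumPy) of matrices acting as linear transformations, plus the identity and the inverse:

```python
import numpy as np

R = np.array([[0.0, -1.0],      # rotates vectors by 90 degrees
              [1.0,  0.0]])
S = np.array([[2.0,  0.0],      # scales x by 2 and y by 3
              [0.0,  3.0]])

print(R @ np.array([1.0, 0.0]))   # applying the transformation to a vector: [0. 1.]
print(S @ R)                      # composing transformations = matrix multiplication
print(np.eye(2))                  # the identity matrix leaves every vector unchanged
print(np.linalg.inv(R))           # the inverse undoes the transformation
print(np.linalg.det(R))           # only matrices with nonzero determinant have an inverse
```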
Week 4: Determinants and Eigenvectors
Lesson 1: Determinants In-depth
- Machine Learning Motivation
- Singularity and rank of linear transformation
- Determinant as an area
- Determinant of a product
- Determinants of inverses
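Two of the determinant facts listed above are easy to sanity-check numerically (an illustrative sketch, assuming NumPy):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[0.0, 1.0],
              [4.0, 2.0]])

# det(AB) = det(A) * det(B)
print(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
# det(A^-1) = 1 / det(A)
print(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A))
# |det(A)| is also the factor by which the transformation scales areas.
```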
Lesson 2: Eigenvalues and Eigenvectors
- Bases in Linear Algebra
- Span in Linear Algebra
- Interactive visualization: Linear Span
- Eigenbases
- Eigenvalues and eigenvectors
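A minimal sketch of computing eigenvalues and eigenvectors (assuming NumPy; the matrix is an arbitrary example) and verifying the defining property A v = λ v:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors are the columns
print(eigenvalues)                             # 5.0 and 2.0 (in some order)

for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))         # True: A only stretches v by lam
```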
Probability & Statistics for Machine Learning & Data Science:
Week 7: Introduction to Probability and Random Variables
Lesson 1: Introduction to probability
- Concept of probability: repeated random trials
- Conditional probability and independence
- Discriminative learning and conditional probability
- Bayes' theorem
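Bayes' theorem from this lesson can be illustrated with a classic diagnostic-test calculation (the numbers below are made up for illustration):

```python
# P(D | +) = P(+ | D) * P(D) / P(+)
p_disease = 0.01               # prior P(D)
p_pos_given_disease = 0.95     # sensitivity P(+ | D)
p_pos_given_healthy = 0.05     # false-positive rate P(+ | not D)

p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))               # total probability P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos   # posterior P(D | +)
print(p_disease_given_pos)     # about 0.16: a positive test is far from a sure diagnosis
```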
Lesson 2: Random variables
- Random variables
- Cumulative distribution function
- Discrete random variables: Bernoulli distribution
- Discrete random variables: Binomial distribution
- Probability mass function
- Continuous random variables: Uniform distribution
- Continuous random variables: Gaussian distribution
- Continuous random variables: Chi-squared distribution
- Probability density function
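A short sketch of the named distributions using SciPy (illustrative parameter values, not prescribed by the course):

```python
from scipy import stats

print(stats.bernoulli.pmf(1, p=0.3))              # Bernoulli(0.3): P(X = 1)
print(stats.binom.pmf(2, n=10, p=0.3))            # Binomial(10, 0.3): P(X = 2)
print(stats.uniform.pdf(0.5))                     # Uniform(0, 1) density at 0.5
print(stats.norm.pdf(0.0), stats.norm.cdf(1.96))  # Gaussian density and CDF
print(stats.chi2.cdf(3.84, df=1))                 # Chi-squared CDF, about 0.95
```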
Week 8-9: Describing Distributions and Random Vectors
Lesson 1: Describing distributions
- Measures of central tendency: mean, median, mode
- Expected values
- Quantiles and box-plots
- Measures of dispersion: variance, standard deviation
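These summary statistics correspond directly to NumPy calls; a minimal sketch on made-up data:

```python
import numpy as np

data = np.array([2.0, 3.0, 3.0, 5.0, 7.0, 8.0, 12.0])

print(np.mean(data), np.median(data))          # measures of central tendency
print(np.quantile(data, [0.25, 0.5, 0.75]))    # quantiles behind a box-plot
print(np.var(data), np.std(data))              # measures of dispersion
```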
Lesson 2: Random vectors
- Joint distributions
- Marginal and conditional distributions
- Independence
- Measures of relatedness: covariance
- Multivariate normal distribution
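A sketch of the random-vector ideas above (assuming NumPy; the covariance matrix is an arbitrary example): sample from a multivariate normal and recover its covariance and correlation from the samples:

```python
import numpy as np

rng = np.random.default_rng(0)
mean = np.array([0.0, 1.0])
cov = np.array([[1.0, 0.8],
                [0.8, 2.0]])                    # covariance matrix

samples = rng.multivariate_normal(mean, cov, size=10_000)
print(np.cov(samples.T))        # sample covariance is close to the true covariance
print(np.corrcoef(samples.T))   # correlation: a scale-free measure of relatedness
```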
Week 10-11: Introduction to Statistics
Lesson 1: Sampling and point estimates
- Population vs. sample
- Describing samples: sample proportion and sample mean
- Distribution of sample mean and proportion: Central Limit Theorem
- Point estimates
- Biased vs Unbiased estimates
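A quick simulation of the Central Limit Theorem and of the sample mean as a point estimate (illustrative only, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
population = rng.exponential(scale=2.0, size=100_000)   # a skewed population

# Distribution of the sample mean: approximately Gaussian around the
# population mean, even though the population itself is skewed.
sample_means = [rng.choice(population, size=50).mean() for _ in range(2_000)]
print(population.mean())       # true mean, about 2.0
print(np.mean(sample_means))   # the sample mean is an unbiased point estimate
print(np.std(sample_means))    # roughly the population standard deviation / sqrt(50)
```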
Lesson 2: Maximum likelihood estimation
- ML motivation example: Linear Discriminant Analysis
- Likelihood
- Intuition behind maximum likelihood estimation
- MLE: How to get the maximum using calculus
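For Gaussian data, the calculus step mentioned above gives closed-form estimates; a minimal sketch (assuming NumPy, simulated data):

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=5.0, scale=2.0, size=1_000)

# Setting the derivative of the log-likelihood to zero yields:
mu_hat = data.mean()                        # MLE of the mean
sigma2_hat = np.mean((data - mu_hat)**2)    # MLE of the variance (divides by n, not n-1)
print(mu_hat, sigma2_hat)                   # close to 5 and 4
```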
Lesson 3: Bayesian statistics
- ML motivation example: Naive Bayes
- Frequentist vs. Bayesian statistics
- A priori and a posteriori distributions
- Bayesian estimators: posterior mean, posterior median, MAP
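A compact Beta-Binomial sketch of the Bayesian estimators named above (assuming SciPy; the prior and the data are made up):

```python
from scipy import stats

# Coin-flip example: Beta(2, 2) prior, then observe 7 heads in 10 flips.
heads, tails = 7, 3
prior_a, prior_b = 2, 2
posterior = stats.beta(prior_a + heads, prior_b + tails)   # a posteriori distribution

print(posterior.mean())      # posterior mean estimator, about 0.64
print(posterior.median())    # posterior median estimator
print((prior_a + heads - 1) / (prior_a + prior_b + heads + tails - 2))   # MAP, about 0.67
```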
Week 12: Interval Statistics and Hypothesis Testing
Lesson 1: Confidence intervals
- Margin of error
- Interval estimation
- Confidence interval for the mean of a population
- CI for parameters in linear regression
- Prediction Interval
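A minimal sketch of a 95% confidence interval for a population mean (assuming NumPy and SciPy, simulated data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sample = rng.normal(loc=10.0, scale=3.0, size=40)

mean = sample.mean()
sem = stats.sem(sample)                                   # standard error of the mean
margin = stats.t.ppf(0.975, df=len(sample) - 1) * sem     # margin of error
print(mean - margin, mean + margin)                       # 95% CI for the population mean
```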
Lesson 2: Hypothesis testing
- ML Motivation: A/B Testing
- Criminal trial
- Two types of errors
- Test for proportion and means
- Two sample inference for difference between groups
- ANOVA
- Power of a test
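To tie the testing topics together, here is an illustrative two-sample test on simulated A/B data, plus a one-way ANOVA for more than two groups (assuming NumPy and SciPy):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
control = rng.normal(loc=0.10, scale=0.05, size=200)   # e.g. conversion metric, group A
variant = rng.normal(loc=0.12, scale=0.05, size=200)   # group B in an A/B test

t_stat, p_value = stats.ttest_ind(variant, control)    # two-sample t-test
print(t_stat, p_value)
if p_value < 0.05:
    print("reject the null hypothesis: the groups differ")
else:
    print("fail to reject the null hypothesis")

group_c = rng.normal(loc=0.11, scale=0.05, size=200)
print(stats.f_oneway(control, variant, group_c))       # one-way ANOVA across three groups
```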
Transform Your Skills: Enroll Now to Master the Mathematics of ML & DS
Join us and unlock the potential of intelligent systems with our Machine Learning courses. Enroll now to take the first step towards a future powered by data-driven intelligence.