Deep Learning Prerequisites: Linear Regression in Python
Data science: Learn linear regression from scratch and build your own working program in Python for data analysis.
This course teaches you one popular technique used in machine learning, data science, and statistics: linear regression. We cover the theory from the ground up: derivation of the solution, and applications to real-world problems. We show you how to code your own linear regression module in Python.
Linear regression is the simplest machine learning model you can learn, yet there is so much depth that you'll be returning to it for years to come. That's why it's a great introductory course if you're interested in taking your first steps in the fields of:
 deep learning
 machine learning
 data science
 statistics
In the first section, I will show you how to use 1D linear regression to prove that Moore's Law is true.
What's that you say? Moore's Law is not linear?
You are correct! I will show you how linear regression can still be applied.
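To give a flavor of the idea (a minimal sketch, not the course's own code — the transistor counts below are illustrative values, not real chip data): Moore's Law is exponential in time, so taking the logarithm of the transistor count turns it into a straight line that 1D linear regression can fit. The slope and intercept formulas are the standard closed-form least-squares solution.

```python
import numpy as np

# Hypothetical data: year vs. transistor count (illustrative values only).
years = np.array([1971, 1975, 1980, 1985, 1990, 1995, 2000], dtype=float)
transistors = np.array([2.3e3, 6.5e3, 3.0e4, 1.2e5, 1.2e6, 9.3e6, 4.2e7])

# Moore's Law is exponential, so take the log to make it linear:
#   log(transistors) = a * year + b
x, y = years, np.log(transistors)

# Closed-form 1D least-squares solution for slope a and intercept b.
denom = x.dot(x) - x.mean() * x.sum()
a = (x.dot(y) - y.mean() * x.sum()) / denom
b = (y.mean() * x.dot(x) - x.mean() * x.dot(y)) / denom

# Transistor counts double when a * t = log(2).
doubling_time = np.log(2) / a
print(f"estimated doubling time: {doubling_time:.1f} years")
```

With data resembling real transistor counts, the estimated doubling time comes out close to the roughly two years that Moore's Law predicts.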
In the next section, we will extend 1D linear regression to any-dimensional linear regression; in other words, how to create a machine learning model that can learn from multiple inputs.
We will apply multidimensional linear regression to predicting a patient's systolic blood pressure given their age and weight.
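As a sketch of what the multidimensional solution looks like (the ages, weights, and blood pressures below are made-up illustrative numbers, not the course's dataset), the weights can be found with the matrix normal equation w = (XᵀX)⁻¹Xᵀy:

```python
import numpy as np

# Hypothetical data: [age, weight (kg)] -> systolic blood pressure (illustrative only).
X = np.array([
    [30, 60.0],
    [40, 70.0],
    [50, 80.0],
    [60, 85.0],
    [70, 90.0],
], dtype=float)
y = np.array([115.0, 120.0, 130.0, 138.0, 145.0])

# Add a bias column of ones so the model learns an intercept.
Xb = np.column_stack([np.ones(len(X)), X])

# Normal-equation solution w = (X^T X)^(-1) X^T y,
# computed with solve() instead of an explicit matrix inverse.
w = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

# Predict blood pressure for age 45, weight 75 kg.
prediction = np.array([1.0, 45.0, 75.0]) @ w
print(prediction)
```

Using `np.linalg.solve` rather than inverting XᵀX directly is the numerically preferred way to evaluate the normal equation.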
Finally, we will discuss some practical machine learning issues that you want to be mindful of when you perform data analysis, such as generalization, overfitting, train-test splits, and so on.
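A minimal sketch of the train-test idea (an assumed example, not course material): fit models of different flexibility on a training subset and compare their errors on held-out test data. A model that tracks the training points too closely tends to score worse on data it has never seen.

```python
import numpy as np

# Synthetic data: the true relationship is linear, plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
y = 2 * x + rng.normal(scale=0.3, size=50)

# Train/test split: fit on one subset, evaluate on the held-out rest.
idx = rng.permutation(50)
train, test = idx[:40], idx[40:]

mses = {}
for degree in (1, 9):  # a modest model vs. an overly flexible polynomial
    coeffs = np.polyfit(x[train], y[train], degree)
    mses[degree] = np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2)
    print(f"degree {degree}: test MSE = {mses[degree]:.3f}")
```

The key point is that the test MSE, not the training fit, is the honest estimate of generalization error.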
This course does not require any external materials. Everything needed (Python, and some Python libraries) can be obtained for FREE.
If you are a programmer and you want to enhance your coding abilities by learning about data science, then this course is for you. If you have a technical or mathematical background, and you want to know how to apply your skills as a software engineer or "hacker", this course may be useful.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
Your Instructor
Hi! We are UpDegree, a provider of high-quality interactive courses.
We are a team of 100+ instructors from around the globe. Adel (Oxford), Anand (Stanford), Akash (IITB), Rajib (IITK), and Partha (IITM) are a few of them.
UpDegree courses are different: in each and every course you not only get video lectures, but also quizzes, code, and assignments to test your practical understanding!
Course Curriculum

Introduction and Outline (7:22)
Define the model in 1D, derive the solution (Updated Version) (4:33)
Define the model in 1D, derive the solution (7:29)
Define the model in 1D, derive the solution (8:24)
Determine how good the model is: r-squared (5:56)
R-squared in code (5:59)
Demonstrating Moore's Law in Code (3:28)
Define the multidimensional problem and derive the solution (Updated Version) (7:52)
Define the multidimensional problem and derive the solution (1:33)
How to solve multiple linear regression using only matrices (4:45)
Coding the multidimensional solution in Python (11:57)
Polynomial regression: extending linear regression (with Python code) (4:28)
Predicting Systolic Blood Pressure from Age and Weight (4:05)
What do all these letters mean? (4:22)
Interpreting the Weights (14:49)
Generalization error, train and test sets (3:42)
L1 Regularization: Theory (1:40)
L2 Regularization: Theory (3:00)
L1 vs L2 Regularization (9:02)
The Dummy Variable Trap (7:51)
Gradient Descent Tutorial (19:09)
Gradient Descent for Linear Regression (23:20)
Bypass the Dummy Variable Trap with Gradient Descent (3:51)
Categorical inputs (5:43)
Probabilistic Interpretation of Squared Error (30:04)