Gain the knowledge to build faster, better linear models in Python and to deploy the resulting models with uCertify's Regression Analysis with Python course. The course provides hands-on experience with concepts such as Regression – The Workhorse of Data Science, Approaching Simple Linear Regression, Multiple Regression in Action, Logistic Regression, Data Preparation, Achieving Generalization, and more.
Lessons
10+ Lessons | 52+ Exercises | 60+ Quizzes | 38+ Flashcards | 38+ Glossary of terms
TestPrep
35+ Pre Assessment Questions | 35+ Post Assessment Questions
Lesson 1: Preface
- What this course covers
- What you need for this course
- Who this course is for
- Conventions
Lesson 2: Regression – The Workhorse of Data Science
- Regression analysis and data science
- Python for data science
- Python packages and functions for linear models
- Summary
Lesson 3: Approaching Simple Linear Regression
- Defining a regression problem
- Starting from the basics
- Extending to linear regression
- Minimizing the cost function
- Summary
Lesson 4: Multiple Regression in Action
- Using multiple features
- Revisiting gradient descent
- Estimating feature importance
- Interaction models
- Polynomial regression
- Summary
Lesson 5: Logistic Regression
- Defining a classification problem
- Defining a probability-based approach
- Revisiting gradient descent
- Multiclass Logistic Regression
- An example
- Summary
Lesson 6: Data Preparation
- Numeric feature scaling
- Qualitative feature encoding
- Numeric feature transformation
- Missing data
- Outliers
- Summary
Lesson 7: Achieving Generalization
- Checking on out-of-sample data
- Greedy selection of features
- Regularization optimized by grid-search
- Stability selection
- Summary
Lesson 8: Online and Batch Learning
- Batch learning
- Online mini-batch learning
- Summary
Lesson 9: Advanced Regression Methods
- Least Angle Regression
- Bayesian regression
- SGD classification with hinge loss
- Regression trees (CART)
- Bagging and boosting
- Gradient Boosting Regressor with LAD
- Summary
Lesson 10: Real-world Applications for Regression Models
- Downloading the datasets
- A regression problem
- An imbalanced and multiclass classification problem
- A ranking problem
- A time series problem
- Summary
Hands-on Lab Activities (Performance Labs)
Approaching Simple Linear Regression
- Creating a One-Column Matrix Structure
- Visualizing the Distribution of Errors
- Plotting a Normal Distribution Graph
- Plotting a Scatterplot
- Standardizing a Variable
- Showing Regression Analysis Parameters
- Showing the Summary of Regression Analysis
- Printing the Residual Sum of Squared Errors
- Plotting Standardized Residuals
- Predicting with a Regression Model
- Regressing with Scikit-learn
- Using the fmin Minimization Procedure
- Finding Mean and Median
- Obtaining the Inverse of a Matrix
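The sketch below is a minimal illustration (not the course's lab code) of a few of these steps: building a one-column matrix, fitting a simple linear regression with scikit-learn, computing the residual sum of squares, and recovering the same coefficients through a matrix inverse. The synthetic data and variable names are assumptions for demonstration only.

```python
# Minimal sketch (illustrative only): simple linear regression on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=200)             # single predictor
y = 3.0 * x + 2.0 + rng.normal(0, 1.5, 200)  # linear signal plus noise

X = x.reshape(-1, 1)                         # one-column matrix structure
model = LinearRegression().fit(X, y)
predictions = model.predict(X)

print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("residual sum of squares:", np.sum((y - predictions) ** 2))

# The same coefficients via the normal equation (matrix inverse)
Xb = np.column_stack([np.ones_like(x), x])
beta = np.linalg.inv(Xb.T @ Xb) @ Xb.T @ y
print("normal-equation estimate:", beta)
```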
Multiple Regression in Action
- Printing Eigenvalues
- Visualizing the Correlation Matrix
- Obtaining the Correlation Matrix
- Standardizing Using the Scikit-learn Preprocessing Module
- Printing Standardized Coefficients
- Obtaining the R-squared Baseline
- Recording Coefficient of Determination Using R-squared
- Reporting All R-squared Increments Above 0.03
- Representing LSTAT Using the Scatterplot
- Testing Degree of a Polynomial
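As a rough companion to these labs, the following sketch (synthetic data, illustrative names) standardizes features with the scikit-learn preprocessing module, prints the correlation matrix and standardized coefficients, and reports the R-squared score.

```python
# Minimal sketch (illustrative only): multiple regression with standardized features.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))                       # three numeric features
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 300)

print("correlation matrix:\n", np.corrcoef(X, rowvar=False))

X_std = StandardScaler().fit_transform(X)           # zero mean, unit variance
model = LinearRegression().fit(X_std, y)
print("standardized coefficients:", model.coef_)    # comparable across features
print("R-squared:", r2_score(y, model.predict(X_std)))
```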
Logistic Regression
- Creating a Dummy Dataset
- Obtaining a Classification Report
- Representing a Confusion Matrix Using Heatmap
- Creating a Confusion Matrix
- Plotting the sigmoid Function
- Fitting a Multiple Linear Regressor
- Creating and Fitting a Logistic Regressor Classifier
- Obtaining the Feature Vector and its Original and Predicted Labels
- Visualizing Multiclass Logistic Regressor
- Creating a Dummy Four-Class Dataset
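A minimal sketch of the classification workflow these labs cover, assuming a dummy dataset generated with make_classification rather than the course's data: fit a logistic regression classifier, then print a confusion matrix and classification report.

```python
# Minimal sketch (illustrative only): logistic regression on a dummy dataset.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, confusion_matrix

X, y = make_classification(n_samples=500, n_features=5, n_informative=3,
                           n_classes=2, random_state=0)   # dummy dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression().fit(X_train, y_train)
y_pred = clf.predict(X_test)

print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))
```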
Data Preparation
- Centering the Variables
- Demonstrating the Logistic Regression
- Analyzing Qualitative Data Using Logit
- Transforming Qualitative Data
- Using LabelBinarizer
- Using the Hashing Trick
- Obtaining Residuals
- Replacing Missing Values With the Mean Value
- Representing Outliers Among Predictors
- Showing Outliers
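The sketch below illustrates a few of these preparation steps on a tiny made-up array (not course data): replacing missing values with the mean, centering the variables, and encoding a qualitative feature with LabelBinarizer.

```python
# Minimal sketch (illustrative only): mean imputation, centering, and encoding.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler, LabelBinarizer

X = np.array([[1.0, 200.0],
              [2.0, np.nan],        # missing value to be replaced by the mean
              [3.0, 260.0],
              [4.0, 240.0]])

X_imputed = SimpleImputer(strategy="mean").fit_transform(X)
X_centered = StandardScaler(with_std=False).fit_transform(X_imputed)  # centering only
print(X_centered)

colors = ["red", "green", "blue", "green"]            # qualitative feature
print(LabelBinarizer().fit_transform(colors))         # one column per category
```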
Achieving Generalization
- Splitting a Dataset
- Bootstrapping a Dataset
- Applying Third-Degree Polynomial Expansion
- Plotting the Distribution of Scores
- Demonstrating Working of Recursive Elimination
- Implementing L2 Regularization
- Performing Random Grid Search
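A short sketch, on synthetic data, of the generalization workflow these labs exercise: split the dataset, tune L2 (ridge) regularization with a randomized grid search, and check the score out of sample. The dataset and parameter range are illustrative assumptions.

```python
# Minimal sketch (illustrative only): L2 regularization tuned by random grid search.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split, RandomizedSearchCV

X, y = make_regression(n_samples=400, n_features=10, noise=10.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

search = RandomizedSearchCV(
    Ridge(),
    param_distributions={"alpha": list(np.logspace(-3, 3, 50))},
    n_iter=20, cv=5, random_state=1)
search.fit(X_train, y_train)

print("best alpha:", search.best_params_["alpha"])
print("out-of-sample R-squared:", search.score(X_test, y_test))
```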
Online and Batch Learning
- Demonstrating Mini-Batch Learning
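A minimal sketch of online mini-batch learning, assuming a synthetic regression dataset: SGDRegressor's partial_fit is called once per chunk, so the model is updated incrementally as data arrives.

```python
# Minimal sketch (illustrative only): online mini-batch learning with partial_fit.
from sklearn.datasets import make_regression
from sklearn.linear_model import SGDRegressor
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=5000, n_features=8, noise=5.0, random_state=2)
X = StandardScaler().fit_transform(X)       # SGD benefits from scaled inputs

model = SGDRegressor(random_state=2)
batch_size = 250
for start in range(0, len(X), batch_size):  # feed the data one mini-batch at a time
    stop = start + batch_size
    model.partial_fit(X[start:stop], y[start:stop])

print("coefficients after streaming:", model.coef_[:3])
```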
Advanced Regression Methods
- Obtaining LARS Coefficients
- Using Bayesian Regression
- Using the SGDClassifier Class With the hinge Loss
- Implementing SVR
- Implementing CART
- Implementing Random Forest Regressor
- Implementing Bagging
- Implementing Boosting
- Implementing Gradient Boosting Regressor with LAD
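To give a flavor of these methods, the sketch below (synthetic data, illustrative settings) fits a regression tree (CART) and a gradient boosting regressor with an absolute-error loss, which older scikit-learn releases call LAD, and compares their held-out scores.

```python
# Minimal sketch (illustrative only): CART vs. gradient boosting with LAD loss.
# Note: older scikit-learn spells this loss "lad" rather than "absolute_error".
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=600, n_features=6, noise=15.0, random_state=3)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=3)

cart = DecisionTreeRegressor(max_depth=4, random_state=3).fit(X_train, y_train)
gbr = GradientBoostingRegressor(loss="absolute_error",
                                n_estimators=200, random_state=3).fit(X_train, y_train)

print("CART R-squared:", cart.score(X_test, y_test))
print("Gradient boosting (LAD) R-squared:", gbr.score(X_test, y_test))
```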