Enroll in the Machine Learning Python course and lab to gain expertise in the processes, patterns, and strategies needed to build effective learning systems. The course develops the skills required to understand machine learning algorithms, models, and core concepts; to evaluate classifiers and regressors; and to follow the field's connections, extensions, and further directions. The study guide is equipped with learning resources to broaden your toolbox and explore some of the field's most sophisticated and exciting techniques.
Lessons
16+ Lessons | 44+ Exercises | 95+ Quizzes | 100+ Flashcards | 100+ Glossary Terms
TestPrep
55+ Pre-Assessment Questions | 55+ Post-Assessment Questions
Lesson 1: Let’s Discuss Learning
- Welcome
- Scope, Terminology, Prediction, and Data
- Putting the Machine in Machine Learning
- Examples of Learning Systems
- Evaluating Learning Systems
- A Process for Building Learning Systems
- Assumptions and Reality of Learning
- End-of-Lesson Material
Lesson 2: Some Technical Background
- About Our Setup
- The Need for Mathematical Language
- Our Software for Tackling Machine Learning
- Probability
- Linear Combinations, Weighted Sums, and Dot Products
- A Geometric View: Points in Space
- Notation and the Plus-One Trick
- Getting Groovy, Breaking the Straight-Jacket, and Nonlinearity
- NumPy versus “All the Maths”
- Floating-Point Issues
- EOC
Lesson 3: Predicting Categories: Getting Started with Classification
- Classification Tasks
- A Simple Classification Dataset
- Training and Testing: Don’t Teach to the Test
- Evaluation: Grading the Exam
- Simple Classifier #1: Nearest Neighbors, Long Distance Relationships, and Assumptions
- Simple Classifier #2: Naive Bayes, Probability, and Broken Promises
- Simplistic Evaluation of Classifiers
- EOC
Lesson 4: Predicting Numerical Values: Getting Started with Regression
- A Simple Regression Dataset
- Nearest-Neighbors Regression and Summary Statistics
- Linear Regression and Errors
- Optimization: Picking the Best Answer
- Simple Evaluation and Comparison of Regressors
- EOC
Lesson 5: Evaluating and Comparing Learners
- Evaluation and Why Less Is More
- Terminology for Learning Phases
- Major Tom, There’s Something Wrong: Overfitting and Underfitting
- From Errors to Costs
- (Re)Sampling: Making More from Less
- Break-It-Down: Deconstructing Error into Bias and Variance
- Graphical Evaluation and Comparison
- Comparing Learners with Cross-Validation
- EOC
Lesson 6: Evaluating Classifiers
- Baseline Classifiers
- Beyond Accuracy: Metrics for Classification
- ROC Curves
- Another Take on Multiclass: One-versus-One
- Precision-Recall Curves
- Cumulative Response and Lift Curves
- More Sophisticated Evaluation of Classifiers: Take Two
- EOC
Lesson 7: Evaluating Regressors
- Baseline Regressors
- Additional Measures for Regression
- Residual Plots
- A First Look at Standardization
- Evaluating Regressors in a More Sophisticated Way: Take Two
- EOC
Lesson 8: More Classification Methods
- Revisiting Classification
- Decision Trees
- Support Vector Classifiers
- Logistic Regression
- Discriminant Analysis
- Assumptions, Biases, and Classifiers
- Comparison of Classifiers: Take Three
- EOC
Lesson 9: More Regression Methods
- Linear Regression in the Penalty Box: Regularization
- Support Vector Regression
- Piecewise Constant Regression
- Regression Trees
- Comparison of Regressors: Take Three
- EOC
Lesson 10: Manual Feature Engineering: Manipulating Data for Fun and Profit
- Feature Engineering Terminology and Motivation
- Feature Selection and Data Reduction: Taking out the Trash
- Feature Scaling
- Discretization
- Categorical Coding
- Relationships and Interactions
- Target Manipulations
- EOC
Lesson 11: Tuning Hyperparameters and Pipelines
- Models, Parameters, Hyperparameters
- Tuning Hyperparameters
- Down the Recursive Rabbit Hole: Nested Cross-Validation
- Pipelines
- Pipelines and Tuning Together
- EOC
Lesson 12: Combining Learners
- Ensembles
- Voting Ensembles
- Bagging and Random Forests
- Boosting
- Comparing the Tree-Ensemble Methods
- EOC
Lesson 13: Models That Engineer Features for Us
- Feature Selection
- Feature Construction with Kernels
- Principal Components Analysis: An Unsupervised Technique
- EOC
Lesson 14: Feature Engineering for Domains: Domain-Specific Learning
- Working with Text
- Clustering
- Working with Images
- EOC
Lesson 15: Connections, Extensions, and Further Directions
- Optimization
- Linear Regression from Raw Materials
- Building Logistic Regression from Raw Materials
- SVM from Raw Materials
- Neural Networks
- Probabilistic Graphical Models
- EOC
Appendix A: mlwpy.py Listing
Hands-On Lab Activities (Performance Labs)
Some Technical Background
- Plotting a Probability Distribution Graph
- Using the zip Function
- Calculating the Sum of Squares
- Plotting a Line Graph
- Plotting a 3D Graph
- Plotting a Polynomial Graph
- Using the numpy.dot() Method
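The dot-product, zip, and sum-of-squares activities above can be previewed with a minimal sketch like the following; the feature values and weights are made-up illustration data, not drawn from the course materials.

```python
import numpy as np

features = np.array([2.0, 5.0, 1.0])
weights = np.array([0.5, 0.1, 2.0])

# A weighted sum written three equivalent ways
manual = sum(w * x for w, x in zip(weights, features))  # using the zip function
dot_method = np.dot(weights, features)                  # using numpy.dot()
at_operator = weights @ features                        # the matrix-multiply operator

print(manual, dot_method, at_operator)  # all three give 3.5

# The sum of squares of the feature values is a dot product with itself
print(np.dot(features, features))       # 2^2 + 5^2 + 1^2 = 30.0
```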
Predicting Categories: Getting Started with Classification
- Displaying Histograms
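As a hint of what the histogram lab involves, here is a minimal matplotlib sketch on randomly generated values; the data and bin count are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up feature values drawn from a normal distribution
values = np.random.default_rng(0).normal(loc=5.0, scale=1.5, size=200)

plt.hist(values, bins=20)          # one bar per bin of feature values
plt.xlabel("feature value")
plt.ylabel("count")
plt.show()
```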
Predicting Numerical Values: Getting Started with Regression
- Defining an Outlier
- Calculating the Median Value
- Estimating the Multiple Regression Equation
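A rough sketch of the median and multiple-regression activities above might look like this; the tiny dataset is invented for illustration and is not the course's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

y = np.array([3.1, 2.9, 3.3, 3.0, 9.5])   # 9.5 stands out as an outlier
print(np.median(y))                        # the median (3.1) is barely affected by it

# Estimating a multiple regression equation y ~ b0 + b1*x1 + b2*x2
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 6.0]])
model = LinearRegression().fit(X, y)
print(model.intercept_, model.coef_)       # estimated b0 and (b1, b2)
```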
Evaluating and Comparing Learners
- Constructing a Swarm Plot
- Using the describe() Method
- Viewing Variance
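The describe() and variance activities above can be sketched on a small, made-up table of cross-validation scores; the learner names and numbers are placeholders.

```python
import pandas as pd

scores = pd.DataFrame({"knn": [0.90, 0.88, 0.92, 0.89, 0.91],
                       "nb":  [0.85, 0.87, 0.83, 0.86, 0.84]})

print(scores.describe())   # count, mean, std, and quartiles for each learner
print(scores.var())        # per-column variance of the scores

# A swarm plot of the same scores could be drawn with seaborn, e.g.:
# import seaborn as sns; sns.swarmplot(data=scores)
```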
Evaluating Classifiers
- Creating a Confusion Matrix
- Creating an ROC Curve
- Recreating an ROC Curve
- Creating a Trendline Graph
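For the confusion-matrix and ROC-curve activities above, a minimal scikit-learn sketch (with made-up labels and scores rather than the course dataset) looks like this.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_curve, auc

y_true = np.array([0, 0, 1, 1, 0, 1, 1, 0])
y_pred = np.array([0, 1, 1, 1, 0, 0, 1, 0])
y_score = np.array([0.2, 0.6, 0.8, 0.7, 0.1, 0.4, 0.9, 0.3])  # scores for class 1

print(confusion_matrix(y_true, y_pred))   # rows are actual classes, columns are predictions

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(auc(fpr, tpr))                      # area under the ROC curve
# The curve itself: import matplotlib.pyplot as plt; plt.plot(fpr, tpr)
```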
Evaluating Regressors
- Viewing the Standard Deviation
- Constructing a Scatterplot
- Evaluating the Prediction Error Rates
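The standard-deviation and prediction-error activities above reduce to a few lines of NumPy and scikit-learn; the actual and predicted values below are invented for illustration.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

actual = np.array([3.0, 5.0, 7.0, 9.0])
predicted = np.array([2.5, 5.5, 6.0, 9.5])

print(np.std(actual))                                   # standard deviation of the targets
print(mean_absolute_error(actual, predicted))           # mean absolute error
print(np.sqrt(mean_squared_error(actual, predicted)))   # root mean squared error
# A residual scatterplot would put predicted values on x and (actual - predicted) on y.
```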
More Classification Methods
- Evaluating a Logistic Model
- Creating a Covariance Matrix
- Using the load_digits() Function
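A possible warm-up for the logistic-model, covariance-matrix, and load_digits() activities above; the train/test split and hyperparameters are illustrative defaults, not the course's choices.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()   # 8x8 images of handwritten digits
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=42)

clf = LogisticRegression(max_iter=5000).fit(X_train, y_train)
print(clf.score(X_test, y_test))              # accuracy of the logistic model

print(np.cov(X_train, rowvar=False).shape)    # feature-by-feature covariance matrix
```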
More Regression Methods
- Illustrating a Less Consistent Relationship
- Illustrating a Piecewise Constant Regression
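One way to illustrate piecewise constant regression is with a shallow regression tree, whose predictions are constant within each learned segment; the synthetic step data below is an assumption made for the sketch.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic data: a step function with a little noise
x = np.linspace(0, 10, 50).reshape(-1, 1)
y = np.where(x.ravel() < 5, 1.0, 3.0) + np.random.default_rng(0).normal(0, 0.1, 50)

tree = DecisionTreeRegressor(max_depth=1).fit(x, y)   # one split -> two constant pieces
print(tree.predict([[2.0], [8.0]]))   # roughly 1.0 below the split, roughly 3.0 above it
```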
Manual Feature Engineering: Manipulating Data for Fun and Profit
- Manipulating the Target
- Manipulating the Input Space
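Target and input-space manipulations often amount to simple transformations like those sketched below; the log-transform and standardization shown are generic examples, not the specific manipulations used in the lab.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])
y = np.array([10.0, 100.0, 1000.0])

X_scaled = StandardScaler().fit_transform(X)   # each input column now has mean 0, std 1
y_log = np.log10(y)                            # the manipulated target becomes 1, 2, 3
print(X_scaled)
print(y_log)
```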
Combining Learners
- Calculating the Mean Value
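Read in the context of combining learners, one natural way to sketch the mean-value activity above is averaging several models' predictions; the prediction arrays below are invented placeholders.

```python
import numpy as np

preds = np.array([[2.0, 4.0, 6.0],    # predictions from learner 1
                  [2.4, 3.6, 6.2],    # predictions from learner 2
                  [1.8, 4.2, 5.9]])   # predictions from learner 3

print(preds.mean(axis=0))             # the ensemble's averaged prediction for each example
```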
Models That Engineer Features for Us
- Displaying a Correlation Matrix
- Creating a Nonlinear Model
- Performing a Principal Component Analysis
- Using the Manifold Method
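The correlation-matrix, PCA, and manifold activities above have compact scikit-learn equivalents; the iris dataset and the two-component choice here are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

iris = load_iris()
print(np.corrcoef(iris.data, rowvar=False))   # 4x4 correlation matrix of the features

pca = PCA(n_components=2)
reduced = pca.fit_transform(iris.data)        # project onto the first two principal components
print(reduced.shape, pca.explained_variance_ratio_)

# A manifold method such as t-SNE is used in much the same way:
# from sklearn.manifold import TSNE; embedded = TSNE(n_components=2).fit_transform(iris.data)
```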
Feature Engineering for Domains: Domain-Specific Learning
- Encoding Text
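Encoding text typically starts from a bag-of-words representation; a minimal scikit-learn sketch with two made-up sentences is shown below.

```python
from sklearn.feature_extraction.text import CountVectorizer

docs = ["the cat sat on the mat",
        "the dog sat on the log"]

vectorizer = CountVectorizer()
bow = vectorizer.fit_transform(docs)          # bag-of-words counts, one row per document

print(vectorizer.get_feature_names_out())     # the learned vocabulary
print(bow.toarray())
```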
Connections, Extensions, and Further Directions
- Building an Estimated Simple Linear Regression Equation
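Building an estimated simple linear regression equation "from raw materials" comes down to computing a slope and intercept from means and summed deviations; the x and y values below are made up for the sketch.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares slope and intercept computed directly from the data
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()

print(f"y ~ {intercept:.2f} + {slope:.2f} * x")   # the estimated regression equation
```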