Learn to build decision trees for applied machine learning from scratch in Python.
What you’ll learn
- The most common decision tree algorithms
- Understand the core idea behind decision trees
- Developing code from scratch
- Applying ML for practical problems
- Bagging and Boosting
- Random Forest, Gradient Boosting
- Basic Python
Decision trees are one of the hottest topics in Machine Learning, and tree-based models dominate many Kaggle competitions today. Empower yourself for these challenges.
This course covers both the fundamentals of decision tree algorithms, such as CHAID, ID3, C4.5, CART, and Regression Trees, and their hands-on practical applications. We will also cover bagging and boosting methods, such as Random Forest and Gradient Boosting, which increase decision tree accuracy. Finally, we will look at tree-based frameworks such as LightGBM, XGBoost, and Chefboost.
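At the heart of algorithms like ID3 is the idea of choosing the split that reduces uncertainty the most. As a minimal illustration (the data and variable names below are invented for the example), this is entropy and information gain in plain Python:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def information_gain(labels, feature_values):
    """Reduction in entropy after splitting the labels by a feature's values."""
    total = len(labels)
    groups = {}
    for value, label in zip(feature_values, labels):
        groups.setdefault(value, []).append(label)
    weighted = sum(len(g) / total * entropy(g) for g in groups.values())
    return entropy(labels) - weighted

# Toy "play tennis"-style example: how much does knowing the outlook help?
labels  = ["no", "no", "yes", "yes", "yes", "no"]
outlook = ["sunny", "sunny", "overcast", "rain", "rain", "rain"]
print(round(information_gain(labels, outlook), 3))  # → 0.541
```

ID3 picks the feature with the highest information gain at each node; C4.5 refines this with the gain ratio to avoid favoring many-valued features.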
We will create our own decision tree framework from scratch in Python, and step-by-step exercises will guide you to a clear understanding of the concepts.
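To give a feel for what "from scratch" means here, the sketch below (not the course's actual framework; the toy data set is invented) grows an ID3-style tree on categorical features and uses it for prediction, all in standard-library Python:

```python
from collections import Counter
from math import log2

def entropy(labels):
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def build_tree(rows, labels, features):
    """Recursively grow an ID3-style tree on categorical features.

    Returns a class label for a leaf, or a dict
    {"feature": name, "branches": {value: subtree}} for an internal node.
    """
    if len(set(labels)) == 1:          # pure node: stop
        return labels[0]
    if not features:                   # no features left: majority vote
        return Counter(labels).most_common(1)[0][0]

    # pick the feature whose split yields the lowest weighted entropy
    def split_entropy(f):
        groups = {}
        for row, label in zip(rows, labels):
            groups.setdefault(row[f], []).append(label)
        return sum(len(g) / len(labels) * entropy(g) for g in groups.values())

    best = min(features, key=split_entropy)
    branches = {}
    for value in set(row[best] for row in rows):
        idx = [i for i, row in enumerate(rows) if row[best] == value]
        branches[value] = build_tree([rows[i] for i in idx],
                                     [labels[i] for i in idx],
                                     [f for f in features if f != best])
    return {"feature": best, "branches": branches}

def predict(tree, row):
    while isinstance(tree, dict):
        tree = tree["branches"][row[tree["feature"]]]
    return tree

# Toy training set: "windy" perfectly separates the classes here
rows = [{"outlook": "sunny",    "windy": "no"},
        {"outlook": "sunny",    "windy": "yes"},
        {"outlook": "overcast", "windy": "no"},
        {"outlook": "rain",     "windy": "no"},
        {"outlook": "rain",     "windy": "yes"}]
labels = ["yes", "no", "yes", "yes", "no"]

tree = build_tree(rows, labels, ["outlook", "windy"])
print(predict(tree, {"outlook": "sunny", "windy": "no"}))  # → yes
```

A real framework adds pruning, numeric features, missing-value handling, and alternative split criteria such as Gini (CART) or chi-square (CHAID), which is exactly what the course builds up to.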
This course appeals to anyone interested in Machine Learning, Data Science, and Data Mining.

Who this course is for:
- Anyone interested in Machine Learning
- Anyone curious about Data Mining
Serengil received his MSc in Computer Science from Galatasaray University in 2011.
He has been working as a software developer for a fintech company since 2010, where he is currently a Data Scientist on the AI and Machine Learning team.
His current research interests are Machine Learning, in particular applications of Deep Learning, and Cryptography, especially Elliptic Curve cryptosystems.
Serengil has contributed to many open source projects as well. The repositories he has published on GitHub have received hundreds of stars and forks, and thousands of installations.