- Python on Windows: https://phoenixnap.com/kb/how-to-install-python-3-windows
- Python on Ubuntu: https://phoenixnap.com/kb/how-to-install-python-3-ubuntu
- Python on macOS: https://flaviocopes.com/python-installation-macos/
- Anaconda on Windows: https://docs.anaconda.com/anaconda/install/windows/
- Anaconda on Linux: https://docs.anaconda.com/anaconda/install/linux/
- Anaconda on macOS: https://docs.anaconda.com/anaconda/install/mac-os
- A beginner-friendly and easy-to-follow video: https://youtu.be/tRZGeaHPoaw?si=06GZKYd83iAvLx8A
- Cheat sheet for future reference: https://education.github.com/git-cheat-sheet-education.pdf
- A short 8-min video covering almost everything you will need: https://youtu.be/2JE66WFpaII?si=5eDA-wD6sj0Xv86M
- Cheat sheet for future reference: https://www.markdownguide.org/cheat-sheet/
- A quick introduction: https://realpython.com/jupyter-notebook-introduction/
- Python: https://www.youtube.com/playlist?list=PL-osiE80TeTskrapNbzXhwoFUiLCjGgY7 (Videos 2-10)
- Pandas: https://www.w3schools.com/python/pandas/default.asp (till cleaning data)
- Matplotlib: https://towardsdatascience.com/matplotlib-tutorial-learn-basics-of-pythons-powerful-plotting-library-b5d1b8f67596
- Complete Tutorial (Highly Recommended for beginners): https://cs231n.github.io/python-numpy-tutorial/
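The NumPy parts of the tutorials above centre on vectorized array operations. A minimal sketch of the idea on a made-up 2-D array (no loops, everything elementwise):

```python
import numpy as np

# A small 2-D array to demonstrate vectorized operations,
# the core idea behind NumPy's speed.
a = np.array([[1.0, 2.0], [3.0, 4.0]])

col_means = a.mean(axis=0)   # mean of each column
centered = a - col_means     # broadcasting subtracts the means row by row
squared = centered ** 2      # elementwise power, still no loops
```

The same pattern (reduce along an axis, then broadcast back) shows up constantly in data preprocessing.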
- Introduction to ML: https://towardsdatascience.com/what-is-machine-learning-how-i-explain-the-concept-to-a-newcomer-d96f35a5c4f3
- Supervised and Unsupervised ML: https://www.youtube.com/watch?v=xtOg44r6dsE
- Andrew Ng Course: https://youtube.com/playlist?list=PLkDaE6sCZn6FNC6YRfRQc_FbeQrF8BwGI&si=b9WaafbzNVJTP2EK #9-#20
- Summary: https://www.geeksforgeeks.org/ml-linear-regression/
- Implementation from Scratch: https://towardsdatascience.com/coding-linear-regression-from-scratch-c42ec079902
- Loss Functions for Regression: https://heartbeat.comet.ml/5-regression-loss-functions-all-machine-learners-should-know-4fb140e9d4b0
- Gradient Descent in detail: https://medium.com/geekculture/mathematics-behind-gradient-descent-f2a49a0b714f
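The from-scratch and gradient descent links above boil down to a very short loop. A minimal sketch: batch gradient descent minimising mean squared error for simple linear regression, on toy data generated from y = 2x + 1:

```python
import numpy as np

# Toy data with a known linear relationship: y = 2x + 1
X = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * X + 1

w, b = 0.0, 0.0   # slope and intercept to learn
lr = 0.05         # learning rate

for _ in range(2000):
    y_hat = w * X + b
    error = y_hat - y
    # Gradients of the mean squared error with respect to w and b
    dw = 2 * np.mean(error * X)
    db = 2 * np.mean(error)
    w -= lr * dw
    b -= lr * db
```

After the loop, `w` and `b` recover the slope 2 and intercept 1 to high precision; a learning rate that is too large would make the updates diverge instead.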
- Normal Equation: https://youtu.be/pRSqKgwOd5k?si=fZ95wn2zx3u9LwuY
- Proof of Normal Equation: https://www.geeksforgeeks.org/ml-normal-equation-in-linear-regression/
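The normal equation gives the same least-squares solution in closed form, theta = (X^T X)^(-1) X^T y. A small sketch on toy data; `np.linalg.solve` is used instead of an explicit inverse, which is the numerically safer way to evaluate the textbook formula:

```python
import numpy as np

# Toy data from y = 3x + 4
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 3 * x + 4

# Design matrix with a bias column of ones, so theta[0] is the intercept
X = np.column_stack([np.ones_like(x), x])

# Normal equation: solve (X^T X) theta = X^T y
theta = np.linalg.solve(X.T @ X, X.T @ y)
intercept, slope = theta
```

No learning rate and no iterations, but inverting/solving the system costs roughly O(n^3) in the number of features, which is why gradient descent is preferred for large feature counts.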
- Multiple Linear Regression & Polynomial Regression: https://youtube.com/playlist?list=PLkDaE6sCZn6FNC6YRfRQc_FbeQrF8BwGI&si=b9WaafbzNVJTP2EK #21-#30
- Scikit-Learn Library:
- Handling missing data: https://www.freecodecamp.org/news/how-to-handle-missing-data-in-a-dataset/
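Two standard ways of handling missing values covered in the link above, sketched with pandas on a made-up column: drop the incomplete rows, or impute with a statistic such as the column mean:

```python
import numpy as np
import pandas as pd

# Toy column with one missing value
df = pd.DataFrame({"age": [25.0, np.nan, 35.0, 40.0]})

dropped = df.dropna()                  # option 1: remove incomplete rows
filled = df.fillna(df["age"].mean())   # option 2: impute with the mean
```

Dropping loses data; imputing keeps it but distorts the distribution slightly, so the right choice depends on how much is missing and why.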
- IQR method for dealing with outliers: https://youtu.be/A3gClkblXK8?si=DWVqjzkLePYg3qf0
- Box Plot: https://www.geeksforgeeks.org/what-is-box-plot-and-the-condition-of-outliers/
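The IQR rule from these links, sketched in NumPy: values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] (the region beyond the box-plot whiskers) are flagged as outliers:

```python
import numpy as np

# Toy data with one obvious outlier
data = np.array([10, 12, 12, 13, 12, 11, 14, 13, 15, 100])

q1, q3 = np.percentile(data, [25, 75])   # first and third quartiles
iqr = q3 - q1
lower = q1 - 1.5 * iqr
upper = q3 + 1.5 * iqr
outliers = data[(data < lower) | (data > upper)]
```

Here only the value 100 falls outside the fences, matching what a box plot of the same data would show.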
- Correlation Matrix: https://youtu.be/1fFVt4tQjRE?si=V12tPp0Bs2jUOyjJ and https://www.geeksforgeeks.org/exploring-correlation-in-python/
- One-hot encoding for categorical data: https://www.educative.io/blog/one-hot-encoding
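A minimal one-hot encoding sketch with pandas `get_dummies` on a made-up column; each category becomes its own 0/1 indicator column (scikit-learn's `OneHotEncoder` is the other common route):

```python
import pandas as pd

# Toy categorical column
df = pd.DataFrame({"colour": ["red", "green", "blue", "green"]})

# One indicator column per category: colour_blue, colour_green, colour_red
encoded = pd.get_dummies(df, columns=["colour"])
```

This avoids imposing a fake ordering on categories, at the cost of one extra column per distinct value.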
- Feature Scaling & Normalisation: https://www.geeksforgeeks.org/ml-feature-scaling-part-2/
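The two scaling schemes covered in the link above, sketched in NumPy on a toy vector: min-max scaling maps values into [0, 1], standardisation maps them to zero mean and unit variance:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Min-max scaling into [0, 1]
minmax = (x - x.min()) / (x.max() - x.min())

# Standardisation: zero mean, unit (population) standard deviation
standardised = (x - x.mean()) / x.std()
```

In practice the scaling parameters (min/max or mean/std) are computed on the training set only and reused on the test set, otherwise information leaks between the splits.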
- Andrew Ng Course: https://youtube.com/playlist?list=PLkDaE6sCZn6FNC6YRfRQc_FbeQrF8BwGI&si=b9WaafbzNVJTP2EK #31-#36
- Detailed Explanation with intuition: https://philippmuens.com/logistic-regression-from-scratch
- Summary: https://towardsdatascience.com/introduction-to-logistic-regression-66248243c148
- Implementation from scratch: https://pub.towardsai.net/logistic-regression-from-scratch-with-only-python-code-9d3ae607e739
- Implementation using sklearn library: https://www.educative.io/answers/how-to-implement-logistic-regression-using-the-scikit-learn-kit
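The from-scratch route in the links above boils down to a sigmoid plus gradient descent on the log loss. A minimal sketch on a toy 1-D dataset where the class flips between x = 2 and x = 3:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 1-D dataset, linearly separable around x = 2.5
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0, 0, 0, 1, 1, 1])

w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    p = sigmoid(w * X + b)        # predicted probabilities
    # Gradient of the mean log loss has the same simple form as for MSE
    dw = np.mean((p - y) * X)
    db = np.mean(p - y)
    w -= lr * dw
    b -= lr * db

preds = (sigmoid(w * X + b) >= 0.5).astype(int)
```

The learned decision boundary sits near x = 2.5 by symmetry, so all six toy points are classified correctly.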
- Andrew Ng playlist: https://youtube.com/playlist?list=PLkDaE6sCZn6FNC6YRfRQc_FbeQrF8BwGI&si=b9WaafbzNVJTP2EK #37-#41
- Overfitting: https://www.geeksforgeeks.org/underfitting-and-overfitting-in-machine-learning/
- Regularisation technique to prevent overfitting: https://www.datacamp.com/tutorial/towards-preventing-overfitting-regularization
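One concrete way to see L2 regularisation is ridge regression in closed form, theta = (X^T X + lambda*I)^(-1) X^T y; the penalty shrinks the weights, which is one standard guard against overfitting. A sketch on toy data (for simplicity the intercept is penalised too, which real libraries normally avoid):

```python
import numpy as np

# Toy data from y = 3x + 4
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 3 * x + 4
X = np.column_stack([np.ones_like(x), x])   # bias column + feature

lam = 1.0                                    # regularisation strength
I = np.eye(X.shape[1])

# Ridge solution vs. plain least squares
theta_ridge = np.linalg.solve(X.T @ X + lam * I, X.T @ y)
theta_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

The ridge weights have a smaller norm than the unregularised ones; increasing `lam` shrinks them further toward zero, trading a little bias for less variance.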
- Video explanation: https://youtu.be/LbX4X71-TFI?si=kgTfnlMe-8-ngsrY
- Blog explanation: https://neptune.ai/blog/performance-metrics-in-machine-learning-complete-guide
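The core classification metrics from these links can be computed by hand from confusion-matrix counts. A sketch on made-up labels and predictions:

```python
# Made-up ground truth and model predictions
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

# Confusion-matrix counts
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives

precision = tp / (tp + fp)               # of predicted positives, how many are real
recall = tp / (tp + fn)                  # of real positives, how many were found
f1 = 2 * precision * recall / (precision + recall)   # harmonic mean of the two
```

Accuracy alone is misleading on imbalanced data, which is why precision, recall, and F1 are usually reported together.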
- Video Explanation: https://youtu.be/CQveSaMyEwM?si=-efMXsl6UeknpRJx
- Blog Explanation: https://www.javatpoint.com/k-nearest-neighbor-algorithm-for-machine-learning
- Finding optimal value of k: https://www.geeksforgeeks.org/how-to-find-the-optimal-value-of-k-in-knn/
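k-NN is short enough to sketch from scratch: predict the majority label among the k training points closest in Euclidean distance. A toy 2-D example with two well-separated clusters:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote of its k nearest neighbours."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]               # indices of the k closest
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority vote

# Toy data: class 0 near (1, 1), class 1 near (8, 8)
X_train = np.array([[1, 1], [1, 2], [2, 1], [8, 8], [8, 9], [9, 8]])
y_train = np.array([0, 0, 0, 1, 1, 1])

pred = knn_predict(X_train, y_train, np.array([1.5, 1.5]), k=3)
```

An odd k avoids ties in binary problems; sweeping k against validation accuracy is the standard way to pick it, as the last link above describes.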
- Basic terminology of trees: https://www.programiz.com/dsa/trees
- Decision trees classification explained: https://www.youtube.com/watch?v=ZVR2Way4nwQ
- Entropy: https://www.analyticsvidhya.com/blog/2020/11/entropy-a-key-concept-for-all-data-science-beginners/
- Information gain and Gini index: https://medium.com/analytics-steps/understanding-the-gini-index-and-information-gain-in-decision-trees-ab4720518ba8
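Entropy and Gini impurity, the two split criteria discussed above, sketched in NumPy. Both are zero for a pure node and maximal for a 50/50 split, which is why a tree prefers splits that drive them down:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(labels):
    """Gini impurity: probability of mislabelling a random draw."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

mixed = [0, 0, 1, 1]   # worst case for a binary node
pure = [1, 1, 1, 1]    # perfectly pure node
```

Information gain is then just the parent's entropy minus the weighted entropy of the children a candidate split produces.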
- Implementation from scratch: https://www.youtube.com/watch?v=sgQAhG5Q7iY and https://towardsdatascience.com/decision-tree-in-machine-learning-e380942a4c96
- Sklearn library documentation: https://scikit-learn.org/stable/modules/tree.html
- A brief summary: https://towardsdatascience.com/decision-trees-in-machine-learning-641b9c4e8052
- Everything explained: https://neptune.ai/blog/ensemble-learning-guide
- More on bagging and boosting: https://youtu.be/sN5ZcJLDMaE?si=IkOe83jPES8QYgBF
- Random Forests explained in more detail: https://www.youtube.com/watch?v=J4Wdy0Wc_xQ&ab_channel=StatQuestwithJoshStarmer
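The bagging idea behind random forests, sketched without training real models: draw bootstrap samples with replacement, fit one model per sample, then aggregate predictions by majority vote. The per-model votes below are made up purely to show the aggregation step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bootstrap sampling: each sample has the same size as the data,
# drawn with replacement, so some points repeat and others are left out.
data = np.arange(10)
samples = [rng.choice(data, size=data.size, replace=True) for _ in range(3)]

# Aggregation: majority vote over the ensemble's predictions.
# Rows are models, columns are examples (illustrative values only).
votes = np.array([[0, 1, 1],
                  [1, 1, 0],
                  [0, 0, 0]])
majority = (votes.sum(axis=0) >= 2).astype(int)
```

A random forest adds one more twist on top of bagging: each tree also considers only a random subset of features at every split, which further decorrelates the trees.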