DEVrepublik

Visit our upcoming webinars

Hierarchical Clustering and Association Rules

Registration

Fee: 650 UAH

Online

Hierarchical clustering is an alternative approach to k-means clustering for identifying groups in a dataset. Unlike k-means, it does not require us to pre-specify the number of clusters to be generated. Let's talk in more detail.
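As a small preview, here is a minimal sketch of agglomerative (hierarchical) clustering with SciPy; the tiny synthetic dataset is our own and purely illustrative:

import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
# Two loose 2D blobs standing in for real data.
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(5, 1, (20, 2))])

# Build the full merge tree (Ward linkage); no cluster count is needed here.
Z = linkage(X, method="ward")

# Only when cutting the tree do we decide how many groups we want.
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)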

Loss Functions

Registration

Fee: 650 UAH

Online

The loss function is the bread and butter of modern machine learning: it takes your algorithm from theoretical to practical and transforms neural networks from glorified matrix multiplication into deep learning. Let's talk in more detail.
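To make that concrete, here is a minimal sketch, written for illustration only, of two common loss functions in plain NumPy:

import numpy as np

def mse(y_true, y_pred):
    # Mean squared error: the workhorse regression loss.
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    # Log loss for binary classification; eps guards against log(0).
    p = np.clip(p_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

print(mse(np.array([1.0, 2.0]), np.array([1.5, 1.5])))               # 0.25
print(binary_cross_entropy(np.array([1, 0]), np.array([0.9, 0.2])))  # ~0.16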

Support Vector Machine

Registration

Fee: 650 UAH

Online

“Support Vector Machine” (SVM) is a supervised machine learning algorithm that can be used for both classification and regression challenges. Let's talk in more detail.
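As a quick taste, here is a minimal sketch of an SVM classifier with scikit-learn on the built-in iris dataset (our illustration, not the webinar material):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# The RBF kernel is the default; C controls the margin vs. misclassification trade-off.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))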

Time Series Analysis

Registration

Fee: 650 UAH

Online

Time series analysis provides us with a robust statistical framework for assessing the behaviour of time series, such as asset prices, so that we can trade on that behaviour. Let's talk in more detail.
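As a small preview, here is a minimal sketch that fits an AR(1)-style model with statsmodels to a synthetic series we generate ourselves, standing in for a real price or return series:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
y = np.zeros(200)
for t in range(1, 200):
    # Synthetic AR(1) process with coefficient 0.7.
    y[t] = 0.7 * y[t - 1] + rng.normal()

model = ARIMA(y, order=(1, 0, 0)).fit()
print(model.params)       # estimated constant, AR(1) coefficient, noise variance
print(model.forecast(5))  # five-step-ahead forecast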

Linear Regression and the Math Behind It

Registration

Fee: 650 UAH

Online

We are going to talk about linear regression, one of the best-known and best-understood algorithms in machine learning. We will focus on simple linear regression, which has only one input variable, but the same logic and analysis extend to multivariate linear regression. Let's talk in more detail.
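To preview the math, here is a minimal sketch of the closed-form least-squares solution for simple linear regression, y = b0 + b1·x, on synthetic data generated purely for illustration:

import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 100)
y = 3.0 + 2.0 * x + rng.normal(0, 1, 100)  # true intercept 3, true slope 2

# Ordinary least squares: b1 = cov(x, y) / var(x), b0 = mean(y) - b1 * mean(x)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
print(b0, b1)  # should land close to 3 and 2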

Boosting, Part 1: AdaBoost

Registration

Fee: 650 UAH

Online

AdaBoost (adaptive boosting) is an ensemble learning algorithm that can be used for classification or regression. Although AdaBoost is more resistant to overfitting than many machine learning algorithms, it is often sensitive to noisy data and outliers. Let's talk in more detail.
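As a quick preview, here is a minimal sketch of AdaBoost in scikit-learn on a synthetic dataset (purely illustrative):

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 100 weak learners; each round re-weights the examples the previous ones got wrong.
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))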

Boosting, Part 2: Gradient Boosting and XGBoost

Registration

Fee: 650 UAH

Online

Gradient boosting is another technique for supervised machine learning tasks such as classification and regression. XGBoost stands for Extreme Gradient Boosting; it is a specific implementation of the gradient boosting method that uses more accurate approximations to find the best tree model. It employs a number of nifty tricks that make it exceptionally successful, particularly with structured data. Let's talk in more detail.
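As a small preview, here is a minimal sketch of an XGBoost classifier on scikit-learn's breast-cancer dataset; it assumes the xgboost package is installed, and the hyperparameters are illustrative rather than recommendations:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each new tree is fit to the gradient of the loss with respect to the
# current ensemble's predictions.
model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))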

Model Deployment Using Flask

Registration

Fee: 650 UAH

Online

Your model is ready, and now it is time to deploy it to production. Here we will show you how to do it using Flask. Let's talk in more detail.
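As a small preview, here is a minimal sketch of a Flask prediction endpoint; the file name model.pkl is a hypothetical placeholder for whatever estimator you have pickled:

import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Assumed: a previously trained and pickled estimator saved as model.pkl.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON such as {"features": [[5.1, 3.5, 1.4, 0.2]]}.
    features = request.get_json()["features"]
    prediction = model.predict(features)
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)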

Linear Regression Using Python

Registration

Fee: 650 UAH

Online

Linear regression is usually the first machine learning algorithm that every data scientist comes across. It is a simple model, but everyone needs to master it because it lays the foundation for other machine learning algorithms. Let's talk in more detail.
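As a quick preview, here is a minimal sketch of linear regression with scikit-learn on synthetic data generated purely for illustration:

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X = rng.uniform(0, 10, (100, 1))
y = 3.0 + 2.0 * X[:, 0] + rng.normal(0, 1, 100)  # true intercept 3, true slope 2

model = LinearRegression()
model.fit(X, y)
print("intercept:", model.intercept_)  # close to 3
print("slope:", model.coef_[0])        # close to 2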