Support Vector Machines find optimal hyperplanes to separate classes in data

From a summary of "Machine Learning" by Stephen Marsland

Support Vector Machines are a type of machine learning algorithm that classifies data by finding the optimal hyperplane separating the classes. This hyperplane is the decision boundary, and it is chosen to maximize the margin: the distance from the boundary to the closest data points of each class. Those closest points are called support vectors, because they alone determine where the boundary lies. Maximizing the margin yields a boundary that not only separates the training classes but also tends to generalize well to unseen data.

When the data is not linearly separable, Support Vector Machines use a technique called the kernel trick to map the data into a higher-dimensional space where it becomes linearly separable. A linear hyperplane in that space corresponds to a non-linear decision boundary in the original space, so non-linear class structure can still be captured.

Support Vector Machines handle high-dimensional data efficiently and effectively, and they are widely used in applications such as image recognition, text classification, and bioinformatics. By finding the optimal hyperplane to separate classes in data, they provide a powerful tool for building accurate and robust machine learning models.
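
The sketch below illustrates these ideas in code. It is a minimal example, assuming scikit-learn and a toy two-moons dataset (neither is mentioned in the summary): a linear SVM maximizes the margin with a straight boundary, while an RBF-kernel SVM uses the kernel trick to separate classes that no straight line can split.

import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy dataset (an assumption for illustration): two interleaving half-moons,
# which are not linearly separable in the original two-dimensional space.
X, y = make_moons(n_samples=200, noise=0.15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear SVM: finds the maximum-margin straight decision boundary.
linear_svm = SVC(kernel="linear", C=1.0).fit(X_train, y_train)

# RBF-kernel SVM: the kernel trick implicitly maps the data into a
# higher-dimensional space where a linear separator exists.
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)

print("linear accuracy:", linear_svm.score(X_test, y_test))
print("rbf accuracy:   ", rbf_svm.score(X_test, y_test))

# The support vectors are the training points that define the margin.
print("support vectors (rbf):", rbf_svm.support_vectors_.shape[0])

On data like this, the RBF-kernel model typically scores noticeably higher than the linear one, which is the practical payoff of the kernel trick described above.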