神经网络-.ppt (Neural Networks.ppt)

Published: 2017-03-24 · approx. 5,030 characters · 20 pages
Introduction to Support Vector Machine (SVM)

Introduction -- SVM classification

Support vector machines (SVMs), developed by Vapnik, have gained wide acceptance because of their high generalization ability across a wide range of applications. There are many variations of SVM, including the soft-margin classifier, the adaptive-margin classifier, and so on. Even when the two classes on either side of the margin overlap slightly and the data are noisy, all variants share the common property that the constructed hyperplane effectively separates the two classes.

Linear classifier

A linear classifier assigns a point x the label f(x, w, b) = sign(w·x + b). The decision boundary is the hyperplane w·x + b = 0: points with w·x + b > 0 are labeled +1, and points with w·x + b < 0 are labeled -1.

[Figures: a sequence of slides plots two classes of points (+1 and -1) and repeatedly asks "How would you classify this data?" Several candidate separating lines are drawn; any of them would be fine, but which is best? One poorly chosen line misclassifies a point into the +1 class.]

Margin

Define the margin of a linear classifier as the width by which the decision boundary could be widened before hitting a datapoint.

Linear SVM

The maximum-margin linear classifier is the linear classifier with the maximum margin. This is the simplest kind of SVM, called a linear SVM (LSVM). The support vectors are the datapoints that the margin pushes up against. Maximizing the margin is good according to both intuition and PAC theory, and it implies that only the support vectors are important; the other training examples are ignorable.
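The linear decision rule f(x, w, b) = sign(w·x + b) described above can be sketched in a few lines of NumPy. The weight vector, bias, and toy data here are illustrative values, not anything from the original slides:

```python
import numpy as np

def linear_classify(X, w, b):
    """Linear decision rule f(x) = sign(w.x + b), returning +1 or -1.

    Points on the boundary (w.x + b == 0) are assigned to +1 by convention.
    """
    return np.where(X @ w + b >= 0, 1, -1)

# Toy 2-D data: class +1 lies above the line x1 + x2 = 0, class -1 below it.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
w = np.array([1.0, 1.0])  # normal vector of the separating hyperplane
b = 0.0                   # bias term

print(linear_classify(X, w, b))  # [ 1  1 -1 -1]
```

Any (w, b) pair with the same decision boundary gives the same labels; the SVM's job, as the slides go on to explain, is to pick the particular (w, b) that maximizes the margin.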