Matches in DBpedia 2016-04 for { <http://wikidata.dbpedia.org/resource/Q7309628> ?p ?o }
Showing triples 1 to 46 of 46, with 100 triples per page.
- Q7309628 subject Q7035969.
- Q7309628 subject Q7211696.
- Q7309628 subject Q8826201.
- Q7309628 abstract "Regularization perspectives on support vector machines provide a way of interpreting support vector machines (SVMs) in the context of other machine learning algorithms. SVM algorithms categorize multidimensional data, with the goal of fitting the training set data well while also avoiding overfitting, so that the solution generalizes to new data points. Regularization algorithms likewise aim to fit training set data and avoid overfitting. They do this by choosing a fitting function that has low error on the training set but is also not too complicated, where complicated functions are functions with high norms in some function space. Specifically, Tikhonov regularization algorithms choose a function that minimizes the sum of the training set error and the function's norm. The training set error can be calculated with different loss functions. For example, regularized least squares is a special case of Tikhonov regularization using the squared error loss as the loss function. Regularization perspectives on support vector machines interpret SVM as a special case of Tikhonov regularization, specifically Tikhonov regularization with the hinge loss as the loss function. This provides a theoretical framework with which to analyze SVM algorithms and compare them to other algorithms with the same goal: to generalize without overfitting. SVM was first proposed in 1995 by Corinna Cortes and Vladimir Vapnik, and framed geometrically as a method for finding hyperplanes that can separate multidimensional data into two categories. This traditional geometric interpretation of SVMs provides useful intuition about how SVMs work, but is difficult to relate to other machine learning techniques for avoiding overfitting, such as regularization, early stopping, sparsity and Bayesian inference.
However, once it was discovered that SVM is also a special case of Tikhonov regularization, regularization perspectives on SVM provided the theory necessary to fit SVM within a broader class of algorithms. This has enabled detailed comparisons between SVM and other forms of Tikhonov regularization, and has provided theoretical grounding for why it is beneficial to use SVM's loss function, the hinge loss.".
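The abstract's claim — that SVM is Tikhonov regularization with the hinge loss — can be made concrete with a minimal sketch. Assuming a linear model and toy 1D data (everything below the fence is illustrative, not from the source), subgradient descent on the regularized hinge-loss objective recovers a linear SVM:

```python
def train_linear_svm(points, labels, lam=0.01, lr=0.1, epochs=200):
    """Minimize the Tikhonov objective with hinge loss:
        (1/n) * sum_i max(0, 1 - y_i * (w.x_i + b)) + lam * ||w||^2
    via subgradient descent. A minimal sketch, not a production solver;
    swapping the hinge term for squared error would give regularized
    least squares instead, as the abstract notes."""
    dim = len(points[0])
    w = [0.0] * dim
    b = 0.0
    n = len(points)
    for _ in range(epochs):
        gw = [2 * lam * wj for wj in w]  # gradient of the norm penalty
        gb = 0.0
        for x, y in zip(points, labels):
            margin = y * (sum(wj * xj for wj, xj in zip(w, x)) + b)
            if margin < 1:  # hinge loss active: subgradient is -y * x
                for j in range(dim):
                    gw[j] -= y * x[j] / n
                gb -= y / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

# Toy separable data: negatives near -2, positives near +2.
X = [[-2.0], [-1.5], [1.5], [2.0]]
Y = [-1, -1, 1, 1]
w, b = train_linear_svm(X, Y)
predict = lambda x: 1 if w[0] * x + b >= 0 else -1
```

The `lam` parameter trades off training error against the norm of `w`, which is exactly the complexity penalty the abstract describes.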
- Q7309628 thumbnail Hinge_and_Misclassification_Loss.png?width=300.
- Q7309628 wikiPageExternalLink svmlight.joachims.org.
- Q7309628 wikiPageExternalLink v=onepage&q=vapnik%20the%20nature%20of%20statistical%20learning%20theory&f=false.
- Q7309628 wikiPageExternalLink evgeniou-reviewall.pdf.
- Q7309628 wikiPageWikiLink Q1036748.
- Q7309628 wikiPageWikiLink Q1050404.
- Q7309628 wikiPageWikiLink Q11348.
- Q7309628 wikiPageWikiLink Q12485.
- Q7309628 wikiPageWikiLink Q13222616.
- Q7309628 wikiPageWikiLink Q170084.
- Q7309628 wikiPageWikiLink Q182505.
- Q7309628 wikiPageWikiLink Q1940696.
- Q7309628 wikiPageWikiLink Q2061913.
- Q7309628 wikiPageWikiLink Q2539.
- Q7309628 wikiPageWikiLink Q2778212.
- Q7309628 wikiPageWikiLink Q282453.
- Q7309628 wikiPageWikiLink Q2890061.
- Q7309628 wikiPageWikiLink Q319913.
- Q7309628 wikiPageWikiLink Q322339.
- Q7309628 wikiPageWikiLink Q331309.
- Q7309628 wikiPageWikiLink Q3345678.
- Q7309628 wikiPageWikiLink Q3985153.
- Q7309628 wikiPageWikiLink Q44337.
- Q7309628 wikiPageWikiLink Q4440864.
- Q7309628 wikiPageWikiLink Q5326898.
- Q7309628 wikiPageWikiLink Q5767098.
- Q7309628 wikiPageWikiLink Q657586.
- Q7309628 wikiPageWikiLink Q7035969.
- Q7309628 wikiPageWikiLink Q7211696.
- Q7309628 wikiPageWikiLink Q7233207.
- Q7309628 wikiPageWikiLink Q7314263.
- Q7309628 wikiPageWikiLink Q7604400.
- Q7309628 wikiPageWikiLink Q812535.
- Q7309628 wikiPageWikiLink Q8366.
- Q7309628 wikiPageWikiLink Q8826201.
- Q7309628 wikiPageWikiLink Q92916.
- Q7309628 wikiPageWikiLink Q934367.
- Q7309628 wikiPageWikiLink Q956437.
- Q7309628 wikiPageWikiLink Q983367.
- Q7309628 comment "Regularization perspectives on support vector machines provide a way of interpreting support vector machines (SVMs) in the context of other machine learning algorithms. SVM algorithms categorize multidimensional data, with the goal of fitting the training set data well, but also avoiding overfitting, so that the solution generalizes to new data points. Regularization algorithms also aim to fit training set data and avoid overfitting.".
- Q7309628 label "Regularization perspectives on support vector machines".
- Q7309628 depiction Hinge_and_Misclassification_Loss.png.