Matches in DBpedia 2016-04 for { <http://wikidata.dbpedia.org/resource/Q4874475> ?p ?o }
Showing triples 1 to 28 of 28, with 100 triples per page.
- Q4874475 subject Q5639957.
- Q4874475 subject Q7015116.
- Q4874475 subject Q7612002.
- Q4874475 abstract "In machine learning, kernel methods arise from the assumption of an inner product space or similarity structure on inputs. For some such methods, such as support vector machines (SVMs), the original formulation and its regularization were not Bayesian in nature. It is helpful to understand them from a Bayesian perspective. Because the kernels are not necessarily positive semidefinite, the underlying structure may not be inner product spaces, but instead more general reproducing kernel Hilbert spaces. In Bayesian probability, kernel methods are a key component of Gaussian processes, where the kernel function is known as the covariance function. Kernel methods have traditionally been used in supervised learning problems where the input space is usually a space of vectors while the output space is a space of scalars. More recently these methods have been extended to problems that deal with multiple outputs such as in multi-task learning. In this article we analyze the connections between the regularization and the Bayesian point of view for kernel methods in the case of scalar outputs. A mathematical equivalence between the regularization and the Bayesian point of view is easily proved in cases where the reproducing kernel Hilbert space is finite-dimensional. The infinite-dimensional case raises subtle mathematical issues; we will consider here the finite-dimensional case. We start with a brief review of the main ideas underlying kernel methods for scalar learning, and briefly introduce the concepts of regularization and Gaussian processes. We then show how both points of view arrive at essentially equivalent estimators, and show the connection that ties them together.".
- Q4874475 wikiPageWikiLink Q1149000.
- Q4874475 wikiPageWikiLink Q1409400.
- Q4874475 wikiPageWikiLink Q1496376.
- Q4874475 wikiPageWikiLink Q17100712.
- Q4874475 wikiPageWikiLink Q190056.
- Q4874475 wikiPageWikiLink Q2061913.
- Q4874475 wikiPageWikiLink Q2293170.
- Q4874475 wikiPageWikiLink Q2431134.
- Q4874475 wikiPageWikiLink Q2539.
- Q4874475 wikiPageWikiLink Q2778212.
- Q4874475 wikiPageWikiLink Q278079.
- Q4874475 wikiPageWikiLink Q278090.
- Q4874475 wikiPageWikiLink Q282453.
- Q4874475 wikiPageWikiLink Q334384.
- Q4874475 wikiPageWikiLink Q3345678.
- Q4874475 wikiPageWikiLink Q45284.
- Q4874475 wikiPageWikiLink Q5639957.
- Q4874475 wikiPageWikiLink Q620622.
- Q4874475 wikiPageWikiLink Q6934509.
- Q4874475 wikiPageWikiLink Q7015116.
- Q4874475 wikiPageWikiLink Q7612002.
- Q4874475 wikiPageWikiLink Q812534.
- Q4874475 comment "In machine learning, kernel methods arise from the assumption of an inner product space or similarity structure on inputs. For some such methods, such as support vector machines (SVMs), the original formulation and its regularization were not Bayesian in nature. It is helpful to understand them from a Bayesian perspective.".
- Q4874475 label "Bayesian interpretation of kernel regularization".
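The abstract above states that the regularization and Bayesian viewpoints "arrive at essentially equivalent estimators." A minimal sketch of that equivalence, not part of the DBpedia data: under the standard identification where the Gaussian-process noise variance equals n times the ridge regularization parameter, the kernel ridge regression estimator and the GP posterior mean are the same linear system. All names, the RBF kernel choice, and the toy data below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (assumed setup, not from the source data):
# kernel ridge regression vs. Gaussian-process posterior mean.

rng = np.random.default_rng(0)

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential (RBF) covariance between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

n = 20
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)
X_test = np.linspace(-3, 3, 50)[:, None]

lam = 0.1          # ridge regularization parameter
sigma2 = n * lam   # matching GP noise variance: sigma^2 = n * lambda

K = rbf_kernel(X, X)
K_star = rbf_kernel(X_test, X)

# Regularization view: kernel ridge estimator f(x) = k(x)^T (K + n*lam*I)^{-1} y.
f_krr = K_star @ np.linalg.solve(K + n * lam * np.eye(n), y)

# Bayesian view: GP posterior mean f(x) = k(x)^T (K + sigma^2*I)^{-1} y.
f_gp = K_star @ np.linalg.solve(K + sigma2 * np.eye(n), y)

print(np.allclose(f_krr, f_gp))  # True: the two estimators coincide
```

With sigma^2 = n * lambda the two predictors solve the identical linear system, which is the finite-dimensional equivalence the abstract refers to.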