Matches in DBpedia 2016-04 for { <http://wikidata.dbpedia.org/resource/Q17083041> ?p ?o }
Showing triples 1 to 21 of 21, with 100 triples per page.
- Q17083041 subject Q8367393.
- Q17083041 subject Q8382009.
- Q17083041 abstract "In information theory and machine learning, information gain is a synonym for Kullback–Leibler divergence. However, in the context of decision trees, the term is sometimes used synonymously with mutual information, which is the expected value of the Kullback–Leibler divergence of a conditional probability distribution. In particular, the information gain about a random variable X obtained from an observation that a random variable A takes the value A = a is the Kullback–Leibler divergence D_KL(p(x | a) || p(x | I)) of the prior distribution p(x | I) for x from the posterior distribution p(x | a) for x given a. The expected value of the information gain is the mutual information I(X; A) of X and A, i.e. the reduction in the entropy of X achieved by learning the state of the random variable A. In machine learning, this concept can be used to define a preferred sequence of attributes to investigate so as to most rapidly narrow down the state of X. Such a sequence (which depends at each stage on the outcome of the investigation of previous attributes) is called a decision tree. Usually an attribute with high mutual information should be preferred to other attributes.". (See the sketch after this list.)
- Q17083041 wikiPageWikiLink Q1074648.
- Q17083041 wikiPageWikiLink Q131222.
- Q17083041 wikiPageWikiLink Q16766476.
- Q17083041 wikiPageWikiLink Q1744628.
- Q17083041 wikiPageWikiLink Q200125.
- Q17083041 wikiPageWikiLink Q204570.
- Q17083041 wikiPageWikiLink Q252973.
- Q17083041 wikiPageWikiLink Q2539.
- Q17083041 wikiPageWikiLink Q255166.
- Q17083041 wikiPageWikiLink Q278079.
- Q17083041 wikiPageWikiLink Q278090.
- Q17083041 wikiPageWikiLink Q331309.
- Q17083041 wikiPageWikiLink Q3985153.
- Q17083041 wikiPageWikiLink Q6031086.
- Q17083041 wikiPageWikiLink Q8367393.
- Q17083041 wikiPageWikiLink Q8382009.
- Q17083041 comment "In information theory and machine learning, information gain is a synonym for Kullback–Leibler divergence.".
- Q17083041 label "Information gain in decision trees".
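The abstract above defines the expected information gain as the mutual information I(X; A), i.e. the expected Kullback–Leibler divergence of the posterior p(x | a) from the prior p(x), averaged over the values a of the attribute A. A minimal Python sketch of that computation follows; the toy `play`/`outlook` data and all function names are illustrative assumptions, not part of the DBpedia record.

```python
from collections import Counter
import math

def distribution(labels):
    """Empirical distribution p(x) over a list of discrete labels."""
    total = len(labels)
    return {x: n / total for x, n in Counter(labels).items()}

def entropy(labels):
    """Shannon entropy H(X) in bits."""
    return -sum(p * math.log2(p) for p in distribution(labels).values())

def kl_divergence(p, q):
    """D_KL(p || q) in bits; terms with p(x) = 0 contribute nothing."""
    return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

def information_gain(labels, attribute):
    """Expected information gain I(X; A): the KL divergence of the
    posterior p(x | a) from the prior p(x), averaged over the observed
    values a of the attribute (weighted by their frequency p(a))."""
    prior = distribution(labels)
    total = len(labels)
    groups = {}
    for x, a in zip(labels, attribute):
        groups.setdefault(a, []).append(x)
    return sum(len(g) / total * kl_divergence(distribution(g), prior)
               for g in groups.values())

# Hypothetical toy data: how much does observing 'outlook' narrow down 'play'?
play    = ["yes", "yes", "no", "no", "yes", "yes"]
outlook = ["sun", "sun", "rain", "rain", "sun", "sun"]
print(information_gain(play, outlook))  # I(play; outlook) ≈ 0.459 bits
```

Because the average of D_KL(p(x | a) || p(x)) over a equals H(X) − H(X | A), the same value can be computed directly from entropies; decision-tree learners such as ID3 use exactly this quantity to choose which attribute to split on next.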