Matches in DBpedia 2015-10 for { <http://dbpedia.org/resource/Inter-rater_reliability> ?p ?o }
Showing triples 1 to 85 of 85, with 100 triples per page.
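The listing below can be reproduced against the public DBpedia SPARQL endpoint. The following is a minimal sketch that only builds the request URL for the triple pattern in the header; the endpoint address and the `format` parameter value are assumptions based on DBpedia's Virtuoso frontend, not stated on this page.

```python
from urllib.parse import urlencode

# Assumed public endpoint for DBpedia's SPARQL service.
ENDPOINT = "https://dbpedia.org/sparql"

# The triple pattern from the header: all predicates and objects
# for the Inter-rater_reliability resource.
query = """
SELECT ?p ?o WHERE {
  <http://dbpedia.org/resource/Inter-rater_reliability> ?p ?o
}
LIMIT 100
"""

# Build the GET request URL; fetching it (e.g. with urllib.request)
# would return the result set in SPARQL JSON form.
url = ENDPOINT + "?" + urlencode(
    {"query": query, "format": "application/sparql-results+json"}
)
print(url[:60])
```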
- Inter-rater_reliability abstract "In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings given by judges. It is useful in refining the tools given to human judges, for example by determining if a particular scale is appropriate for measuring a particular variable. If various raters do not agree, either the scale is defective or the raters need to be re-trained. There are a number of statistics which can be used to determine inter-rater reliability. Different statistics are appropriate for different types of measurement. Some options are: joint-probability of agreement, Cohen's kappa and the related Fleiss' kappa, inter-rater correlation, concordance correlation coefficient and intra-class correlation.".
- Inter-rater_reliability wikiPageExternalLink agree.htm.
- Inter-rater_reliability wikiPageExternalLink kappa.
- Inter-rater_reliability wikiPageExternalLink full.
- Inter-rater_reliability wikiPageExternalLink agreestat.html.
- Inter-rater_reliability wikiPageExternalLink chance_agreement_correction.html.
- Inter-rater_reliability wikiPageExternalLink book4.
- Inter-rater_reliability wikiPageExternalLink book_excerpts.html.
- Inter-rater_reliability wikiPageExternalLink bjmsp2008_interrater.pdf.
- Inter-rater_reliability wikiPageExternalLink 9781439810804.
- Inter-rater_reliability wikiPageExternalLink reliability.html.
- Inter-rater_reliability wikiPageExternalLink ira.
- Inter-rater_reliability wikiPageID "7837393".
- Inter-rater_reliability wikiPageLength "15137".
- Inter-rater_reliability wikiPageOutDegree "29".
- Inter-rater_reliability wikiPageRevisionID "680023238".
- Inter-rater_reliability wikiPageWikiLink Bland–Altman_plot.
- Inter-rater_reliability wikiPageWikiLink Category:Inter-rater_reliability.
- Inter-rater_reliability wikiPageWikiLink Category:Statistical_data_types.
- Inter-rater_reliability wikiPageWikiLink Cohens_kappa.
- Inter-rater_reliability wikiPageWikiLink Computational_linguistics.
- Inter-rater_reliability wikiPageWikiLink Concordance_correlation_coefficient.
- Inter-rater_reliability wikiPageWikiLink Consensus.
- Inter-rater_reliability wikiPageWikiLink Consensus_decision-making.
- Inter-rater_reliability wikiPageWikiLink Content_analysis.
- Inter-rater_reliability wikiPageWikiLink Experimenters_bias.
- Inter-rater_reliability wikiPageWikiLink Fleiss_kappa.
- Inter-rater_reliability wikiPageWikiLink Generalizability_theory.
- Inter-rater_reliability wikiPageWikiLink Intra-class_correlation.
- Inter-rater_reliability wikiPageWikiLink Intra-class_correlation_coefficient.
- Inter-rater_reliability wikiPageWikiLink Intraclass_correlation.
- Inter-rater_reliability wikiPageWikiLink Level_of_measurement.
- Inter-rater_reliability wikiPageWikiLink Mean.
- Inter-rater_reliability wikiPageWikiLink Nominal_data.
- Inter-rater_reliability wikiPageWikiLink Observational_studies.
- Inter-rater_reliability wikiPageWikiLink Observational_study.
- Inter-rater_reliability wikiPageWikiLink Observer-expectancy_effect.
- Inter-rater_reliability wikiPageWikiLink Pearson_product-moment_correlation_coefficient.
- Inter-rater_reliability wikiPageWikiLink Psychometrics.
- Inter-rater_reliability wikiPageWikiLink Rasch_model.
- Inter-rater_reliability wikiPageWikiLink Spearmans_rank_correlation_coefficient.
- Inter-rater_reliability wikiPageWikiLink Standard_deviation.
- Inter-rater_reliability wikiPageWikiLink Statistics.
- Inter-rater_reliability wikiPageWikiLink Survey_research.
- Inter-rater_reliability wikiPageWikiLink Wikt:homogeneity.
- Inter-rater_reliability wikiPageWikiLink File:Bland-Altman-Plot.png.
- Inter-rater_reliability wikiPageWikiLinkText "Inter-rater reliability".
- Inter-rater_reliability wikiPageWikiLinkText "Inter-rater reliability#Sources of inter-rater disagreement".
- Inter-rater_reliability wikiPageWikiLinkText "Inter-rater_reliability#Limits_of_agreement".
- Inter-rater_reliability wikiPageWikiLinkText "Reliability analysis".
- Inter-rater_reliability wikiPageWikiLinkText "agreement between clinicians".
- Inter-rater_reliability wikiPageWikiLinkText "concordance".
- Inter-rater_reliability wikiPageWikiLinkText "concordant".
- Inter-rater_reliability wikiPageWikiLinkText "degree of agreement".
- Inter-rater_reliability wikiPageWikiLinkText "inter-judge".
- Inter-rater_reliability wikiPageWikiLinkText "inter-observer reliability".
- Inter-rater_reliability wikiPageWikiLinkText "inter-observer".
- Inter-rater_reliability wikiPageWikiLinkText "inter-rater agreement".
- Inter-rater_reliability wikiPageWikiLinkText "inter-rater reliability".
- Inter-rater_reliability wikiPageWikiLinkText "inter-rater".
- Inter-rater_reliability wikiPageWikiLinkText "interobserver reliability".
- Inter-rater_reliability wikiPageWikiLinkText "reliability of agreement".
- Inter-rater_reliability wikiPageWikiLinkText "reproducibility".
- Inter-rater_reliability wikiPageWikiLinkText "variation among raters".
- Inter-rater_reliability hasPhotoCollection Inter-rater_reliability.
- Inter-rater_reliability wikiPageUsesTemplate Template:Main.
- Inter-rater_reliability wikiPageUsesTemplate Template:Note.
- Inter-rater_reliability wikiPageUsesTemplate Template:Ref.
- Inter-rater_reliability subject Category:Inter-rater_reliability.
- Inter-rater_reliability subject Category:Statistical_data_types.
- Inter-rater_reliability hypernym Degree.
- Inter-rater_reliability type Type.
- Inter-rater_reliability type University.
- Inter-rater_reliability type Thing.
- Inter-rater_reliability comment "In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings given by judges. It is useful in refining the tools given to human judges, for example by determining if a particular scale is appropriate for measuring a particular variable.".
- Inter-rater_reliability label "Inter-rater reliability".
- Inter-rater_reliability sameAs Interrater-Reliabilität.
- Inter-rater_reliability sameAs Adostasun_neurri.
- Inter-rater_reliability sameAs m.026fs2y.
- Inter-rater_reliability sameAs Değerleyici_güvenebilirliği.
- Inter-rater_reliability sameAs Q470749.
- Inter-rater_reliability wasDerivedFrom Inter-rater_reliability?oldid=680023238.
- Inter-rater_reliability isPrimaryTopicOf Inter-rater_reliability.
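The abstract above names Cohen's kappa as one statistic for measuring inter-rater agreement between two raters on nominal data. As a concrete illustration, here is a self-contained sketch of the standard two-rater formula, κ = (p_o − p_e) / (1 − p_e), where p_o is observed agreement and p_e is chance agreement from each rater's marginal label frequencies; the example labels are invented for illustration.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters assigning categorical labels to the same items."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: probability the raters coincide by chance,
    # computed from each rater's marginal label frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical raters labelling six items as "yes"/"no":
a = ["yes", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "no", "yes", "no"]
print(round(cohens_kappa(a, b), 3))  # 5/6 observed, 1/2 by chance -> 0.667
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, which is why the abstract recommends it over the raw joint probability of agreement when chance agreement is substantial.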