GLOSSARY ENTRY (DERIVED FROM QUESTION BELOW)
16:45 Aug 27, 2015
Portuguese to English translations [PRO]
Tech/Engineering - Mathematics & Statistics / Statistics
Selected response from: DLyons (Ireland), Local time: 14:35
Summary of reference entries provided
kappa agreement index / kappa index of agreement
Cohen's kappa coefficient / Kendall's tau coefficient / rank correlation
Explanation: have a look at Wikipedia.
inter-rater agreement statistic
Explanation: "Inter-rater agreement (kappa): creates a classification table, from raw data in the spreadsheet, for two observers and calculates an inter-rater agreement statistic (Kappa) to evaluate the agreement between two classifications on ordinal or nominal scales (Cohen, 1960; Fleiss et al., 2003)."
https://www.medcalc.org/manual/kappa.php
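For anyone who wants to see the arithmetic behind the MedCalc description, here is a minimal Python sketch of Cohen's kappa for two observers. The 2x2 classification table is invented purely for illustration, not taken from the question.

```python
# Minimal sketch: Cohen's (1960) kappa for two raters, computed by hand.
# Rows are rater A's labels, columns are rater B's; counts are made up.
import numpy as np

table = np.array([[20,  5],
                  [10, 15]])

n = table.sum()
po = np.trace(table) / n                              # observed agreement
pe = (table.sum(axis=1) @ table.sum(axis=0)) / n**2   # chance agreement from marginals
kappa = (po - pe) / (1 - pe)
print(f"observed={po:.3f}  expected={pe:.3f}  kappa={kappa:.3f}")
# -> observed=0.700  expected=0.500  kappa=0.400
```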
index of interrater reliability / index of agreement
Explanation: I assume we're talking about Cohen's kappa here. https://en.wikipedia.org/wiki/Cohen's_kappa

Note added at 1 hr (2015-08-27 18:04:39 GMT): The tau here may be that of Goodman and Kruskal. It gives the probabilities of assigning cases correctly to a set of categories, in the light of data on another set of categories.

Note added at 5 hrs (2015-08-27 21:54:46 GMT): The tau coefficient used here is, I think, the one due to Ma and Redmond (and NOT Kruskal or Kendall!). For a definition see http://www.unesco.org/csi/pub/source/rs9.htm This is essentially the same as Naesset's: "Tau is based on the a priori probability (NAESSET, 1996), that is, the expected agreement (Pr) can be obtained even before the error matrix is drawn up. Pr = 1/k, where k is the number of categories or classes." So this would be called the "tau index of agreement", and the kappa is the "index of interrater reliability".
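To make the distinction above concrete, here is a short Python sketch contrasting the two indices as defined in the quoted sources: kappa estimates expected agreement from the table marginals (Cohen, 1960), while the tau of Ma and Redmond fixes the a priori agreement at Pr = 1/k for k classes. The 3-class error matrix is made up for illustration.

```python
# Sketch (assumed standard formulas): Cohen's kappa vs. the Ma-Redmond tau.
import numpy as np

def agreement_indices(table):
    table = np.asarray(table, dtype=float)
    n = table.sum()
    k = table.shape[0]                                   # number of classes
    po = np.trace(table) / n                             # observed agreement
    pe = (table.sum(axis=1) @ table.sum(axis=0)) / n**2  # kappa's chance term
    pr = 1.0 / k                                         # tau's a priori term, Pr = 1/k
    return (po - pe) / (1 - pe), (po - pr) / (1 - pr)

# Invented 3-class error matrix.
kappa, tau = agreement_indices([[30, 2, 3],
                                [4, 25, 1],
                                [2, 3, 30]])
print(f"kappa={kappa:.3f}  tau={tau:.3f}")   # kappa=0.774  tau=0.775
```

Note that when the marginals are close to uniform, pe is close to 1/k and the two indices nearly coincide; the distinction drawn in the thread matters mainly for skewed class distributions.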
6 hrs
Reference: kappa agreement index / kappa index of agreement
Reference information:
http://www.acronymfinder.com/Kappa-Index-of-Agreement-(KIA)....
"What does KIA stand for? KIA stands for Kappa Index of Agreement"
https://books.google.com.br/books?id=j7aawGLbtEoC&pg=PA150&l...
https://books.google.com.br/books?id=6Q74SS7iUvIC&pg=PA148&l...