Índice de concordância

English translation: index of interrater reliability / index of agreement

GLOSSARY ENTRY (DERIVED FROM QUESTION BELOW)
Portuguese term or phrase: Índice de concordância
English translation: index of interrater reliability / index of agreement
Entered by: DLyons

16:45 Aug 27, 2015
Portuguese to English translations [PRO]
Tech/Engineering - Mathematics & Statistics / Statistics
Portuguese term or phrase: Índice de concordância
[Translated from Portuguese:] "Índice de concordância Kappa" and "Índice de concordância Tau". I have found translations such as "Agreement Index" and "Concordance Index", but only on sites I don't trust. Can anyone help me? Thank you.
Bianca August
Brazil
index of interrater reliability / index of agreement
Selected response from:

DLyons
Ireland
Local time: 14:35
Grading comment
Thanks!
4 KudoZ points were awarded for this answer



Summary of answers provided
4 +1  index of interrater reliability / index of agreement
DLyons
3  Cohen's kappa coefficient / Kendall's tau coefficient / rank correlation
Richard Purdom
3  inter-rater agreement statistic
Ligia Costa
Summary of reference entries provided
kappa agreement index / kappa index of agreement
Matheus Chaud

  

Answers


12 mins   confidence: 3/5
Cohen's kappa coefficient/Kendall's tau coefficient/rank correlation


Explanation:
Have a look at Wikipedia.

Richard Purdom
Portugal
Local time: 14:35
Native speaker of: English
PRO pts in category: 8

Peer comments on this answer (and responses from the answerer)
neutral  DLyons: I don't think it's Kendall's tau in this context.
4 hrs

12 mins   confidence: 3/5
inter-rater agreement statistic


Explanation:
"Inter-rater agreement (kappa): creates a classification table, from raw data in the spreadsheet, for two observers and calculates an inter-rater agreement statistic (Kappa) to evaluate the agreement between two classifications on ordinal or nominal scales (Cohen, 1960; Fleiss et al., 2003)."

https://www.medcalc.org/manual/kappa.php

Ligia Costa
Brazil
Local time: 10:35
Native speaker of: Portuguese

34 mins   confidence: 4/5, peer agreement (net): +1
index of interrater reliability / index of agreement


Explanation:
I assume we're talking about Cohen's kappa here.
https://en.wikipedia.org/wiki/Cohen's_kappa
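As background (not part of the original answer): Cohen's kappa corrects the observed agreement p_o for the agreement p_e expected by chance from each rater's marginal label frequencies, kappa = (p_o - p_e) / (1 - p_e). A minimal Python sketch with invented rater data:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters.
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e the agreement expected from the marginal label frequencies."""
    n = len(rater1)
    # Observed agreement: fraction of items where the raters match
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement: product of the two raters' marginal proportions
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in c1.keys() | c2.keys())
    return (p_o - p_e) / (1 - p_e)

# Invented example: two raters labelling 4 items
r1 = ["yes", "yes", "no", "no"]
r2 = ["yes", "yes", "no", "yes"]
print(cohens_kappa(r1, r2))  # 0.5
```

Here p_o = 3/4 and p_e = 1/2, so kappa = 0.5, i.e. agreement halfway between chance and perfect.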

--------------------------------------------------
Note added at 1 hr (2015-08-27 18:04:39 GMT)
--------------------------------------------------

The tau here may be that of Goodman and Kruskal. It gives the probabilities of assigning cases correctly to a set of categories, in the light of data on another set of categories.
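Goodman and Kruskal's tau can be read off a contingency table as the proportional reduction in (Gini) prediction error for the column variable once the row variable is known. A hedged sketch; the function name and the example tables are invented for illustration:

```python
def goodman_kruskal_tau(table):
    """Goodman-Kruskal tau: proportional reduction in prediction error
    for the column category given the row category.
    table[i][j] = count of cases in row category i, column category j."""
    n = sum(sum(row) for row in table)
    col_totals = [sum(col) for col in zip(*table)]
    # Gini variation of the column variable ignoring the rows
    v_total = 1 - sum((c / n) ** 2 for c in col_totals)
    # Expected Gini variation within each row (i.e. given the row category)
    v_within = 0.0
    for row in table:
        r = sum(row)
        if r:
            v_within += (r / n) * (1 - sum((x / r) ** 2 for x in row))
    return (v_total - v_within) / v_total

print(goodman_kruskal_tau([[10, 0], [0, 10]]))  # 1.0 (rows predict columns perfectly)
print(goodman_kruskal_tau([[5, 5], [5, 5]]))    # 0.0 (rows carry no information)
```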

--------------------------------------------------
Note added at 5 hrs (2015-08-27 21:54:46 GMT)
--------------------------------------------------

The tau coefficient used here is, I think, the one due to Ma and Redmond (and NOT Kruskal's or Kendall's!). For a definition, see http://www.unesco.org/csi/pub/source/rs9.htm

This is essentially the same as Naesset's [translated from Portuguese]: "Tau is based on the a priori probability (NAESSET, 1996), i.e., the expected agreement (Pr) can be obtained even before the error matrix is built. Pr = 1/k, where k is the number of categories or classes."

So this would be called the "tau index of agreement" and the kappa is the "index of interrater reliability".
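Under that definition, the tau index differs from kappa only in the chance term: it corrects the overall accuracy p_o by the a priori agreement Pr = 1/k instead of the marginals-based expected agreement. A minimal sketch, assuming a square error (confusion) matrix with invented counts:

```python
def tau_index_of_agreement(error_matrix):
    """Tau index of agreement with a priori chance agreement Pr = 1/k
    (equal prior class probabilities), as in the definition quoted above:
    tau = (p_o - 1/k) / (1 - 1/k).
    error_matrix[i][j] = count of cases of true class i assigned to class j."""
    n = sum(sum(row) for row in error_matrix)
    k = len(error_matrix)
    # Overall accuracy: proportion of cases on the diagonal
    p_o = sum(error_matrix[i][i] for i in range(k)) / n
    p_r = 1 / k  # chance agreement known before building the matrix
    return (p_o - p_r) / (1 - p_r)

# Invented 2-class error matrix: 75 of 100 cases on the diagonal
print(tau_index_of_agreement([[40, 10], [15, 35]]))  # 0.5
```

With p_o = 0.75 and Pr = 1/2, tau = 0.5; unlike kappa, the result does not depend on how the off-diagonal errors are distributed.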

DLyons
Ireland
Local time: 14:35
Specializes in field
Native speaker of: English
PRO pts in category: 44
Grading comment
Thanks!

Peer comments on this answer (and responses from the answerer)
agree  Matheus Chaud: kappa agreement index / kappa index of agreement
6 hrs
  -> Thanks Matheus_RC.




Reference comments


6 hrs
Reference: kappa agreement index / kappa index of agreement

Reference information:
http://www.acronymfinder.com/Kappa-Index-of-Agreement-(KIA)....
"What does KIA stand for?
KIA stands for Kappa Index of Agreement"

https://books.google.com.br/books?id=j7aawGLbtEoC&pg=PA150&l...

https://books.google.com.br/books?id=6Q74SS7iUvIC&pg=PA148&l...

Matheus Chaud
Brazil
Works in field
Native speaker of: Portuguese
PRO pts in category: 51


