Coefficient of agreement in a series of reports for ambulatory electroencephalograms at the clinical neurophysiology department of Clinicas Hospital

Authors

  • Carina Mezquita, Universidad de la República, Facultad de Medicina, Hospital de Clínicas, Departamento de Neurofisiología, Licentiate in Clinical Neurophysiology
  • Mariana Legnani, Universidad de la República, Facultad de Medicina, Hospital de Clínicas, Departamento de Neurofisiología Clínica, Assistant in Clinical Neurophysiology. Neurologist
  • Luis Urban, Universidad de la República, Facultad de Medicina, Hospital de Clínicas, Departamento de Neurofisiología Clínica, Assistant in Clinical Neurophysiology. Neurologist
  • Heber Jochen Hackembruch, Universidad de la República, Facultad de Medicina, Hospital de Clínicas, Departamento de Neurofisiología Clínica, Associate Professor of Clinical Neurophysiology. Neurologist

DOI:

https://doi.org/10.29193/RMU.39.2.3

Keywords:

ELECTROENCEPHALOGRAM, CONCORDANCE, KAPPA

Abstract

An electroencephalogram (EEG) is a neurophysiological test that records the electrical activity of the brain; it is used to diagnose epilepsy and to evaluate patients with nonepileptic acute and chronic encephalopathies. The test must be interpreted by physicians specialized in the area who have appropriate, up-to-date, and uniform training, in order to avoid divergent conclusions and outdated terminology. We compared a series of ambulatory EEG reports by calculating the Kappa coefficient of agreement among observers, in order to characterize objectively how EEGs are interpreted and what the agreement rates are at the Clinical Neurophysiology Department. We believe that understanding interobserver similarities and differences is important, as it allows the problems identified to be corrected and the quality of care to be improved.
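
For context, the kappa statistic mentioned above (introduced by Cohen in 1960) compares the agreement actually observed between two readers with the agreement expected by chance alone. A minimal sketch of the formula, using hypothetical numbers for illustration:

    \kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of reports on which the two readers agree and p_e is the proportion of agreement expected by chance. For example, if two readers agreed on 85% of reports (p_o = 0.85) and chance agreement were 50% (p_e = 0.50), then \kappa = (0.85 - 0.50)/(1 - 0.50) = 0.70, which the Landis and Koch benchmarks would classify as substantial agreement.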

Published

2023-05-25

How to Cite

1.
Mezquita C, Legnani M, Urban L, Hackembruch HJ. Coefficient of agreement in a series of reports for ambulatory electroencephalograms at the clinical neurophysiology department of Clinicas Hospital. Rev. Méd. Urug. [Internet]. 2023 May 25 [cited 2024 Oct. 18];39(2):e203. Available from: https://revista.rmu.org.uy/index.php/rmu/article/view/1026