Coefficient of agreement in a series of reports for ambulatory electroencephalograms at the clinical neurophysiology department of Clinicas Hospital
DOI: https://doi.org/10.29193/RMU.39.2.3
Keywords: ELECTROENCEPHALOGRAM, CONCORDANCE KAPPA
Abstract
An electroencephalogram (EEG) is a neurophysiological technique that measures electrical activity in the brain for diagnostic purposes in epilepsy and in patients with nonepileptic acute and chronic encephalopathies. This test must be performed by physicians who are specialized in the area and have appropriate, up-to-date, and uniform training, in order to avoid dissimilar conclusions and outdated terminology. We compared a series of ambulatory EEG reports by analyzing the Kappa coefficient of agreement among observers, to learn objectively how EEGs are interpreted and what the agreement rates are at the Neurophysiology Clinic. We believe it is important to understand interobserver similarities and differences so that the problems identified can be corrected and the quality of care improved.
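As background for the agreement analysis described above, the following is a minimal sketch of how Cohen's Kappa is computed for two readers assigning categorical labels to the same set of EEG reports; the report categories and example values are hypothetical and not taken from the study.

```python
# Minimal sketch of Cohen's kappa for two raters (hypothetical data,
# not from the study). Kappa = (p_o - p_e) / (1 - p_e), where p_o is
# the observed agreement and p_e the agreement expected by chance.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of reports labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() & freq_b.keys()) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical EEG report labels assigned by two readers.
reader_1 = ["normal", "epileptiform", "normal", "slowing", "normal"]
reader_2 = ["normal", "epileptiform", "slowing", "slowing", "normal"]
print(round(cohens_kappa(reader_1, reader_2), 2))  # 0.69
```

By the commonly used Landis and Koch scale, values of 0.61-0.80 indicate substantial agreement.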