Article: Inter-rater reliability of examiners in the Hong Kong College of Radiologists' palliative medicine oral examination

Title: Inter-rater reliability of examiners in the Hong Kong College of Radiologists' palliative medicine oral examination
Authors: Chow, R.; Zhang, L.; Soong, I. S.; Mang, O. W. K.; Lui, L. C. Y.; Wong, K. H.; Siu, S. W. K.; Lo, S. H.; Yuen, K. K.; Yau, Y. S. H.; Wong, K. Y.; Leung, C.; Wong, S. Y.; Ngan, R.; Chow, E.; Yeung, R.
Keywords: Oncologists; Palliative medicine
Issue Date: 2017
Citation: Hong Kong Journal of Radiology, 2017, v. 20, n. 3, p. 232-236
Abstract: © 2017 Hong Kong College of Radiologists. Objective: To analyse the inter-rater reliability of scores in the Palliative Medicine Oral Examination among examiners, among observers, and between examiners and observers. Methods: The Palliative Medicine Subspecialty Board aims to train oncology specialists in palliative medicine through a 4-year accreditation programme. At the end of the programme, trainees undergo a Board Examination involving subjective ratings by examiners. Each candidate rotated through two panels during the 1-day examination; one panel covered the written dissertation and questions on symptom management (viva 1), and the other covered psychosocial issues (viva 2) and ethics (viva 3). A total of 10 candidates were evaluated on these four occasions using a 10-point scale by six examiners and four observers, along with one external examiner. The intraclass correlation coefficient (ICC) was calculated to determine inter-rater reliability (concordance) among examiners, among observers, and between examiners and observers. ICC values are classified as poor (≤0.20), fair (0.21-0.40), moderate (0.41-0.60), good (0.61-0.80), and very good (0.81-1.00). Results: Among examiners, concordance was overall good at different stations. Among observers, concordance was fair to very good across different stations. Between examiners and observers, concordance was fair to moderate at two stations. Across all stations, concordance was good between examiners and observers. Conclusion: Inter-rater reliability was good at the Board Examination administered by the Palliative Medicine Subspecialty Board of the Hong Kong College of Radiologists. The examination is reliable for accrediting practitioners for subspecialty certification.
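The abstract's methodology rests on the intraclass correlation coefficient. As a rough illustration only (the paper does not state which ICC model was used, and a two-way model may well be what the authors computed), the sketch below computes a one-way random-effects ICC(1,1) from a candidates-by-raters score matrix; the function name and the sample scores are hypothetical.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1).

    ratings: (n_targets, k_raters) array, each target scored by k raters.
    Derived from a one-way ANOVA: ICC = (MSB - MSW) / (MSB + (k-1) * MSW).
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    # Between-target mean square (variance of candidate means, scaled by k).
    msb = k * np.sum((row_means - grand) ** 2) / (n - 1)
    # Within-target mean square (disagreement among raters on the same candidate).
    msw = np.sum((x - row_means[:, None]) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical data: 4 candidates, each rated by 3 raters on a 10-point scale.
scores = [[7, 8, 7],
          [5, 5, 6],
          [9, 9, 8],
          [4, 5, 4]]
print(round(icc_oneway(scores), 3))
```

Under the classification used in the abstract, a value above 0.81 would be read as "very good" concordance; values near or below zero indicate that rater disagreement swamps the differences between candidates.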
Persistent Identifier: http://hdl.handle.net/10722/251694
ISSN: 2223-6619
2023 Impact Factor: 0.2
2023 SCImago Journal Rankings: 0.127
ISI Accession Number: WOS:000418601200009


DC FieldValueLanguage
dc.contributor.authorChow, R.-
dc.contributor.authorZhang, L.-
dc.contributor.authorSoong, I. S.-
dc.contributor.authorMang, O. W.K.-
dc.contributor.authorLui, L. C.Y.-
dc.contributor.authorWong, K. H.-
dc.contributor.authorSiu, S. W.K.-
dc.contributor.authorLo, S. H.-
dc.contributor.authorYuen, K. K.-
dc.contributor.authorYau, Y. S.H.-
dc.contributor.authorWong, K. Y.-
dc.contributor.authorLeung, C.-
dc.contributor.authorWong, S. Y.-
dc.contributor.authorNgan, R.-
dc.contributor.authorChow, E.-
dc.contributor.authorYeung, R.-
dc.date.accessioned2018-03-08T05:00:42Z-
dc.date.available2018-03-08T05:00:42Z-
dc.date.issued2017-
dc.identifier.citationHong Kong Journal of Radiology, 2017, v. 20, n. 3, p. 232-236-
dc.identifier.issn2223-6619-
dc.identifier.urihttp://hdl.handle.net/10722/251694-
dc.description.abstract© 2017 Hong Kong College of Radiologists. Objective: To analyse the inter-rater reliability of scores in the Palliative Medicine Oral Examination among examiners, among observers, and between examiners and observers. Methods: The Palliative Medicine Subspecialty Board aims to train oncology specialists in palliative medicine through a 4-year accreditation programme. At the end of the programme, trainees undergo a Board Examination involving subjective ratings by examiners. Each candidate rotated through two panels during the 1-day examination; one panel covered the written dissertation and questions on symptom management (viva 1), and the other covered psychosocial issues (viva 2) and ethics (viva 3). A total of 10 candidates were evaluated on these four occasions using a 10-point scale by six examiners and four observers, along with one external examiner. The intraclass correlation coefficient (ICC) was calculated to determine inter-rater reliability (concordance) among examiners, among observers, and between examiners and observers. ICC values are classified as poor (≤0.20), fair (0.21-0.40), moderate (0.41-0.60), good (0.61-0.80), and very good (0.81-1.00). Results: Among examiners, concordance was overall good at different stations. Among observers, concordance was fair to very good across different stations. Between examiners and observers, concordance was fair to moderate at two stations. Across all stations, concordance was good between examiners and observers. Conclusion: Inter-rater reliability was good at the Board Examination administered by the Palliative Medicine Subspecialty Board of the Hong Kong College of Radiologists. The examination is reliable for accrediting practitioners for subspecialty certification.-
dc.languageeng-
dc.relation.ispartofHong Kong Journal of Radiology-
dc.subjectOncologists-
dc.subjectPalliative medicine-
dc.titleInter-rater reliability of examiners in the Hong Kong College of Radiologists' palliative medicine oral examination-
dc.typeArticle-
dc.description.naturelink_to_subscribed_fulltext-
dc.identifier.doi10.12809/hkjr1716804-
dc.identifier.scopuseid_2-s2.0-85032895133-
dc.identifier.volume20-
dc.identifier.issue3-
dc.identifier.spage232-
dc.identifier.epage236-
dc.identifier.isiWOS:000418601200009-
dc.identifier.issnl2223-6619-