Intra- and inter-examiner agreement when assessing radiographic implant bone levels: Differences related to brightness, accuracy, participant demographics and implant characteristics.
Clinical Oral Implants Research 2018 July
OBJECTIVES: To evaluate intra- and inter-examiner agreement when assessing radiographic marginal bone levels (MBLs) around Brånemark single implants, and to determine whether agreement was related to radiograph brightness, thread discrimination level (accuracy), participant demographics or implant characteristics.
MATERIALS AND METHODS: Seventy-four participants assessed the MBLs of 100 digital radiographs twice at normal brightness and twice at increased brightness. Cohen's Kappa was used to calculate intra-examiner agreement (with and without increased brightness, both to the same thread and within one thread) and inter-examiner agreement against the group rating, defined by the mode, for the first assessments (again with and without increased brightness, to the same thread and within one thread). Relationships between agreement and thread discrimination level (accuracy), brightness, participant demographics and implant characteristics were explored.
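For reference, Cohen's Kappa corrects raw percentage agreement for the agreement expected by chance, derived from each rater's marginal category frequencies. A minimal sketch of the calculation for two passes over the same radiographs follows; the thread-level ratings in it are hypothetical, not data from the study:

from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    # Cohen's kappa for two raters, or for two passes by the same rater.
    n = len(ratings_a)
    # Observed agreement: proportion of radiographs rated identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of marginal category frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical MBL ratings to the nearest thread (0 = bone at the top thread).
first_pass  = [0, 1, 1, 2, 3, 0, 2, 1, 4, 2]
second_pass = [0, 1, 2, 2, 3, 0, 2, 0, 4, 2]
print(cohens_kappa(first_pass, second_pass))  # ~0.74

An analogous "within one thread" figure would presumably count |a - b| <= 1 as a match before the chance correction; the abstract does not specify the exact implementation, so that detail is an assumption.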
RESULTS: When assessing 100 "Normal" radiographs twice, a participant on average rated 24% of radiographs differently from their own first assessment (poor intra-examiner agreement, median Kappa 0.58, range 0.21-0.82) and 28% differently from other participants (poor inter-examiner agreement, median Kappa 0.53, range 0.05-0.80). Agreement within examiners improved when radiographs were "Bright" (median Kappa 0.58 vs. 0.62, p < 0.001, accuracy to the same thread; median Kappa 0.94 vs. 0.96, p < 0.001, accuracy within one thread). Agreement between examiners was neither better nor worse when radiographs were "Bright" (median Kappa 0.53 vs. 0.55, p = 0.64, accuracy to the same thread; median Kappa 0.93 vs. 0.93, p = 0.23, accuracy within one thread). Both intra- and inter-examiner agreement were lower when accuracy to the same thread was required (p < 0.001 for both). Neither intra- nor inter-examiner agreement was related to age, time since graduation, specialty, viewing device, implant experience, external hex familiarity, peri-implantitis treatment experience, or implant location or width (p-values 0.05-0.999). Intra-examiner agreement increased across dental assistants (n = 11), general dentists (n = 16) and specialists (n = 47) ("Bright" assessments, p = 0.045, median Kappas 0.55, 0.60 and 0.65 respectively), and was higher for females (n = 8) than males (n = 58) ("Normal" assessments, p = 0.019, median Kappa 0.68 vs. 0.55), although female numbers were low.
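As a rough check on how raw disagreement maps onto these Kappa values: Cohen's Kappa is kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is chance agreement. With p_o = 0.76 (24% disagreement) and kappa = 0.58, the implied chance agreement is p_e = (0.76 - 0.58) / (1 - 0.58) ≈ 0.43. This back-calculation is illustrative only; the abstract does not report chance agreement.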
CONCLUSIONS: Agreement within and between examiners when assessing MBLs was poor, with disagreement occurring around 25% of the time, potentially compromising the consistency of disease assessments. No participant or implant characteristic clearly affected agreement. Brighter radiographs improved intra-examiner agreement. Overall, perceived MBL changes below 1 mm are likely due to human, rather than biological, variation.