Assessment of inter- and intraobserver agreement for META score in distinguishing osteoporotic from multiple myeloma vertebral fractures.

PURPOSE: To conduct an independent assessment of inter- and intraobserver agreement for the META score as a tool for differentiating osteoporotic vertebral fractures from multiple myeloma vertebral fractures.

METHODS: This is a retrospective observational study. Magnetic resonance imaging analysis was performed by two independent spinal surgeons. In the Subjective assessment, each surgeon established a diagnostic classification for each vertebral fracture based on personal experience: secondary to osteoporosis, categorized as a benign vertebral fracture (BVF), or attributed to multiple myeloma, categorized as a malignant vertebral fracture (MVF). After a 90-day interval, both surgeons repeated the evaluations. In the next step, the observers established a diagnosis of BVF or MVF according to the META score system, and both observers again repeated the evaluations after a 90-day interval. Intra- and interobserver reliability of the Subjective evaluation was assessed with the kappa (κ) statistic; the META score evaluations were additionally compared using the intraclass correlation coefficient (ICC).
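The abstract names the agreement statistics but does not describe how they were computed; the following is a minimal Python sketch of how Cohen's kappa and a two-way ICC are commonly obtained for two raters. The ratings, score values, and library choices (scikit-learn, pingouin) are illustrative assumptions, not the authors' analysis.

# Minimal sketch (illustrative, not the authors' analysis code): agreement
# statistics for two raters using Cohen's kappa and an intraclass correlation.
import pandas as pd
from sklearn.metrics import cohen_kappa_score
import pingouin as pg

# Hypothetical diagnostic classifications (0 = BVF, 1 = MVF) from two observers
surgeon_1 = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]
surgeon_2 = [0, 1, 1, 0, 1, 0, 0, 0, 0, 0]

# Cohen's kappa for the categorical BVF-vs-MVF classification
print(f"Interobserver kappa: {cohen_kappa_score(surgeon_1, surgeon_2):.2f}")

# ICC on hypothetical numeric META scores, arranged in long format
meta_1 = [2, 5, 7, 1, 6, 3, 2, 5, 1, 2]  # rater 1 (score values assumed)
meta_2 = [3, 5, 6, 1, 7, 3, 2, 4, 1, 3]  # rater 2
df = pd.DataFrame({
    "patient": list(range(10)) * 2,
    "rater": ["S1"] * 10 + ["S2"] * 10,
    "score": meta_1 + meta_2,
})
icc = pg.intraclass_corr(data=df, targets="patient", raters="rater", ratings="score")
print(icc[["Type", "ICC"]])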

RESULTS: A total of 220 patients were initially considered for the study; after applying the exclusion criteria, 44 patients were included. Thirty-three patients had BVF and 12 patients had MVF. Interobserver agreement for the Subjective evaluation was slight at both moments (initial and 90-day interval; κ = 0.35 and 0.40, respectively). For the META score evaluation, the kappa test showed moderate interobserver agreement at both moments (κ = 0.54 and 0.48, respectively). The ICC for the META score was 0.680 at the initial evaluation and 0.726 at the 90-day interval, indicating moderate to good agreement. Intraobserver agreement (kappa) for the Subjective evaluation was moderate for both surgeons, whereas intraobserver agreement for the META evaluation was substantial for both surgeons. The intraobserver ICC for the META score indicated almost perfect agreement for both surgeons.
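The qualitative descriptors above (slight, moderate, substantial, almost perfect) are the bands commonly attributed to Landis and Koch; the abstract does not state which interpretation scale was used, so the mapping sketched below is an assumption for illustration only.

def agreement_label(kappa: float) -> str:
    # Conventional Landis & Koch (1977) interpretation bands for kappa;
    # assumed here, since the abstract does not cite its interpretation scale.
    if kappa < 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(agreement_label(0.54))  # "moderate", matching the reported interobserver META kappa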

CONCLUSION: Intra- and interobserver agreement for both surgeons was unsatisfactory. The lack of consistent reproducibility by the same observer argues against routine use of the META score in clinical decision-making when cases of multiple myeloma may potentially be present.
