Research Article: Spine Imaging and Spine Image-Guided Interventions

Enhancing the Radiologist-Patient Relationship through Improved Communication: A Quantitative Readability Analysis in Spine Radiology

D.R. Hansberry, A.L. Donovan, A.V. Prabhu, N. Agarwal, M. Cox and A.E. Flanders
American Journal of Neuroradiology June 2017, 38 (6) 1252-1256; DOI: https://doi.org/10.3174/ajnr.A5151
From the Department of Radiology (D.R.H., M.C., A.E.F.), Thomas Jefferson University Hospital, Philadelphia, Pennsylvania; Department of Radiation Oncology (A.L.D., A.V.P.), University of Pittsburgh Cancer Institute, Pittsburgh, Pennsylvania; and Department of Neurological Surgery (N.A.), University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania.

Abstract

BACKGROUND AND PURPOSE: More than 75 million Americans have less than adequate health literacy skills according to the National Center for Education Statistics. Readability scores are used as a measure of how well populations read and understand patient education materials. The purpose of this study was to assess the readability of Web sites dedicated to patient education for radiologic spine imaging and interventions.

MATERIALS AND METHODS: Eleven search terms relevant to radiologic spine imaging were entered into a public Internet search engine, and the top 10 patient-centered education links for each term were collected and analyzed with 10 well-validated quantitative readability assessments. The search terms included the following: x-ray spine, CT spine, MR imaging spine, lumbar puncture, kyphoplasty, vertebroplasty, discogram, myelogram, cervical spine, thoracic spine, and lumbar spine.

RESULTS: Collectively, the 110 articles were written at an 11.3 grade level (grade range, 7.1–16.9). None of the articles were written at the American Medical Association and National Institutes of Health recommended 3rd-to-7th grade reading levels. The vertebroplasty articles were written at a statistically significant (P < .05) more advanced level than the articles for x-ray spine, CT spine, and MR imaging spine.

CONCLUSIONS: Increasing use of the Internet to obtain health information has made it imperative that on-line patient education be written for easy comprehension by the average American. However, given the discordance between readability scores of the articles and the American Medical Association and National Institutes of Health recommended guidelines, it is likely that many patients do not fully benefit from these resources.

ABBREVIATIONS:

AMA = American Medical Association; FRE = Flesch Reading Ease; GFI = Gunning Fog Index; NIH = National Institutes of Health

As barriers to on-line access have decreased, the Internet has emerged as a primary resource for Americans desiring greater understanding of their health. According to a June 2015 report by the Pew Research Center,1 up to 84% of adults access the Internet, and within the past year, 72% of those users have searched for health information.2 Specifically, 55% wanted to learn more about a disease or medical problem; and 43%, about a medical treatment or procedure.2 Studies have confirmed that this on-line research impacts decision-making for many patients: the questions they ask, the types of treatment they pursue, and whether they visit a physician.2–5

Although more adults are accessing health care information on-line than ever before,2,4 it is uncertain how much of this information is fully comprehended due to poor health literacy. Health literacy, as defined by the US Department of Health and Human Services, is "the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions."6 In a 2003 assessment commissioned by the US Department of Education, only 12% of adults were found to have proficient health literacy. Proficiency was defined as having the skills necessary to locate, understand, and use information contained within documents commonly encountered in the medical system, such as medication dosing instructions, preventative care documentation, and insurance information. This definition indicates an ability to read, analyze, and synthesize complex content. More than 75 million Americans demonstrated either basic or below basic health literacy and would experience difficulty reading and comprehending health care–related text.7 The importance of health literacy cannot be overstated because it has a direct influence on both health outcomes and health care expenditures. Studies have linked low health literacy to increased hospitalizations,8,9 higher mortality rates,8,10 and an annual cost to the US economy of up to $238 billion.11 In fact, the American Medical Association (AMA) has identified low health literacy as a strong independent predictor of health status.12

Readability, defined as the degree of ease with which a given text can be read and comprehended, is 1 correlative measure of health literacy.13 The reading level of the average American is between the 7th and 8th grade, while the average Medicaid enrollee reads at just a 5th grade level.12 Therefore, to maximize the number of individuals benefiting from patient education, the AMA and the National Institutes of Health (NIH) recommend that content be written at the 3rd-to-7th grade level.12,14 However, patient education materials across numerous specialties in medicine do not meet this recommendation. A 2013 readability study published in JAMA Internal Medicine analyzed material from 16 different medical specialties and determined that it was too complex for the average patient.15 Similar conclusions have been drawn regarding the surgical subspecialties.16

Readability analyses specific to spine-related patient education have also revealed a failure to meet reading level guidelines.17–20 However, research to date has examined only surgical procedures and material sourced from professional society Web sites. Three of the 4 studies were also limited by an analysis that incorporated just 1 readability assessment. The purpose of this study was to quantitatively determine the readability of patient education Web sites pertaining to radiologic diagnostic tests and interventions of the spine. We used 10 readability assessments that are well-vetted in the literature to avoid bias from any single test. This analysis does not include patient education materials related to imaging of the brain.

Materials and Methods

This study examined publicly available data; thus, institutional review board oversight was not required. In December 2015, Web sites dedicated to patient education relevant to spine imaging were sought on the public Internet by using the Google search engine. Eleven keywords were separately entered as search terms: x-ray spine, CT spine, MR imaging spine, lumbar puncture, kyphoplasty, vertebroplasty, discogram, myelogram, cervical spine, thoracic spine, and lumbar spine. The first 10 articles intended for patients for each term were included in the analysis. Web sites not specifically directed toward patients were excluded. The text of 110 articles was copied, pasted, and saved as individual Microsoft Word (Microsoft, Redmond, Washington) documents. Images, figures, tables, references, and other noneducational text were removed.

Each document was then analyzed, and a readability analysis was performed with Readability Studio Professional Edition (Oleander Software, Vandalia, Ohio). An individual readability score was calculated for each of the 10 following well-validated assessments (Table): the Coleman-Liau Index,21 Flesch Reading Ease (FRE),22 Flesch-Kincaid Grade Level,23 FORCAST,24 Fry Graph,25 Gunning Fog Index (GFI),26 New Dale-Chall,27 New Fog Count,23 Raygor Readability Estimate,28 and SMOG.29 The FRE reports scores on a 0–100 scale with lower numbers corresponding to more difficult-to-read text. The remaining 9 scales report the readability of the text as a grade level. For instance, a GFI score of 9.0 corresponds to a 9th grade reading level.
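The two Flesch measures cited above are defined by published formulas and can be computed directly. The sketch below is illustrative only, not the Readability Studio implementation: it uses a crude vowel-group syllable counter, where commercial tools rely on pronunciation dictionaries and exception rules.

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of consecutive vowels, then subtract
    # a trailing silent "e". Real readability tools use far better rules.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_scores(text):
    # Flesch Reading Ease (0-100, lower = harder to read) and
    # Flesch-Kincaid Grade Level, from their published formulas.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # mean words per sentence
    spw = syllables / len(words)        # mean syllables per word
    fre = 206.835 - 1.015 * wps - 84.6 * spw
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    return fre, fkgl
```

A short, monosyllabic sentence such as "The cat sat on the mat." scores at the easy end of both scales, while a long clinical sentence drives the grade level sharply upward.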

Table: Formulas for the readability assessments

Statistical analysis was conducted by using OriginPro (OriginLab, Northampton, Massachusetts) to compare readability scores among the 11 keywords. A 1-way ANOVA and a Tukey Honestly Significant Difference post hoc analysis were performed, with significance defined as P < .05.
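The group comparison can be illustrated from first principles. Below is a minimal sketch of the 1-way ANOVA F statistic; the study itself used OriginPro, the grade-level data here are invented for illustration, and the Tukey post hoc step would additionally require the studentized range distribution.

```python
def one_way_anova_f(groups):
    # F = (between-group mean square) / (within-group mean square),
    # with df = (k - 1, n - k) for k groups and n total observations.
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical per-article grade levels for 2 of the 11 keywords
xray_spine     = [9.0, 9.4, 9.8, 9.2, 9.6]
vertebroplasty = [13.0, 13.5, 13.8, 13.2, 13.5]
f_stat = one_way_anova_f([xray_spine, vertebroplasty])
```

With groups this clearly separated, F is large; in practice the statistic is then compared against the F(k − 1, n − k) distribution to obtain a P value.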

Results

Collectively, the 110 articles had a mean FRE score of 51.9, classifying them as fairly difficult on the FRE scale, and a mean grade level of 11.3 averaged across the other 9 grade-based assessments (Fig 1). FRE scores ranged from 74 (fairly easy) to 14 (very difficult), and grade levels ranged from 7.1 to 16.9. None of the articles (0/110) met the AMA and NIH recommendation of writing within the 3rd-to-7th grade levels. Approximately 35% (39/110) were written at a level that required a high school education or higher (grade level ≥12), and an additional 50 articles scored between the 9th and 12th grade levels (On-line Table).

Fig 1.

The mean grade level across all readability scales examined in this study for the top 10 search results for each key term. The red box represents the AMA and NIH recommended 3rd-to-7th grade range.

The articles consisted of many words characterized as complex, long, or unfamiliar. Words with at least 3 syllables were considered complex and composed 16.1% of the text of the articles, while words with at least 6 characters were considered long and composed 33.7%. More than 28% of words were classified as unfamiliar, as determined by an absence from the Dale-Chall list of simple words, which contains 3000 words known by most 4th grade children.27 In addition, unfamiliar words made up at least one-third of the text for 19 of the 110 (17.3%) articles. Sentences ranged from 23 to 127 words.
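The word-level metrics in the paragraph above are simple counts and can be reproduced as follows. The syllable heuristic and the tiny familiar-word set are illustrative stand-ins for Readability Studio's parsing and the full 3000-word Dale-Chall list.

```python
import re

def word_complexity_profile(text, familiar_words):
    # Percentages of complex (>=3 syllables), long (>=6 characters),
    # and unfamiliar (absent from the supplied familiar-word list) words.
    words = re.findall(r"[A-Za-z]+", text.lower())
    def syllables(w):
        # Vowel-group heuristic; real tools use dictionaries.
        return max(len(re.findall(r"[aeiouy]+", w)), 1)
    n = len(words)
    return {
        "complex_pct": 100 * sum(syllables(w) >= 3 for w in words) / n,
        "long_pct": 100 * sum(len(w) >= 6 for w in words) / n,
        "unfamiliar_pct": 100 * sum(w not in familiar_words for w in words) / n,
    }

profile = word_complexity_profile("Vertebroplasty is a procedure",
                                  familiar_words={"is", "a"})
```

For this 4-word example, "vertebroplasty" and "procedure" are simultaneously complex, long, and unfamiliar, so each percentage is 50.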

The 1-way ANOVA found a statistically significant difference among the 11 keywords (F(10,99) = 3.19, P = .001). Average grade levels for each searched term were as follows: x-ray spine, 9.4; CT spine, 9.6; MR imaging spine, 10.2; discogram, 10.7; myelogram, 11.0; cervical spine, 11.2; thoracic spine, 11.8; lumbar spine, 11.8; lumbar puncture, 12.0; kyphoplasty, 12.4; and vertebroplasty, 13.4. Tukey Honestly Significant Difference post hoc analysis indicated that the vertebroplasty articles were significantly more advanced than the articles for x-ray spine, CT spine, and MR imaging spine (P < .05).

Discussion

Due to the inherently complex nature of spine diagnoses and treatments, patients are apt to seek more information on the Internet. Up to 77% of individuals begin this process with a search engine such as Google.2 More than 90% do not look beyond the first page of results.30 Consequently, patients wishing to learn more about radiologic spine imaging and interventions would likely encounter 1 of the 110 articles in this study when searching for these 11 terms. With a mean readability score of 11.3, these articles would be too complex for the average American who reads at a 7th-to-8th grade level. In addition, the abundance of uncommon words and long sentences would make understanding difficult for those classified as having less than proficient health literacy, which indicates an inability to read and synthesize complex health care–related text. Therefore, 62% of the adult population identified by the US Department of Education as having either basic or below basic health literacy would not fully benefit from this information and may be led to uninformed decisions that negatively affect health outcomes.7

If on-line patient education resources were written at a 7th grade reading level or lower, more Americans would be able to read the material and understand it more thoroughly. Consequently, patients would likely experience increased involvement in their care and improved communication with their physicians. When empowered with knowledge, patients have been shown to ask more questions, communicate concerns with greater confidence, and actively engage in the medical decision-making process.31–33 Patients have also reported greater satisfaction, particularly with informed consent.34 In radiology, health literacy has been linked to differing rates of imaging use35 and patient knowledge of procedure details and radiation use.36 Complex examinations and interventions, including those of the spine, stand to benefit from the active patient engagement and enhanced patient-provider communication resulting from well-written education materials.

The results of this study are consistent with prior research investigating the readability of on-line patient education. Web sites for both medical and surgical subspecialties are routinely written at a level exceeding the 7th grade.37–40 Those dedicated to radiology, including radiologyinfo.org sponsored by the American College of Radiology and Radiological Society of North America, are written at a level too advanced for most patients.41 In addition, patient education materials from professional society Web sites, Wikipedia, WebMD, and hospital Web sites have all been written above the average comprehension level.42–45 This study, strengthened by the incorporation of text sourced from multiple Web site types and the use of 10 readability assessments, adds additional support to the conclusions drawn by prior spine imaging readability research. Collectively, these results highlight the need for further action to satisfy AMA and NIH readability recommendations. Authors and editors should use simpler words, construct shorter sentences, reduce abbreviations and acronyms, and eliminate medical jargon.14 Resources from the NIH,14 Centers for Disease Control and Prevention,46 and Centers for Medicare and Medicaid Services47 are available to offer further guidance.

This study is limited by the constraints of the readability assessments. Most important, the algorithms rely on quantitative parameters, such as the number of letters, syllables, words, and sentences in the text, which may yield inaccurate scores for medical terminology. For instance, words with few syllables that are not necessarily familiar to the average person may lead to inappropriately low scores, while multisyllabic common words would be scored at a higher grade level. The FORCAST formula, which is based solely on the number of single-syllable words, is particularly susceptible to this bias. For example, "pia" would receive a lower rating than "operation," despite being an uncommon term. The other assessments that use syllable counts, including the FRE, Flesch-Kincaid Grade Level, Fry Graph, GFI, and SMOG, may be affected to a somewhat lesser extent because they incorporate additional variables. In this study, the incorporation of 10 readability assessments reduces the bias of any single algorithm. An additional limitation is that none of the assessments evaluated the nontextual elements of readability, such as style, format, and organization,13 or the use of supplemental material, such as images or diagrams. Further work is needed to determine the effect of these elements on the comprehension of patient education materials, specifically in radiology. Conducting readability and comprehension tests with target prospective patient populations may also be revealing.
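The FORCAST bias described above is easy to demonstrate: the formula is grade = 20 − N/10, where N is the count of single-syllable words in a 150-word sample. A sketch follows, using the same crude vowel-group syllable heuristic, which, like FORCAST itself, treats "pia" as a single syllable.

```python
import re

def forcast_grade(words):
    # FORCAST: grade = 20 - N/10, with N the number of one-syllable
    # words in a 150-word sample; other sample sizes are scaled up.
    def is_monosyllabic(w):
        # Vowel-group heuristic; "pia" counts as one syllable here.
        return len(re.findall(r"[aeiouy]+", w.lower())) <= 1
    n_mono = sum(is_monosyllabic(w) for w in words)
    return 20 - (n_mono * 150 / len(words)) / 10
```

A sample consisting entirely of "pia" scores grade 5.0 (the formula's minimum), while one consisting of "operation" scores grade 20.0, even though "pia" is by far the less familiar word.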

Conclusions

With increasing use of the Internet for patient self-education, there is a growing need for the readability of on-line material to fall within the limits of the average American's comprehension. However, this level is far exceeded in many disciplines of medicine, and spine imaging and radiologic interventions have not been an exception. It is imperative to broaden awareness of this discrepancy to mitigate the negative outcomes of poor health literacy. By adhering to the AMA and NIH guidelines, physicians, professional societies, and other authors can increase patient comprehension of on-line health care materials.

References

  1. Perrin A, Duggan M. Americans' Internet Access: 2000–15. Pew Research Center; 2015. http://www.pewinternet.org/2015/06/26/americans-internet-access-2000-2015/. Accessed August 15, 2016.
  2. Fox S, Duggan M. Health Online 2013. Pew Research Center; 2013. http://www.pewinternet.org/2013/01/15/health-online-2013/. Accessed August 15, 2016.
  3. Pourmand A, Sikka N. Online health information impacts patients' decisions to seek emergency department care. West J Emerg Med 2011;12:174–77
  4. Rice RE. Influences, usage, and outcomes of Internet health information searching: multivariate results from the Pew surveys. Int J Med Inform 2006;75:8–28
  5. Rainie L, Fox S. The Online Health Care Revolution. Pew Research Center; 2000. http://www.pewinternet.org/2000/11/26/the-online-health-care-revolution/. Accessed August 15, 2016.
  6. Healthy People 2010. US Department of Health and Human Services; 2000. http://www.healthypeople.gov/2010/. Accessed August 15, 2016.
  7. National Assessment of Adult Literacy: A Nationally Representative and Continuing Assessment of English Language Literacy Skills of American Adults. National Center for Education Statistics; 2003. https://nces.ed.gov/naal/fr_skills.asp. Accessed August 15, 2016.
  8. Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med 2011;155:97–107
  9. Baker DW, Gazmararian JA, Williams MV, et al. Functional health literacy and the risk of hospital admission among Medicare managed care enrollees. Am J Public Health 2002;92:1278–83
  10. Baker DW, Wolf MS, Feinglass J, et al. Health literacy and mortality among elderly persons. Arch Intern Med 2007;167:1503–09
  11. Vernon JA, Trujillo A, Rosenbaum SJ, et al. Low Health Literacy: Implications for National Health Policy. Washington, DC: Department of Health Policy, School of Public Health and Health Services, The George Washington University; 2007
  12. Weiss BD, Schwartzberg JG; American Medical Association. Health Literacy and Patient Safety: Help Patients Understand: Manual for Clinicians. Chicago: AMA Foundation; 2007
  13. DuBay WH. The Principles of Readability. Costa Mesa: Impact Information; 2004
  14. How to Write Easy-to-Read Health Materials. MedlinePlus. https://medlineplus.gov/etr.html. Accessed August 15, 2016.
  15. Agarwal N, Hansberry DR, Sabourin V, et al. A comparative analysis of the quality of patient education materials from medical specialties. JAMA Intern Med 2013;173:1257–59
  16. Hansberry DR, Agarwal N, Shah R, et al. Analysis of the readability of patient education materials from surgical subspecialties. Laryngoscope 2014;124:405–12
  17. Eltorai AE, Cheatham M, Naqvi SS, et al. Is the readability of spine-related patient education material improving? An assessment of subspecialty websites. Spine (Phila Pa 1976) 2016;41:1041–48
  18. Ryu JH, Yi PH. Readability of spine-related patient education materials from leading orthopedic academic centers. Spine (Phila Pa 1976) 2016;41:E561–65
  19. Agarwal N, Feghhi DP, Gupta R, et al. A comparative analysis of minimally invasive and open spine surgery patient education resources. J Neurosurg Spine 2014;21:468–74
  20. Vives M, Young L, Sabharwal S. Readability of spine-related patient education materials from subspecialty organization and spine practitioner websites. Spine (Phila Pa 1976) 2009;34:2826–31
  21. Coleman ML, Liau TL. A computer readability formula designed for machine scoring. J Appl Psychol 1975;60:283–84
  22. Flesch R. A new readability yardstick. J Appl Psychol 1948;32:221–33
  23. Kincaid JP, Fishburne RP Jr, Rogers RL. Derivation of New Readability Formulas (Automated Readability Index, Fog Count and Flesch Reading Ease Formula) for Navy Enlisted Personnel. Millington: National Technical Information Service; 1975
  24. Caylor JS, Sticht TG, Fox LC, et al. Methodologies for Determining Reading Requirements of Military Occupational Specialties: Technical Report. Alexandria: Human Resources Research Organization; 1973
  25. Fry E. A readability formula that saves time. Journal of Reading 1968;11:513–78
  26. Gunning R. The Technique of Clear Writing. New York: McGraw-Hill; 1952
  27. Chall JS, Dale E. Readability Revisited: The New Dale-Chall Readability Formula. Northampton: Brookline Books; 1995
  28. Raygor AL. The Raygor readability estimate: a quick and easy way to determine difficulty. In: Pearson PD, Hansen J, eds. Reading: Theory, Research, and Practice: Twenty-Sixth Yearbook of the National Reading Conference. Clemson: National Reading Conference; 1977
  29. McLaughlin GH. SMOG grading: a new readability formula. Journal of Reading 1969;12:639–46
  30. Chitika. The Value of Google Result Positioning. June 7, 2013. https://chitika.com/google-positioning-value. Accessed August 15, 2016.
  31. Lee CJ, Gray SW, Lewis N. Internet use leads cancer patients to be active health care consumers. Patient Educ Couns 2010;81(suppl):S63–69
  32. Iverson SA, Howard KB, Penney BK. Impact of internet use on health-related behaviors and the patient-physician relationship: a survey-based study and review. J Am Osteopath Assoc 2008;108:699–711
  33. Hironaka LK, Paasche-Orlow MK. The implications of health literacy on patient-provider communication. Arch Dis Child 2008;93:428–32
  34. Fraval A, Chandrananth J, Chong YM, et al. Internet based patient education improves informed consent for elective orthopaedic surgery: a randomized controlled trial. BMC Musculoskelet Disord 2015;16:14
  35. Morrison AK, Brousseau DC, Brazauskas R, et al. Health literacy affects likelihood of radiology testing in the pediatric emergency department. J Pediatr 2015;166:1037–41.e1
  36. Gebhard RD, Goske MJ, Salisbury SR, et al. Improving health literacy: use of an informational brochure improves parents' understanding of their child's fluoroscopic examination. AJR Am J Roentgenol 2015;204:W95–W103
  37. Prabhu AV, Hansberry DR, Agarwal N, et al. Radiation oncology and online patient education materials: deviating from NIH and AMA recommendations. Int J Radiat Oncol Biol Phys 2016;96:521–28
  38. Prabhu AV, Gupta R, Kim C, et al. Patient education materials in dermatology: addressing the health literacy needs of patients. JAMA Dermatol 2016;152:946–47
  39. Agarwal N, Chaudhari A, Hansberry DR, et al. A comparative analysis of neurosurgical online education materials to assess patient comprehension. J Clin Neurosci 2013;20:1357–61
  40. Kasabwala K, Misra P, Hansberry DR, et al. Readability assessment of the American Rhinologic Society patient education materials. Int Forum Allergy Rhinol 2013;3:325–33
  41. Hansberry DR, John A, John E, et al. A critical review of the readability of online patient education resources from RadiologyInfo.org. AJR Am J Roentgenol 2014;202:566–75
  42. Hansberry DR, Agarwal N, Baker SR. Health literacy and online educational resources: an opportunity to educate patients. AJR Am J Roentgenol 2015;204:111–16
  43. Hansberry DR, Ramchand T, Patel S, et al. Are we failing to communicate? Internet-based patient education materials and radiation safety. Eur J Radiol 2014;83:1698–702
  44. Hansberry DR, Kraus C, Agarwal N, et al. Health literacy in vascular and interventional radiology: a comparative analysis of online patient education resources. Cardiovasc Intervent Radiol 2014;37:1034–40
  45. Hansberry DR, Agarwal N, Gonzales SF, et al. Are we effectively informing patients? A quantitative analysis of on-line patient education resources from the American Society of Neuroradiology. AJNR Am J Neuroradiol 2014;35:1270–75
  46. Centers for Disease Control and Prevention. Health Literacy Guidance and Standards. 2015. https://health.gov/communication/literacy/quickguide/quickguide.pdf. Accessed August 15, 2016.
  47. Toolkit for Making Written Material Clear and Effective. Centers for Medicare and Medicaid Services. https://www.cms.gov/Outreach-and-Education/Outreach/WrittenMaterialsToolkit/index.html. Accessed August 15, 2016.
  • Received October 7, 2016.
  • Accepted after revision January 25, 2017.
  • © 2017 by American Journal of Neuroradiology
View Abstract
PreviousNext
Back to top

In this issue

American Journal of Neuroradiology: 38 (6)
American Journal of Neuroradiology
Vol. 38, Issue 6
1 Jun 2017
  • Table of Contents
  • Index by author
  • Complete Issue (PDF)
Cite this article
D.R. Hansberry, A.L. Donovan, A.V. Prabhu, N. Agarwal, M. Cox, A.E. Flanders
Enhancing the Radiologist-Patient Relationship through Improved Communication: A Quantitative Readability Analysis in Spine Radiology
American Journal of Neuroradiology Jun 2017, 38 (6) 1252-1256; DOI: 10.3174/ajnr.A5151

Cited By

This article has been cited by the following articles in journals participating in Crossref Cited-by Linking (6 citing articles indexed by Crossref):

  • Patient Education in Neurosurgery: Part 1 of a Systematic Review
    Nathan A. Shlobin, Jeffrey R. Clark, Steven C. Hoffman, Benjamin S. Hopkins, Kartik Kesavabhotla, Nader S. Dahdaleh
    World Neurosurgery 2021 147
  • Readability of Patient Education Materials From RadiologyInfo.org: Has There Been Progress Over the Past 5 Years?
    Matthew Bange, Eric Huh, Sherwin A. Novin, Ferdinand K. Hui, Paul H. Yi
    American Journal of Roentgenology 2019 213 4
  • Lung Cancer Screening Guidelines: How Readable Are Internet-Based Patient Education Resources?
    David Richard Hansberry, Michael D. White, Michael D'Angelo, Arpan V. Prabhu, Sarah Kamel, Paras Lakhani, Baskaran Sundaram
    American Journal of Roentgenology 2018 211 1
  • Digital Footprint of Neurological Surgeons
    Christopher Kim, Raghav Gupta, Aakash Shah, Evan Madill, Arpan V. Prabhu, Nitin Agarwal
    World Neurosurgery 2018 113
  • Quantitative analysis of the level of readability of online emergency radiology-based patient education resources
    David R. Hansberry, Michael D’Angelo, Michael D. White, Arpan V. Prabhu, Mougnyan Cox, Nitin Agarwal, Sandeep Deshmukh
    Emergency Radiology 2018 25 2
  • A printed information leaflet about MRI and radiologists improves neuroradiology patient health literacy
    Daniel Thomas Ginat, Gregory Christoforidis
    The Neuroradiology Journal 2018 31 6
