AHS 2020 ePoster Session
P006: ASSESSMENT OF ONLINE PATIENT EDUCATION RESOURCES FROM ACADEMIC AND NON-ACADEMIC HERNIA CENTERS ACROSS THE UNITED STATES
Kevin K Seeras1; Robert J Acho, DO1; Konstantinos Spaniolas, MD2; Aurora D Pryor, MD2; Salvatore Docimo, DO2; 1Henry Ford Macomb Hospital; 2Stony Brook University
Background: The American Medical Association (AMA) and the National Institutes of Health (NIH) recommend that patient education materials not exceed the sixth-grade reading level. Several studies across multiple specialties have shown that patient information is written well above this recommended level. The aim of this study was to evaluate the readability of online patient materials provided by hernia surgeons across the United States against the current NIH and AMA recommendations.
Methods: A search was conducted using the Google search engine. The key words “Hernia Center” and “University Hernia Center” were used to identify links to surgical programs within the United States. All programs were categorized into two groups: non-academic hernia centers and academic-affiliated hernia programs. Programs that identified as a “hernia center” on their website with no identifiable university or academic affiliation were designated non-academic hernia centers; academic hernia programs consisted of surgeons who claimed an academic affiliation. Each website’s general description of a hernia was copied into the Readable.io service, and the following readability tests were run: Flesch-Kincaid Grade Level (FKGL), Gunning Fog Index (GFI), Coleman-Liau Index (CLI), and Simple Measure of Gobbledygook (SMOG). Scores from the four readability tests were compared to the recommended readability level (grade level ≤ 6.9) using a one-sample t-test, and academic and non-academic program scores were compared using a two-sample t-test. A p-value of <0.05 was considered statistically significant.
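For context, the Flesch-Kincaid Grade Level named above is a simple formula over word, sentence, and syllable counts. The sketch below is a rough illustration only, not Readable.io's implementation: the vowel-group syllable counter is a naive assumption, and production tools use more sophisticated counting.

```python
import re

def count_syllables(word):
    # Naive approximation: count contiguous vowel groups (minimum 1).
    # Dictionary-based counters are more accurate.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fkgl(text):
    # Standard published Flesch-Kincaid Grade Level coefficients:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)
```

A sixth-grade target corresponds to `fkgl(text) <= 6.9`; dense medical prose scores far higher than short, plain sentences.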
Results: Of 96 websites, zero (0%) met the recommended reading level on all four tests. The mean scores for non-academic centers (n=50) were: FKGL 11.14 ± 2.68, GFI 14.39 ± 3.07, CLI 9.29 ± 2.48, and SMOG 13.38 ± 2.03. The mean scores for academic programs (n=46) were: FKGL 11.7 ± 2.66, GFI 15.01 ± 2.99, CLI 9.34 ± 1.91, and SMOG 13.71 ± 2.02. On one-sample t-tests against the recommended level of 6.9, both groups differed significantly on all four tests (p = 0.001). On two-sample t-tests comparing the two groups, no test reached statistical significance (p > 0.05 for all four).
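The one-sample comparison above reduces to the standard t statistic, t = (x̄ − μ)/(s/√n), against the 6.9 ceiling. A minimal sketch using hypothetical grade-level scores (not the study's data):

```python
from math import sqrt
from statistics import mean, stdev

def one_sample_t(sample, mu):
    # t = (sample mean - mu) / (sample stdev / sqrt(n))
    n = len(sample)
    return (mean(sample) - mu) / (stdev(sample) / sqrt(n))

# Hypothetical FKGL-style scores tested against the 6.9 recommendation:
t = one_sample_t([11.0, 12.0, 13.0, 10.0, 12.0], 6.9)
```

A large positive t (compared against the t distribution with n − 1 degrees of freedom) indicates the group mean sits well above the recommended level, as reported for all four tests.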
Conclusions: The majority of hernia centers, whether academic or non-academic, provide online educational material at a reading level well above that recommended by the NIH and AMA. Our results show that material describing hernia pathology on hernia center websites is far too difficult for the average patient. We propose that a reading level at or below the sixth grade be targeted to improve patient education regarding hernia pathology and repair.