The Effect of Paper Reading Versus Screen Reading on the Inferential Reading Performance among University ESL Learners


Musharraf Aziz
Ahdi Hassan
Omar Aljamili


ESL learners, inferential comprehension, paper reading, reading mode, screen reading


Contemporary ESL education increasingly centres on screen-based reading, particularly for university-level learners, and the COVID-19 outbreak made a sudden shift from paper-based to screen-based reading tasks inevitable. In this context, inferential and advanced-level reading among young ESL learners in developing countries merits investigation, given these learners' generally low digital literacy. This study therefore investigated the effect of the shift in reading mode from paper to screen on the inferential comprehension performance of Pakistani ESL learners at university level. A total of 426 undergraduate learners were sampled from a Bachelor of Science program at a well-known university in Lahore, Pakistan. Inferential reading was conceptualized through Bloom's higher-order thinking skills, namely Analyzing, Evaluating, and Creating, and the reading comprehension test was constructed around these skills. The tests were administered sequentially: the paper-based reading test first, followed by the screen-based test. The obtained data were analyzed using Rendell's XCALIBRE and SPSS v26.0. The logit-scale descriptives, learner ability estimates (θ), and mean scores showed that the learners performed significantly better on the paper-based test than on the screen-based test. Moreover, the effect of the reading mode shift was largest for the Creating skill. The findings have implications for reading mode selection, the enhancement of digital competence, and the management of reading mode shifts in developing countries where IT facilities and digital literacy among ESL learners remain inadequate.
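The core comparison reported above is between the same learners' scores under two reading modes. As a minimal illustrative sketch only (using hypothetical, randomly generated scores, not the study's data, which were analyzed with XCALIBRE and SPSS), a paired comparison of paper versus screen scores can be computed as follows:

```python
import numpy as np

# Hypothetical paired scores for illustration; n matches the abstract's
# reported sample of 426 learners, and the score distributions are assumed.
rng = np.random.default_rng(0)
n = 426
paper = rng.normal(loc=14.0, scale=3.0, size=n)          # assumed paper-test scores
screen = paper - rng.normal(loc=1.0, scale=2.0, size=n)  # assumed lower screen scores

# Paired-samples t statistic: mean of the within-learner differences
# divided by the standard error of that mean.
diff = paper - screen
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))
print(f"mean difference = {diff.mean():.2f}, t = {t_stat:.2f}")
```

Note that the study itself estimated learner ability (θ) on the logit scale via item response theory; this sketch mirrors only the mean-score comparison between modes.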




Alderson, J. C., Figueras, N., Kuijper, H., Nold, G., Takala, S., & Tardieu, C. (2006). Analysing tests of reading and listening in relation to the Common European Framework of Reference: The experience of the Dutch CEFR construct project. Language Assessment Quarterly: An International Journal, 3(1), 3-30.
Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. Addison Wesley Longman.
Biancarosa, G., & Griffiths, G. G. (2012). Technology tools to support reading in the digital age. The Future of Children, 139-160.
Bol, T. (2020). Inequality in homeschooling during the Corona crisis in the Netherlands. First results from the LISS Panel. 
Brookhart, S. M. (2010). How to assess higher-order thinking skills in your classroom. ASCD.
Brookhart, S. M. (2015). How to make decisions with different kinds of student assessment data. ASCD.
Brown, M. (2023). Exploring digital and print text experiences of adolescent readers (Doctoral dissertation, Boise State University). ProQuest.
Chen, G., Cheng, W., Chang, T. W., Zheng, X., & Huang, R. (2014). A comparison of reading comprehension across paper, computer screens, and tablets: Does tablet familiarity matter? Journal of Computers in Education, 1, 213-225.
Clinton, V. (2019). Reading from paper compared to screens: A systematic review and meta‐analysis. Journal of Research in Reading, 42(2), 288-325. 
DeStefano, D., & LeFevre, J. A. (2007). Cognitive load in hypertext reading: A review. Computers in Human Behavior, 23(3), 1616-1641. 
Dolenc, K., Aberšek, B., & Aberšek, M. K. (2015). Online functional literacy, intelligent tutoring systems and science education. Journal of Baltic Science Education, 14(2), 162. 
Douglas, M., Wilson, J., & Ennis, S. (2012). Multiple-choice question tests: a convenient, flexible and effective learning tool? A case study. Innovations in Education and Teaching International, 49(2), 111-121. 
Golan, D. D., Barzillai, M., & Katzir, T. (2018). The effect of presentation mode on children’s reading preferences, performance, and self-evaluations. Computers and Education, 126, 346-358.
IELTS. (2022). The world’s most trusted English test.
Jones, M. Y., Pentecost, R., & Requena, G. (2005). Memory for advertising and information content: Comparing the printed page to the computer screen. Psychology and Marketing, 22(8), 623-648.
Kalyuga, S. (2012). Instructional benefits of spoken words: A review of cognitive load factors. Educational Research Review, 7(2), 145-159. 
Kanwal, F., & Rehman, M. (2017). Factors affecting e-learning adoption in developing countries–empirical evidence from Pakistan’s higher education sector. IEEE Access, 5, 10968-10978. 
Kong, Y., Seo, Y. S., & Zhai, L. (2018). Comparison of reading performance on screen and on paper: A meta-analysis. Computers and Education, 123, 138-149. 
Kyllonen, P. C. (2017). Rethinking how we define and measure 21st century skills. Oxford Scholarship Online.
Livingston, S. A. (2009). Constructed-response test questions: Why we use them; how we score them (R&D Connections No. 11). Educational Testing Service.
Mangen, A., Walgermo, B. R., & Brønnick, K. (2013). Reading linear texts on paper versus computer screen: Effects on reading comprehension. International Journal of Educational Research, 58, 61-68.
Ocal, T., Durgunoglu, A., & Twite, L. (2022). Reading from screen vs reading from paper: Does it really matter? Journal of College Reading and Learning, 52(2), 130-148.
Pace, C., Pettit, S. K., & Barker, K. S. (2020). Best practices in middle level quaranteaching: Strategies, tips and resources amidst COVID-19. Becoming: Journal of the Georgia Association for Middle Level Education, 31(1), 2-13.
Pallant, J. (2020). SPSS survival manual: A step-by-step guide to data analysis using IBM SPSS (7th ed.). Routledge.
Ramirez, R. P. B., & Ganaden, M. S. (2008). Creative activities and students' higher order thinking skills. Education Quarterly, 66(1).
Reise, S. P. (1990). A comparison of item- and person-fit methods of assessing model-data fit in IRT. Applied Psychological Measurement, 14(2), 127-137.
Ronconi, A., Veronesi, V., Mason, L., Manzione, L., Florit, E., Anmarkrud, Ø., & Bråten, I. (2022). Effects of reading medium on the processing, comprehension, and calibration of adolescent readers. Computers and Education, 185, 104520.
Santos, A. I., Ferreira, C. M., Sá, M. J., & Serpa, S. N. F. D. (2019). Reading on paper and scrolling text on a screen in academic learning. Academic Journal of Interdisciplinary Studies, 8(3), 135-143.
Shaw, S. D., & Weir, C. J. (2007). Examining writing: Research and practice in assessing second language writing (Vol. 26). Cambridge University Press.
Støle, H., Mangen, A., & Schwippert, K. (2020). Assessing children’s reading comprehension on paper and screen: A mode-effect study. Computers and Education, 151, 103861.
Wilson, A. D., & Williams, S. (2018, June). Autopager: Exploiting change blindness for gaze-assisted reading. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (pp. 1-5).
Wolf, M. (2020, August 24). Screen-based online learning will change kids' brains. Are we ready for that? The Guardian.
Wyse, A. E., Stickney, E. M., Butz, D., Beckler, A., & Close, C. N. (2020). The potential impact of COVID‐19 on student learning and how schools can respond. Educational Measurement: Issues and Practice, 39(3), 60-64.
Zhi, M., & Huang, B. (2021). Investigating the authenticity of computer-and paper-based ESL writing tests. Assessing Writing, 50, 100548.