Technology-enhanced items in the context of large-scale assessments

Authors

  • Liamara Scortegagna, Universidade Federal de Juiz de Fora, Instituto de Ciências Exatas, Programa de Pós-graduação em Gestão e Avaliação da Educação Pública, Juiz de Fora, Minas Gerais, Brasil http://orcid.org/0000-0001-6825-4945

DOI:

https://doi.org/10.34019/2237-9444.2023.v13.40259

Keywords:

Technology-enhanced items, Digital items, Digital item format, Large-Scale Assessment

Abstract

This article presents a systematic and bibliometric review aimed at identifying and describing the characteristics, formats, and development methods of technology-enhanced items in the context of large-scale assessments. To this end, it details the selection and analysis of texts retrieved from two databases covering the period from 2000 to 2021. The results reveal points of consensus on the influence of technological advances on the transition of assessments from paper to digital and on the emergence of new item formats with distinct characteristics. They also include criticism of organizations that adapt items used in printed assessments to digital assessments without a clear understanding of the differences between the two, under the belief that the digital medium adds value in itself.


Author Biography

Liamara Scortegagna, Universidade Federal de Juiz de Fora, Instituto de Ciências Exatas, Programa de Pós-graduação em Gestão e Avaliação da Educação Pública, Juiz de Fora, Minas Gerais, Brasil

Bachelor's degree in Informatics (UnC). Master's in Computer Science (UFSC). PhD in Production Engineering – Media and Knowledge (UFSC). Associate Professor III in the Department of Computer Science (DCC) of the Universidade Federal de Juiz de Fora (UFJF). Works in the stricto sensu graduate programs in Mathematics Education (PPGEM) and in Management and Evaluation of Public Education (PPGP) at UFJF. Researcher at Fundação CAEd/UFJF, coordinator of the undergraduate teaching degree in Computing (LiCOMP/UFJF), and member of the research groups GRUPAR – Grupo de Pesquisa Aprendizagem em Rede, NIDEEM – Núcleo de Investigação, Divulgação e Estudos em Educação Matemática – Tecnologias de Informação e Comunicação na Educação Matemática, and LApIC – Laboratório de Aplicações e Inovação em Computação. Author of the book Objetos de Aprendizagem, published by CEAD/UFJF.

Currículo Lattes: http://lattes.cnpq.br/9104271477506670

Published

2023-12-08

How to Cite

Scortegagna, L. (2023). Technology-enhanced items in the context of large-scale assessments. Pesquisa e Debate em Educação, 13, p. 1–23, e40259. https://doi.org/10.34019/2237-9444.2023.v13.40259

Issue

Section

Applied research
