Measurement in higher education admission systems and impact evaluations

Authors

  • Tufi Machado Soares Universidade Federal de Juiz de Fora
  • Mariana Calife Nóbrega Soares Fundação Centro de Políticas Públicas e Avaliação da Educação

DOI:

https://doi.org/10.34019/2237-9444.2020.v10.32029

Keywords:

Impact evaluation, Measures, ENEM, Quality, SISU

Abstract

This paper presents a study that reviews the main higher education admission systems around the world. In particular, it reviews the different admission processes, the exams and tests used, and the measurement methods employed. It also addresses the need to evaluate the quality of these tests and their measures, and reviews some of the methods available for doing so. Finally, a contextualized analysis of the ENEM exam, as well as of its use in SISU, is carried out drawing on the findings of the study.




Published

2020-06-30

How to Cite

Soares, T. M., & Soares, M. C. N. (2020). Mensuração em sistemas admissionais no ensino superior e avaliações de impacto. Pesquisa E Debate Em Educação, 10(1), 1190–1223. https://doi.org/10.34019/2237-9444.2020.v10.32029
