"English-learning students’ scores on a state test designed to measure their mastery of the language fell sharply and have stayed low since 2018 — a drop that bilingual educators say might have less to do with students’ skills and more to do with sweeping design changes and the automated computer scoring system that were introduced that year.

English learners who used to speak to a teacher at their school as part of the Texas English Language Proficiency Assessment System now sit in front of a computer and respond to prompts through a microphone. The Texas Education Agency uses software programmed to recognize and evaluate students’ speech.

Students’ scores dropped after the new test was introduced, a Texas Tribune analysis shows. In the previous four years, about half of all students in grades 4-12 who took the test got the highest score on the test’s speaking portion — the score students needed to be considered fully fluent in English. Since 2018, only about 10% of test takers have gotten the top score in speaking each year."

  • @jordanlund
    12 points · 1 month ago

    I’d think you could fix that by recording the test and flagging scores below a certain level for human review.
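The triage the commenter describes could look something like this minimal Python sketch — the function name, the 0-4 score scale, and the threshold are all assumptions for illustration, not anything TEA actually uses:

```python
# Hypothetical sketch of the commenter's idea: keep every recording,
# accept high automated scores, and flag low ones for a human re-listen.
# The 0-4 scale and threshold of 3 are assumptions, not TEA's real rubric.

def triage(responses, review_threshold=3):
    """Split machine-scored responses into accepted scores and a human-review queue."""
    accepted, needs_review = [], []
    for student_id, machine_score in responses:
        if machine_score < review_threshold:
            # flagged: a human scorer reviews the saved recording
            needs_review.append((student_id, machine_score))
        else:
            accepted.append((student_id, machine_score))
    return accepted, needs_review

accepted, queue = triage([("s1", 4), ("s2", 1), ("s3", 3), ("s4", 0)])
```

With those inputs, `s2` and `s4` land in the review queue while `s1` and `s3` keep their automated scores.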

    • @[email protected]
      2 points · 1 month ago

      FTA: "This year, the TEA said that at least 25% of the TELPAS writing and speaking assessments were re-routed to a human scorer to check the program’s work. That number oscillated between 17% and 23% in the previous six years, according to public records obtained by the Tribune."