# NCME 2022 Training Session

# Data Validation and Analysis in the Era of COVID-19

In this two half-day NCME training session, participants will be introduced to a suite of R-based analyses that can be used to
address numerous educational assessment data analysis and validation issues that arose due to the COVID-19 pandemic. One
consequence of the educational disruption caused by the pandemic has been the cancellation, interruption, and modification of the
educational assessment of students. For example, in spring 2020, just after the pandemic began in the United States, all
state summative testing was cancelled after the United States Department of Education issued assessment waivers to states.
Similarly, as student education took place remotely, interim assessment providers altered their products to allow students
to take tests at home. These and other alterations to standard testing protocols present unique challenges to psychometricians
and data analysts who validate and use these data.

Several practical issues that emerged due to the pandemic will be focal topics of instruction during this training session:

## Academic Impact Analysis

A common use of student assessment data is to determine the academic impact associated with the pandemic. Two complementary
ways of investigating academic impact are to look at change in academic attainment (i.e., status) and change in academic growth.
Due to the disruption of state assessment in spring 2020, many states were tasked with analyzing their state testing data across
a span of two years, and analyses of status and growth across a two-year time span were common. Using approaches developed in
our work with states, we show how skip-year status and growth comparisons can be conducted in order to investigate the academic impact of
the pandemic on students.
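
The skip-year analyses discussed in the session are implemented in the SGP package; the base-R sketch below (simulated data and hypothetical score scales, not the package's actual algorithm) only illustrates the underlying idea: status change is a difference in mean attainment across the two-year span, while skip-year growth conditions each student's 2021 score on their 2019 score rather than on a (missing) 2020 score.

```r
# Illustrative only: simulated scores standing in for real assessment data.
set.seed(2022)
n <- 1000
score_2019 <- rnorm(n, mean = 500, sd = 50)               # pre-pandemic scores
score_2021 <- 0.8 * score_2019 + 100 + rnorm(n, 0, 30)    # scores two years later

# Status change: difference in mean scale scores across the two-year span.
status_change <- mean(score_2021) - mean(score_2019)

# A crude skip-year "growth percentile": rank each student's 2021 score
# among the 50 peers whose 2019 scores were closest to their own.
skip_year_pctl <- sapply(seq_len(n), function(i) {
  peers <- order(abs(score_2019 - score_2019[i]))[1:50]
  round(100 * mean(score_2021[peers] <= score_2021[i]))
})
```

The SGP package replaces the nearest-peers step with quantile regression, but the conditioning logic is the same.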

## Missing Data and Changing Enrollment

Due to the pandemic, numerous states experienced significant declines in student participation in state assessments. Additionally, due
to high student mobility during the pandemic, several states experienced high rates of change in student enrollment.
Questions immediately emerged about what impact missing data and changing enrollment would have on comparability with 2019 results.
Comparisons between 2021 and 2019 are an essential part of any investigation into academic impact; if missing data or changing
enrollment are substantial, those comparisons are threatened.
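
A first screen for this threat is to ask whether the students tested in 2021 resemble the full cohort on observable baselines. The sketch below (simulated data; not a procedure from the session materials) simulates lower-achieving students testing at lower rates and checks how far the tested group drifts from the full cohort on the 2019 baseline:

```r
# Illustrative only: simulated data in which lower-achieving students are
# more likely to be missing from 2021 testing (missing not at random).
set.seed(2022)
n <- 2000
prior_2019 <- rnorm(n, 500, 50)
tested_2021 <- runif(n) < plogis((prior_2019 - 500) / 50)  # higher scorers test more

participation_rate <- mean(tested_2021)

# Representativeness check: do tested students look like the full cohort on
# the 2019 baseline? A large gap here threatens 2021-vs-2019 comparisons.
baseline_gap <- mean(prior_2019[tested_2021]) - mean(prior_2019)
```

Under this simulation the tested group's baseline mean sits above the cohort mean, so naive 2021-vs-2019 comparisons would overstate attainment.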

## Non-Standardized Testing Situations

The pandemic forced states and assessment vendors to relax rigid test administration standards in order to collect achievement data.
Students receiving instruction from home, for example, took interim assessments from home, and in some states hardest hit by the
pandemic, students were administered the state summative assessment at home. Non-standardized testing conditions call into question
the comparability and validity of the resulting scores.
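
One simple comparability screen (an illustration, not a method prescribed by the session) is to compare score distributions across administration settings, e.g., via a standardized mean difference and a variance ratio between home and in-school testing. All numbers below are simulated:

```r
# Illustrative only: simulated scores under two administration settings.
set.seed(2022)
n_school <- 800; n_home <- 400
school <- rnorm(n_school, 500, 50)
home   <- rnorm(n_home, 510, 60)   # hypothetical shift/spread for home testing

# Basic comparability screen: standardized mean difference (Cohen's d with
# a pooled SD) and a variance-ratio check between the two conditions.
pooled_sd <- sqrt(((n_school - 1) * var(school) + (n_home - 1) * var(home)) /
                  (n_school + n_home - 2))
smd <- (mean(home) - mean(school)) / pooled_sd
var_ratio <- var(home) / var(school)
```

Large values on either diagnostic would flag the settings as producing scores that may not be directly comparable.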

## Schedule

### Overview & Background: April 10, 1:00 to 2:00 pm [@dbetebenner](https://github.com/dbetebenner)

The COVID-19 pandemic caused numerous disruptions to student education and alterations to student testing in the United States. In the first hour
we provide attendees with an overview of the training session, followed by an introduction to the analytic approaches that will be investigated
during the training session, and familiarize attendees with some of the major disruptions (e.g., cancelled testing).

[Overview & Background Presentation](https://centerforassessment.github.io/NCME_2022_Training_Session/articles/presentations/Overview_and_Background.html)

### Software Preparations: April 10, 2:00 to 2:30 pm [@dbetebenner](https://github.com/dbetebenner) & [@adamvi](https://github.com/adamvi)

Training session participants will need to have R installed, along with several R packages, in order to follow along with the analyses conducted
as part of the training session.
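
This README does not list the exact packages, so the line below is a hypothetical setup step; the SGP package is implied by the repository topics, and the linked Software Preparations presentation has the authoritative list.

```r
# Hypothetical setup: SGP is implied by the repository topics; any other
# packages required by the session are listed in the linked presentation.
install.packages("SGP")
```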

[Software Preparations Presentation](https://centerforassessment.github.io/NCME_2022_Training_Session/articles/presentations/Software_Preparations.html)

### Break: April 10, 2:30 to 2:45

### Academic Impact (Part 1): April 10, 2:45 to 3:45 pm [@dbetebenner](https://github.com/dbetebenner)

Investigating pandemic-related academic impact on student learning: during the third and fourth hours, participants will be introduced to several ways
to investigate the academic impact students encountered due to the pandemic, including skip-year, baseline-referenced growth analyses.
Using a toy data set that mimics 2020 test cancellations, participants will learn to calculate academic impact and use those results to investigate impact by
demographic subgroups.

[Academic Impact Part 1 Presentation](https://centerforassessment.github.io/NCME_2022_Training_Session/articles/presentations/Academic_Impact_Part_1.html)

### Descriptive Examination of Missing Data Patterns: April 10, 3:45 to 4:45 pm [@ndadey](https://github.com/ndadey)

Descriptive examination of missing data patterns.

[Missing Data (Part 1) Presentation](https://centerforassessment.github.io/NCME_2022_Training_Session/articles/presentations/Nathan_Presentation_1.html)

### Summary and next steps: April 10, 4:45 to 5:00 pm [@dbetebenner](https://github.com/dbetebenner)

Wrap-up question-and-answer session for day 1 and an overview of what will be discussed during day 2.

### Missing Data (Part 2): April 11, 1:00 to 3:00 pm [@adamvi](https://github.com/adamvi)

Multiple Imputation with Missing Data: A substantial issue associated with assessment data from 2021 was whether aggregate results
(e.g., school-level results) could be compared to previous years due to missing data (non-tested students) and changing enrollment. As part of
our work with states, we developed numerous multiple imputation procedures to help understand missing data, as well as propensity score matching
procedures to accommodate changing enrollment. Participants will learn about these procedures and use example data to see how missing data can
interfere with the inferences one makes from assessment data.
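
The toy sketch below (base R, simulated data; not the session's actual procedure, which may use a dedicated package such as mice) shows the core multiple-imputation idea: when low-scoring students go untested, the complete-case mean is biased upward, while imputing missing scores several times from an observed baseline, with residual noise added, and pooling the results recovers a less biased estimate.

```r
# Illustrative only: toy multiple imputation in base R.
set.seed(2022)
n <- 1500
prior   <- rnorm(n, 500, 50)                        # fully observed baseline
current <- 0.8 * prior + 100 + rnorm(n, 0, 30)      # true current-year scores
observed <- runif(n) < plogis((prior - 500) / 50)   # low scorers missing more often
current_obs <- ifelse(observed, current, NA)

# Complete-case mean is biased upward when low scorers are missing.
cc_mean <- mean(current_obs, na.rm = TRUE)

# Impute m times from a regression on the baseline, adding residual noise so
# imputations reflect uncertainty; then pool the completed-data means.
fit <- lm(current_obs ~ prior)
sigma_hat <- summary(fit)$sigma
m <- 20
imputed_means <- replicate(m, {
  filled <- current_obs
  filled[!observed] <- predict(fit, newdata = data.frame(prior = prior[!observed])) +
    rnorm(sum(!observed), 0, sigma_hat)
  mean(filled)
})
mi_mean <- mean(imputed_means)
```

In this simulation the pooled multiple-imputation mean lands closer to the full-cohort truth than the complete-case mean does.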

[Multiple Imputation with Missing Data Presentation](https://centerforassessment.github.io/NCME_2022_Training_Session/articles/presentations/MI_w_Missing_Data.html)

### Break: April 11, 3:00 to 3:15

### Academic Impact (Part 2): April 11, 3:15 to 4:45 pm [@dbetebenner](https://github.com/dbetebenner)

Investigating pandemic-related academic impact on student learning: during the third and fourth hours of day 2, participants will conduct
status- and growth-based academic impact analyses, including methods based upon propensity score matching and Andrew Ho's Fair Trend
method as applied to academic impact.
Using a toy data set that mimics 2020 test cancellations, participants will learn to calculate academic impact and use those results to investigate impact by
demographic subgroups.
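
As a rough, simplified reading of a Fair Trend–style comparison (not Ho's exact method, and all data simulated): fit the pre-pandemic relationship between consecutive testing occasions, use it to project the scores students would have been expected to earn in 2021 absent the pandemic, and compare observed to expected.

```r
# Illustrative only: simulated three-wave data with a pandemic shortfall.
set.seed(2022)
n <- 1200
score_2017 <- rnorm(n, 450, 50)
score_2019 <- 0.8 * score_2017 + 140 + rnorm(n, 0, 30)  # pre-pandemic relationship
score_2021 <- 0.8 * score_2019 + 130 + rnorm(n, 0, 30)  # ~10-point shortfall built in

# Fit the pre-pandemic two-year relationship, then apply it to 2019 scores
# to project the 2021 scores expected in the absence of the pandemic.
pre_fit <- lm(score_2019 ~ score_2017)
expected_2021 <- predict(pre_fit, newdata = data.frame(score_2017 = score_2019))
academic_impact <- mean(score_2021) - mean(expected_2021)
```

A negative `academic_impact` indicates observed 2021 scores fell short of the pre-pandemic trend projection.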

[Academic Impact Part 2 Presentation](https://centerforassessment.github.io/NCME_2022_Training_Session/articles/presentations/Academic_Impact_Part_2.html)

### Wrap-up/Q&A: April 11, 4:45 to 5:00 pm

Wrap-up question-and-answer session for the training session.