Maintained for Historical Purposes

This resource is being maintained for historical purposes only and is not currently applicable.


Posted Date: September 20, 2010

Author: Jana Hernandes, Service Director, Operations, Federal Student Aid

Subject: Release of 2009-2010 Quality Assurance Program Data Analysis Report

We are pleased to announce the availability of the report, "Analysis of Quality Assurance Program Data: 2009-10." This report contains findings from 142 of the 143 schools participating in the Quality Assurance (QA) Program during the 2009-2010 Award Year. Despite being relatively few in number, QA Program schools disburse 12 percent of all Federal Pell Grant (Pell Grant) funds.

This report analyzes data from 148,290 students whom schools participating in the QA Program selected for verification. It examines the corrections that schools and students made during the verification process to the information students supplied on their initial Free Application for Federal Student Aid (FAFSA). The report also analyzes responses QA schools provided to an online survey, which asked schools to rate their satisfaction with the Institutional Student Information Record (ISIR) Analysis Tool and to describe how they selected students for verification. Our key findings are summarized as follows:

  • QA school verification efforts prevented overpayments in the Pell Grant program equal to ten percent of the Pell awards that would have been made based on the information supplied on the initial FAFSA.

  • Verification efforts also prevented underpayments equal to six percent of the total initial Pell volume.

  • The majority (58 percent) of applicants selected for QA school verification experienced neither a change in Pell Grant award nor a change in Expected Family Contribution (EFC) of at least 400.

  • Nearly 90 percent of applicants with an automatic zero EFC who were selected for QA verification experienced neither a change in Pell Grant award nor a change in EFC of at least 400.

  • Ninety percent of the schools indicated that they found the ISIR Analysis Tool somewhat or very useful.

  • We were unable to detect strong relationships between the efficiency of schools' verification efforts and either the ISIR data elements schools referenced or the methods schools used to apply that information when selecting students for verification. We did find, however, that schools that used the greatest number of the five identified methods of applying this information in their verification criteria were the most efficient.

The full report is available on the Quality Assurance Web site at the following link:

http://ifap.ed.gov/qadocs/ToolsforSchools/0910QADataAnalysisReport.pdf

Contact Information

If you have questions about the report, contact David Rhodes at david.rhodes@ed.gov.