The web-based survey instrument contains edit checks to detect major reporting errors. The system automatically generates percentages and totals for each collection component and compares current responses with data reported in the previous year. Because edit checks are conducted through the DCS, survey respondents are alerted immediately to errors detected by the system and prompted to correct them. If accurate data fail the edit checks, respondents either confirm the response or explain why the data fall outside the expected range. All edit checks must be resolved (confirmed or explained) before a survey component can be locked. When respondents cannot confirm or explain an edit failure, they contact the IPEDS Help Desk for an edit override. Many IPEDS survey component instruments also contain one or more context boxes that respondents can, at their discretion, use to explain any special circumstances that might not be evident in their reported data. In addition, IPEDS Help Desk staff manually review the data for additional errors and, when necessary, contact keyholders to verify the accuracy of the data.
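The general confirm-or-explain workflow can be illustrated with a minimal sketch. The field names, 20 percent tolerance, and data structures below are hypothetical illustrations, not the actual DCS implementation.

```python
# Minimal sketch of the confirm-or-explain workflow described above.
# Field names, the 20% tolerance, and the data structures are hypothetical.

def year_over_year_flags(current, prior, tolerance=0.20):
    """Flag fields whose value changed by more than `tolerance` vs. the prior year."""
    flags = []
    for field, value in current.items():
        prev = prior.get(field)
        if prev and abs(value - prev) / prev > tolerance:
            flags.append(field)
    return flags

def can_lock(flags, resolutions):
    """A survey can be locked only when every flagged field is confirmed or explained."""
    return all(resolutions.get(f) in ("confirmed", "explained") for f in flags)

current = {"total_enrollment": 5_200, "total_completions": 900}
prior = {"total_enrollment": 4_000, "total_completions": 880}
flags = year_over_year_flags(current, prior)               # ['total_enrollment']
print(can_lock(flags, {"total_enrollment": "explained"}))  # True -> may be locked
```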
The following are a few of the tailored edits for each IPEDS survey component.
Institutional Characteristics. IC edits examine the types of educational offerings (occupational, academic, continuing professional, avocational, adult basic, or secondary) and whether the institution qualifies as offering postsecondary programs and thus should be considered in scope for IPEDS.
Completions. The DCS preloads previously reported CIP codes in the current Completions component. IPEDS requires institutions to report Completions data using the most recent CIP taxonomy. The system checks the award levels reported for each CIP code against a predetermined list of valid award levels for each 6-digit CIP code developed by subject matter experts.
Edits also check the award levels against those indicated on the prior year’s Institutional Characteristics component and the prior year’s Completions component. For each award level, an edit compares the gender totals for each two-digit CIP with the corresponding totals from the prior year. Within each award level, an edit compares the number of awards for each race/ethnicity and gender combination with the corresponding value from the prior year. Finally, the total number of completers (students) earning an award is expected to be less than or equal to the total number of completions (awards) reported.
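Two of these Completions edits can be sketched as follows; the valid-level table and example figures are hypothetical, and the logic only approximates the checks described above.

```python
# Sketch of two Completions edits: award levels must appear on a predetermined
# list of valid levels for the CIP code, and completers may not exceed completions.
# The valid-level table and counts are hypothetical.

VALID_AWARD_LEVELS = {
    "51.3801": {"Associate's", "Bachelor's", "Master's"},
    "48.0508": {"Certificate", "Associate's"},
}

def invalid_award_levels(cip_code, reported_levels):
    allowed = VALID_AWARD_LEVELS.get(cip_code, set())
    return [lvl for lvl in reported_levels if lvl not in allowed]

def completers_consistent(total_completers, total_completions):
    # Students (completers) can earn more than one award, so completers <= completions.
    return total_completers <= total_completions

print(invalid_award_levels("48.0508", ["Doctoral"]))  # ['Doctoral'] -> fails edit
print(completers_consistent(850, 900))                # True -> passes
```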
12-month Enrollment. The E12 survey component also has several automated edit checks. The edits compare student counts, by level, with prior year counts to ensure consistency. They also check instructional activity hours to ensure that hours are reported if the institution reported students at the same level. Total instructional activity is also compared with the unduplicated headcount, for each student level, to ensure that the reported activity is appropriate for the number of students reported. That is, the clock and credit hours reported are expected to fall within a specific range defined by the institution’s calendar system and unduplicated headcount enrollment. Keyholders must explain any discrepancies or data reported outside the expected ranges.
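The range check on instructional activity can be illustrated with a short sketch; the per-student bounds shown are hypothetical stand-ins for the ranges defined by calendar system.

```python
# Sketch of the 12-month Enrollment activity edit: reported credit hours should
# fall within a range implied by the unduplicated headcount and calendar system.
# The per-student bounds are hypothetical, not the DCS's actual ranges.

EXPECTED_HOURS_PER_STUDENT = {
    "semester": (3, 45),   # hypothetical min/max credit hours per student
    "quarter":  (5, 70),
}

def activity_in_range(calendar_system, credit_hours, unduplicated_headcount):
    low, high = EXPECTED_HOURS_PER_STUDENT[calendar_system]
    return low * unduplicated_headcount <= credit_hours <= high * unduplicated_headcount

# 4,000 unduplicated students reporting 96,000 credit hours on a semester calendar
print(activity_in_range("semester", 96_000, 4_000))  # True -> passes
```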
Cost I. Edits compare application fees, tuition and fees, and food and housing charges with the prior year’s data for consistency. The system flags large changes for follow-up.
Admissions. Edit checks for the ADM component of the survey are performed to ensure that there is a response to each item on the Admissions Consideration page and that “Required” is selected for at least one of the considerations. On the Applicants/Admissions/Enrollment page, edit checks are performed to ensure that the total for each field is greater than zero and also greater than or equal to the sum of the values separately reported by gender. The total number of admissions is expected to be less than a percentage of the number of applicants; the percentage used in this edit varies by institutional sector. In addition, the number of admissions is required to be greater than or equal to the total number of students who enrolled. On the Test Score page (which is applicable only when SAT or ACT scores are required for admission), the edit checks ensure that the total number of test scores (both SAT and ACT scores) submitted by enrolled students is greater than or equal to the total number of enrolled students. In addition, the edit checks ensure that data are entered for each of the fields on the page. Edit checks ensure that test scores are within the range of valid scores for each test and test component. Additionally, if 25th percentile scores are reported, a 75th percentile score is required to be reported for that test score component, and vice versa. Edit checks also ensure that the reported 75th percentile scores are greater than the corresponding 25th percentile scores.
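Several of these Admissions rules are sketched below; the sector-specific admit-rate ceilings and field layout are hypothetical illustrations, not the published edit specifications.

```python
# Sketch of several Admissions edits. The sector ceilings and field names are
# hypothetical; the inequalities mirror the rules described above.

ADMIT_RATE_CEILING = {"public 4-year": 0.95, "private nonprofit 4-year": 0.98}

def check_admissions(sector, applicants, admitted, enrolled, admitted_by_gender):
    errors = []
    if admitted < sum(admitted_by_gender.values()):
        errors.append("total admitted is less than the sum of gender counts")
    if admitted > ADMIT_RATE_CEILING[sector] * applicants:
        errors.append("admitted exceeds expected share of applicants")
    if admitted < enrolled:
        errors.append("admitted is less than enrolled")
    return errors

def check_percentiles(p25, p75):
    # Both percentiles must be reported together, and the 75th must exceed the 25th.
    if (p25 is None) != (p75 is None):
        return "only one percentile reported"
    if p25 is not None and p75 <= p25:
        return "75th percentile not greater than 25th"
    return None

print(check_admissions("public 4-year", 10_000, 6_500, 1_800,
                       {"men": 3_000, "women": 3_500}))   # [] -> passes
print(check_percentiles(560, 530))                        # flags the ordering error
```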
Graduation Rates. The GR component uses preloaded data from the Fall Enrollment component from the applicable year for the initial cohort of full-time, first-time degree/certificate-seeking students to ensure consistent reporting. Revisions to the initial cohort are permitted if better data have become available. To ensure that the sum of individual cells does not exceed the revised cohort for any race/ethnicity or gender classification, the system sums the individual cells and compares the result to the appropriate revised cohort values. The edits require institutions reporting very high or very low numbers of completers (as a percentage of the total cohort) to explain this anomaly. Finally, if any cohort members (i.e., the bachelor’s or equivalent degree-seeking cohort or the other-than-bachelor’s or equivalent degree-seeking cohort) are reported for either section of the Graduation Rates component, then data are expected to be reported in each applicable cohort section.
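Two of the GR checks can be sketched briefly; the 5 percent and 95 percent thresholds are hypothetical placeholders for whatever bounds the edits actually apply.

```python
# Sketch of two Graduation Rates edits: cell sums may not exceed the revised
# cohort, and very high or very low completion percentages require explanation.
# The 5%/95% thresholds are hypothetical.

def cells_within_cohort(cells, revised_cohort):
    """cells: counts by race/ethnicity and gender for one cohort section."""
    return sum(cells.values()) <= revised_cohort

def completion_rate_anomaly(completers, revised_cohort, low=0.05, high=0.95):
    rate = completers / revised_cohort
    return rate < low or rate > high   # True -> keyholder must explain

cells = {("White", "Men"): 120, ("White", "Women"): 150, ("Hispanic", "Men"): 60}
print(cells_within_cohort(cells, 400))   # True -> passes
print(completion_rate_anomaly(15, 400))  # True (3.75%) -> explanation required
```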
200 Percent Graduation Rates. The DCS preloads data into the GR200 component on the cohort of full-time, first-time degree/certificate-seeking students; exclusions from the cohort; and completers within 150 percent of normal program completion time from the GR component covering the appropriate cohort year. Edit checks compare the sums of individual cells with the revised cohort. Additionally, the edit rules require institutions that report very high or very low numbers of completers within 151 to 200 percent of normal program completion time, or report high numbers of additional cohort exclusions (as a percentage of the cohort), to explain the anomaly and make necessary corrections.
Outcome Measures. The OM cohorts of full-time, first-time and part-time, first-time degree/certificate-seeking students are required to be greater than or equal to the corresponding cohort(s) reported in the EF component for the appropriate cohort year in order to ensure consistent reporting. To ensure that the sum of individual cells does not exceed the revised cohort for any group, the DCS sums the individual cells and compares the result to the appropriate revised cohort value. Additionally, cross-component comparisons with the appropriate GR, GR200, and SFA components are conducted to ensure consistency between OM reported data and prior reported data on full-time, first-time students from the applicable cohort year.
Student Financial Aid. The number of full-time, first-time students in the SFA component must be less than or equal to the total number of undergraduate students enrolled. The number of full-time, first-time students who received any financial aid during the full academic year has to be less than or equal to the number of full-time, first-time undergraduate students, and the total aid received by the full-time, first-time students has to be less than the total aid received by all undergraduates. For public institutions, the sum of in-district, in-state, and out-of-state full-time, first-time undergraduate students cannot exceed the number of full-time, first-time undergraduate students as reported in Part B. The number of full-time, first-time undergraduate students receiving federal grants cannot exceed the number of full-time, first-time undergraduate students who received any financial aid during the full academic year. The same criteria apply to state/local grants, institutional grants, and loans to students.
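The SFA consistency chain is sketched below; the field names and example figures are hypothetical, while the inequalities follow the rules just described.

```python
# Sketch of the Student Financial Aid consistency chain. Field names and
# values are hypothetical; the inequalities mirror the rules described above.

def sfa_errors(d):
    errors = []
    if d["ftft_students"] > d["total_undergrads"]:
        errors.append("full-time, first-time exceeds undergraduate total")
    if d["ftft_any_aid"] > d["ftft_students"]:
        errors.append("aided full-time, first-time exceeds cohort")
    if d["ftft_total_aid"] >= d["all_undergrad_total_aid"]:
        errors.append("cohort aid not less than all-undergraduate aid")
    for aid_type in ("federal", "state_local", "institutional", "loans"):
        if d[f"ftft_{aid_type}"] > d["ftft_any_aid"]:
            errors.append(f"{aid_type} recipients exceed students receiving any aid")
    return errors

data = {"ftft_students": 1_200, "total_undergrads": 9_000,
        "ftft_any_aid": 1_000, "ftft_total_aid": 9.5e6,
        "all_undergrad_total_aid": 4.2e7,
        "ftft_federal": 700, "ftft_state_local": 400,
        "ftft_institutional": 900, "ftft_loans": 650}
print(sfa_errors(data))   # [] -> passes
```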
Cost II. The number of aided students is compared to the corresponding type of students in the Student Financial Aid component. The average amount of aid awarded to full-time, first-time grant-aided students is compared with the average amount of aid from the previous year, and the keyholder has to justify large discrepancies in the edit explanations. The average grant aid awarded to Title IV students in each income category is compared with that for the next lower income category, and the keyholder has to justify (via edit explanations) instances where higher average aid is awarded to students with higher incomes.
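The income-category comparison amounts to a monotonicity check, sketched below. The income bands approximate the IPEDS categories, and the aid amounts are hypothetical.

```python
# Sketch of the income-category edit in Cost II: average grant aid is expected
# to decline (or stay level) as income rises. Bands approximate the IPEDS
# categories; the aid amounts are hypothetical.

INCOME_BANDS = ["$0-30,000", "$30,001-48,000", "$48,001-75,000",
                "$75,001-110,000", "Over $110,000"]

def income_order_anomalies(avg_aid_by_band):
    """Return band pairs where the higher-income band averages more aid than the band below it."""
    anomalies = []
    for lower, higher in zip(INCOME_BANDS, INCOME_BANDS[1:]):
        if avg_aid_by_band[higher] > avg_aid_by_band[lower]:
            anomalies.append((lower, higher))
    return anomalies   # each pair requires an edit explanation

aid = {"$0-30,000": 14_500, "$30,001-48,000": 12_800, "$48,001-75,000": 9_900,
       "$75,001-110,000": 11_200, "Over $110,000": 6_300}
print(income_order_anomalies(aid))   # [('$48,001-75,000', '$75,001-110,000')]
```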
Academic Libraries. Edit checks in the AL component ensure that a value is entered for all fields in Section I: Library Collections/Circulation, Interlibrary Loan Services, and Library Staff. In Section II, Expenses, edit checks ensure that a value is entered for all applicable fields. If the institution indicates that fringe benefits are paid out of the library budget, a value greater than zero is required to be entered for Total Fringe Benefits. In addition, if the institution indicates that fringe benefits are not paid out of the library’s budget, a value of zero is required to be entered for Total Fringe Benefits.
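The conditional fringe-benefit rule reduces to a simple two-way check, sketched here with a hypothetical boolean flag and field name standing in for the DCS fields.

```python
# Sketch of the conditional fringe-benefit edit in Academic Libraries.
# The flag and field name are hypothetical stand-ins for the DCS fields.

def fringe_benefit_error(paid_from_library_budget, total_fringe_benefits):
    if paid_from_library_budget and total_fringe_benefits <= 0:
        return "fringe benefits paid from library budget but reported as zero"
    if not paid_from_library_budget and total_fringe_benefits != 0:
        return "fringe benefits not paid from library budget but a nonzero value reported"
    return None

print(fringe_benefit_error(True, 0))    # flags the missing value
print(fringe_benefit_error(False, 0))   # None -> passes
```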
Fall Enrollment. The EF component has several automated edit checks designed to ensure internal consistency. Among them, the number of full-time, first-time degree/certificate-seeking undergraduate students has to be less than or equal to the total number of students. The checks compare student counts, by level, with activity hours reported in E12 to ensure that the numbers of undergraduate and graduate students are consistent with previously reported data. When Part B is required (or data are provided voluntarily during optional years), the total number of students in Part B must equal the number reported in Part A. When Part C is required (or data are provided voluntarily during optional years), the total number of first-time degree/certificate-seeking students in Part C must equal the total number of first-time degree/certificate-seeking students in Part A. If the DCS detects discrepancies in the numbers reported in Parts B and C compared with Part A, it generates balance amounts and enters data into “unknown” fields. For all sections, where large discrepancies exist between current year responses and data from previous years, the keyholder must justify the discrepancy via edit explanations.
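The balancing step can be illustrated with a short sketch; the age categories and field names are hypothetical examples of a Part A total being reconciled with a detail part.

```python
# Sketch of the Part A / detail-part balancing edit in Fall Enrollment: when the
# parts disagree, the difference is surfaced as a balance amount that the DCS
# places in an "unknown" field. Categories and field names are hypothetical.

def balance_part(part_a_total, part_detail):
    """Return the amount needed to reconcile a detail part with the Part A total."""
    reported = sum(part_detail.values())
    return part_a_total - reported   # nonzero -> entered in the 'unknown' category

part_a_total = 5_000
part_b = {"18 and under": 600, "19-24": 3_100, "25 and over": 1_200}
unknown = balance_part(part_a_total, part_b)
print(unknown)   # 100 -> entered as unknown and flagged for keyholder review
```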
Finance. If the DCS detects large changes in the reported Finance data when comparing current year data with the previous year’s data, the keyholder explains the reasons for the differences. In the version of the Finance component for private nonprofit institutions, total net assets have to equal total unrestricted net assets plus total restricted net assets. Total net assets also have to equal total assets minus total liabilities. For all versions of the Finance component, the DCS generates selected fields using predetermined formulas, such as other sources of revenue and other expenses. Institutions are instructed to review the generated totals and resolve any data entry errors.
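The two net-asset identities for private nonprofit institutions can be sketched directly; the variable names and dollar amounts are hypothetical.

```python
# Sketch of the private nonprofit Finance identities: total net assets must
# balance against the restricted/unrestricted split and against assets minus
# liabilities. Variable names and amounts are hypothetical.

def finance_errors(total_assets, total_liabilities,
                   unrestricted_net_assets, restricted_net_assets,
                   total_net_assets):
    errors = []
    if total_net_assets != unrestricted_net_assets + restricted_net_assets:
        errors.append("net assets do not equal unrestricted plus restricted")
    if total_net_assets != total_assets - total_liabilities:
        errors.append("net assets do not equal assets minus liabilities")
    return errors

print(finance_errors(850_000_000, 320_000_000,
                     410_000_000, 120_000_000, 530_000_000))   # [] -> passes
```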
Human Resources. The HR component has edit checks that compare the current year data for the full-time and part-time staff sections with the previous year’s data. If the edit fails, the keyholder has to explain any large discrepancies. Within Part G, the sum of the full-time instructional staff reported across the contract lengths has to be less than or equal to the corresponding total number of full-time instructional staff reported in Part A for each of the academic ranks, by gender. For each academic rank, average annual salaries are calculated, and the system performs checks to detect unusually high or unusually low averages. Total part-time staff data reported in Part D are checked for consistency with the total part-time staff data reported in Part E, by occupational category.
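Two of the HR checks are sketched below; the contract-length categories are illustrative, and the salary bounds are hypothetical placeholders for the screening values the edits actually use.

```python
# Sketch of two Human Resources edits: contract-length counts in Part G may not
# exceed the Part A total for the same rank and gender, and computed average
# salaries are screened against hypothetical bounds.

def part_g_error(part_g_by_contract_length, part_a_total):
    return sum(part_g_by_contract_length.values()) > part_a_total  # True -> fails

def salary_outlier(total_salary_outlays, staff_count, low=30_000, high=250_000):
    average = total_salary_outlays / staff_count
    return not (low <= average <= high)   # True -> unusually high or low average

# Full-time male professors: counts by contract length vs. the Part A total
print(part_g_error({"9-month": 40, "11/12-month": 12}, 55))   # False -> passes
print(salary_outlier(4_800_000, 52))                          # False (about $92,300)
```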