Administrative Data vs. Survey Data: Strengths and Limits
Youth research draws from two main sources: administrative records collected by agencies and survey responses gathered from individuals. Each offers unique insight and has distinct blind spots.
Administrative Data
Administrative datasets are created as by-products of service delivery or legal processes: court filings, school attendance, or program enrollments. They are typically large, longitudinal, and official, but they are not designed for research.
- Strengths: High coverage, objective timestamps, and long time spans.
- Limits: Missing context (motivation, perception), variable definitions driven by policy, and inconsistent reporting across sites.
- Bias sources: Records reflect only those who touch formal systems; populations that never come into contact with those systems are invisible.
Survey Data
Surveys intentionally collect self-reported information about experience, attitudes, and conditions. They can capture perspectives that systems never record, but they are constrained by sampling design and recall bias.
- Strengths: Flexible content, subjective insight, population-level representation when sampled properly.
- Limits: Smaller samples, recall errors, nonresponse bias, and potential self-censorship on sensitive topics.
- Temporal gaps: Usually cross-sectional; may miss within-person change over time.
Integrating the Two
Linking administrative and survey data combines depth with breadth. Surveys fill in context, such as mental health, school climate, or family supports, while administrative data supply timing and verified outcomes. Integration, however, raises privacy and linkage challenges that must be documented.
- Align identifiers carefully and note linkage rates (see the linkage sketch after this list).
- Handle consent and confidentiality under clear governance.
- Document measurement overlap to prevent duplication of indicators.
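As one illustration, the sketch below links survey responses to administrative records on a shared identifier and reports the linkage rate. It assumes pandas, and the file and column names (admin_records.csv, survey_responses.csv, youth_id) are hypothetical placeholders rather than a prescribed workflow.

```python
import pandas as pd

# Hypothetical inputs: names are placeholders, not a required schema.
admin = pd.read_csv("admin_records.csv")      # e.g., court or school records
survey = pd.read_csv("survey_responses.csv")  # e.g., youth self-reports

# Keep every survey respondent; attach administrative records where the ID matches.
linked = survey.merge(admin, on="youth_id", how="left", indicator=True)

# Linkage rate: share of survey respondents found in the administrative data.
linkage_rate = (linked["_merge"] == "both").mean()
print(f"Linked {linkage_rate:.1%} of survey respondents to administrative records")

# Keep the unmatched cases in view; differential linkage is itself a bias source.
unmatched = linked[linked["_merge"] == "left_only"]
```

Reporting the linkage rate alongside any linked analysis makes clear how much of the survey sample the administrative outcomes actually describe.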
Comparability & Bias
When interpreting differences between survey and administrative results, consider:
- Administrative coverage bias (who is captured by systems).
- Survey response bias (who answers, and how honestly).
- Definition drift across instruments or agencies.
- Differences in recall periods or event definitions.
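The toy sketch below makes the denominator and recall issues concrete: the administrative rate is computed only over system-involved youth, while the survey rate is a weighted estimate over a sampled population with a 12-month recall window. The indicator, values, and weights are invented for illustration.

```python
import pandas as pd

# Toy data: the same indicator measured two ways (all values are invented).
admin = pd.DataFrame({
    "youth_id": [1, 2, 3, 4],
    "suspension_recorded": [1, 1, 0, 1],      # events logged by schools
})
survey = pd.DataFrame({
    "youth_id": [1, 2, 5, 6, 7],
    "suspension_reported": [1, 0, 0, 1, 0],   # self-report, 12-month recall
    "weight": [1.2, 0.8, 1.0, 1.1, 0.9],      # survey design weights
})

# Administrative estimate: share of system-involved youth with a recorded event.
admin_rate = admin["suspension_recorded"].mean()

# Survey estimate: weighted share of the sampled population reporting an event.
survey_rate = (survey["suspension_reported"] * survey["weight"]).sum() / survey["weight"].sum()

print(f"Administrative rate (system-involved youth only): {admin_rate:.2f}")
print(f"Survey rate (weighted, 12-month recall):          {survey_rate:.2f}")
# Any gap reflects coverage, recall, and definition differences
# as much as a true difference in the underlying behavior.
```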
Data & Methods
The research text notes that the best studies treat administrative and survey data as complementary lenses: one describing recorded events, the other describing lived experience. Transparency requires documenting both sources, linkage quality, and any weighting used to correct for differential coverage.
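A minimal post-stratification sketch of such weighting is shown below, assuming known population shares by stratum (a made-up urban/rural split) and a survey that under-covers one stratum. Real weighting schemes are usually more elaborate (raking, nonresponse adjustments), so treat this as an illustration of the idea only.

```python
import pandas as pd

# Known population shares by stratum, e.g., from census or administrative totals.
population_share = {"urban": 0.60, "rural": 0.40}

# Toy survey in which rural youth are under-covered.
survey = pd.DataFrame({
    "youth_id": range(1, 11),
    "stratum": ["urban"] * 8 + ["rural"] * 2,
    "outcome": [1, 0, 0, 1, 0, 1, 0, 0, 1, 1],
})

# Weight each respondent by population share / sample share for their stratum.
sample_share = survey["stratum"].value_counts(normalize=True)
survey["weight"] = survey["stratum"].map(lambda s: population_share[s] / sample_share[s])

unweighted = survey["outcome"].mean()
weighted = (survey["outcome"] * survey["weight"]).sum() / survey["weight"].sum()
print(f"Unweighted estimate: {unweighted:.2f}   Weighted estimate: {weighted:.2f}")
```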
Transparency note: Always identify whether figures come from administrative or survey data, and specify coverage, recall period, and any linkage or weighting used. Mixed sources need clear boundaries.
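One way to keep those boundaries explicit is to attach a small provenance record to every reported figure. The sketch below uses a Python dataclass with illustrative field names; it is not a required EDORA schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FigureSource:
    """Provenance for a single reported figure (illustrative fields only)."""
    indicator: str
    source_type: str               # "administrative" or "survey"
    coverage: str                  # who is captured, e.g., "system-involved youth"
    recall_period: Optional[str]   # e.g., "past 12 months"; None for admin records
    linkage_rate: Optional[float]  # share of records linked, if sources were merged
    weighting: Optional[str]       # e.g., "post-stratified by region and age"

example = FigureSource(
    indicator="school suspension rate",
    source_type="survey",
    coverage="statewide youth sample, ages 12-17",
    recall_period="past 12 months",
    linkage_rate=None,
    weighting="post-stratified by region and age",
)
```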