Imagine this: Tomorrow’s graduates will enter a workplace where AI tools are as common as email, used to diagnose patient symptoms, analyze market trends, optimize supply chains, and design new infrastructure. From healthcare to marketing to engineering, nearly every field is being transformed. Are our schools prepared for this new reality? And is there an effective way to assess that preparation?
In Gwinnett County Public Schools (GCPS), educators are determined to ensure that both answers are “yes.” Their mission is to make every student “AI ready”: prepared, no matter their career path, to use emerging technologies such as generative AI ethically and responsibly in school, life, and future work. To support this goal, GCPS led the development of both an AI readiness framework and a companion diagnostic assessment.
In 2019, GCPS worked with several partners to create an AI readiness framework focused on six core areas: data science, mathematical reasoning, creative problem solving, ethics, applied experiences, and programming. The framework was developed with input from district subject-matter experts (including computer science, mathematics, and science teachers) and external partners.
To make the framework useful and practical, the district partnered with the ISTE research team in 2025 to develop a diagnostic assessment tool that measures student AI readiness across selected skills outlined in the framework. Unlike summative assessments, diagnostic assessments measure students’ current knowledge and skills, help educators identify gaps and areas for growth, and show where students may need additional guidance, resources, or support from educators and school leaders to meet their learning outcomes.
A systematic approach to test design
Here’s how the district and research teams built the AI readiness diagnostic assessment:
Define objectives and practical constraints
The team first had to settle practical considerations: Who would take the test? How would it be delivered? What time constraints applied?
The AI readiness framework covers preK-12, but the team began by designing the diagnostic for high school students in grades 9-12. They knew the assessment would be digital (to maximize flexibility) and brief, taking just 10-15 minutes. These factors shaped the types of questions used: to support autoscoring, the team included multiple-choice and Likert-scale questions.
Create draft questions
First, the ISTE research team and GCPS partners worked together to identify the framework constructs they wanted to measure within each of the six core areas. This ensured consistent coverage across all areas.
Once the constructs were defined, the team worked with subject-matter experts (both district educators and external AI and education experts) to draft three to five items for each construct.
Review and revise
After drafting the items, the researchers reviewed them for consistency, confirming that each measured only one skill. Through this refinement process, they narrowed the set to two items for each of 26 constructs, creating two versions of the pilot assessment. The district then built the pilot assessment on its research platform, Qualtrics, to facilitate distribution.
Put the pilot to the test
Roughly 1,200 students at Seckinger High School participated in the pilot. They were divided alphabetically by last name into two groups, each of which responded to one of the two “parallel” item sets. The district confirmed that the two groups had similar demographics. Students completed the pilot during their homeroom period.
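The alphabetical split described above can be sketched in a few lines of Python. The roster names and the simple midpoint rule are illustrative assumptions, not the district’s actual procedure:

```python
# Minimal sketch (hypothetical data): assign students to one of two
# parallel pilot forms by alphabetical position of last name.
roster = ["Adams", "Baker", "Chen", "Diaz", "Evans", "Ferrara", "Garcia", "Huang"]

sorted_roster = sorted(roster)
midpoint = len(sorted_roster) // 2

form_a = sorted_roster[:midpoint]  # first half of the alphabet takes Form A
form_b = sorted_roster[midpoint:]  # second half takes Form B
```

In practice the district also verified that the resulting groups were demographically similar, a check a simple alphabetical split cannot guarantee on its own.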
Analyze the results
While expert input ensured strong construct validity, the team still needed to assess the reliability of both individual items and the overall test. The research team conducted a series of psychometric analyses, including test reliability, empirical item analysis, and item response analysis. These analyses helped identify which items worked well and which needed improvement or removal.
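The article doesn’t specify which statistics were computed, but two standard examples of such analyses are item difficulty (the share of students answering each item correctly) and Cronbach’s alpha (a common test-reliability coefficient). A minimal sketch on made-up 0/1 scored data:

```python
# Illustrative sketch (invented data, not the district's results):
# rows = students, columns = items, 1 = correct, 0 = incorrect.
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]
n_students = len(scores)
n_items = len(scores[0])

# Item difficulty: proportion of students answering each item correctly.
difficulty = [sum(row[j] for row in scores) / n_students for j in range(n_items)]

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Cronbach's alpha: internal-consistency reliability of the whole test.
item_vars = [variance([row[j] for row in scores]) for j in range(n_items)]
totals = [sum(row) for row in scores]
alpha = (n_items / (n_items - 1)) * (1 - sum(item_vars) / variance(totals))
```

Items with near-zero or near-one difficulty, or that drag alpha down, are typical candidates for revision or removal.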
Prior to analysis, the researchers cleaned the data, removing responses from students who completed the assessment implausibly quickly and eliminating suspicious response patterns, such as those from students who clearly did not read the items carefully.
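This kind of cleaning can be sketched as simple flagging rules. The field names, threshold, and records below are hypothetical, and a real study would justify its cutoffs:

```python
# Illustrative sketch (hypothetical data): drop responses that were
# completed too fast or that "straight-line" (same option for every item).
responses = [
    {"id": "s1", "seconds": 540, "answers": [2, 4, 1, 3, 2]},
    {"id": "s2", "seconds": 45,  "answers": [1, 3, 2, 4, 1]},  # too fast
    {"id": "s3", "seconds": 600, "answers": [3, 3, 3, 3, 3]},  # straight-lining
]

MIN_SECONDS = 120  # assumed minimum plausible completion time

def is_valid(resp):
    too_fast = resp["seconds"] < MIN_SECONDS
    straight_line = len(set(resp["answers"])) == 1
    return not (too_fast or straight_line)

clean = [r for r in responses if is_valid(r)]
```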
Where this work is headed
With the item and test analyses in hand, the research team and the district worked together to create a final version of the diagnostic assessment designed for high school students. They are currently exploring ways to adapt the tool to other grade levels and to incorporate more complex items, such as performance-based tasks that allow students to demonstrate their skills in real-world contexts.
Going forward, the district hopes that the results of this diagnostic will contribute to a more comprehensive picture of students’ AI readiness, alongside other data points such as teacher assessments, computer science coursework, and capstone projects. These combined insights will inform district-wide curriculum development and student support strategies.
Reflection
Diagnostic measurement of AI readiness provides districts with critical data for strategic planning and resource allocation, ensuring students are prepared for an AI-saturated world. The collaboration between district leaders and research teams demonstrates the importance of thoughtful design and rigorous assessment practices. GCPS and ISTE+ASCD hope this work serves as a model for other districts preparing students for a future shaped by generative AI.