Because of the way GenAI uses data, it can store, process, and reveal personally identifiable information (PII). According to the 2023 report, Artificial Intelligence and the Future of Teaching and Learning, published by the U.S. Department of Education Office of Educational Technology, most AI models are not designed with educational use or student privacy in mind. Their products, or educators' use of them, may therefore put student data at risk, undermining an LEA's efforts to comply with student privacy mandates. Data privacy may also justifiably be a concern for parents and caretakers. Being transparent about data protection practices can go a long way toward building trust and credibility with the community.
An LEA's compliance with student privacy laws may be put at risk by using certain tools and applications in a school setting.
Introducing GenAI in the school environment may raise questions from parents and families regarding the protection of their children's data.
Privacy issues can arise when users input sensitive data into GenAI tools.
GenAI tools and platforms may be susceptible to security breaches, hacking attempts, or unauthorized access, any of which could compromise the confidentiality and integrity of student data.
In response to a frequently asked question, "How do we vet an AI-powered tool for privacy and ethics?", the Alliance, in collaboration with stakeholders and experts across the state and country, developed this Privacy Checklist. Districts are encouraged to send it to prospective vendors, as well as to vendors adding GenAI features to existing products.
This checklist is a general guide and not an exhaustive list. Final decisions rest with the school district, which is responsible for ensuring compliance with all applicable policies, regulations, and laws.
Children's Internet Protection Act (CIPA): Ensure that AI tools and platforms align with internet safety policies, web-filtering measures, monitoring requirements, and provisions established to protect students from accessing obscene or harmful content online.
Family Educational Rights and Privacy Act (FERPA): Safeguard student educational records to protect student privacy and confidentiality. Train teachers to securely manage student records, such as grades and attendance, and avoid the disclosure of personally identifiable information without proper consent.
Children's Online Privacy Protection Rule (COPPA): Only use tools that adhere to COPPA age and parental consent requirements, noting that some platforms' terms of service require users to be at least 13 years old or to have parental consent, while others may impose age restrictions on students under 18.
Protection of Pupil Rights Amendment (PPRA): Be mindful of AI use that requires students to provide information in any of the protected areas defined by PPRA.
Arizona Revised Statute 15-142: Ensure that the use of generative AI tools safeguards student directory information and school property data.
Arizona Revised Statute 15-117: Adhere to survey protocols, obtain parental consent, and maintain informed consent procedures when employing GenAI tools.
Arizona Revised Statute 15-1046: Implement robust student data privacy measures to protect sensitive information, respect privacy boundaries, and secure student data confidentiality.