
Key Webinar Takeaways: Artificial Intelligence and Mental Health Monitoring – Legal Considerations

F3 Partner Gretchen Shipley spoke to a group of educators about how AI can affect mental health issues in schools. She explained that although AI tools can help prevent problems arising from students' mental health struggles, educators and administrators must ensure that the AI software meets certain standards. The fact that AI can benefit all students does not diminish a school's duties to its students, including its duty of care, duty to supervise, and duty to provide a safe environment—which includes providing safe online AI tools for students.

Because open-source AI shares information with all users, educators and administrators must be careful when entering personally identifiable information (PII) into such software. Parent permission may even be required before that information can be shared. To ensure that any AI software a district purchases—or uses at no cost—meets the district's responsibilities to its students, the district should have a process in place to review the terms and conditions of any AI software, including free, open tools. Review will often reside with the district's legal counsel, IT team, or administrative team. When creating a process, districts should consider asking the software company questions such as:

  • If it’s an AI tool, does it have threat detection capability?
  • Can the threat assessment differentiate or escalate different levels of threat?
  • Is there human oversight?
  • To whom are threats reported?

You can usually find answers to these questions in the software company's terms and conditions; if you can't find them there, ask your salesperson or the software company's client service team. Gretchen offered four tips for reviewing and negotiating software agreements:

  1. Remember that AI software is not necessarily tailored for education.
  2. AI software is not necessarily tailored for use by minors. 
  3. AI software is not necessarily tailored to state-specific requirements. 
  4. School districts should feel empowered to question and negotiate terms as needed. 

You are there to advocate for students and fulfill your duties to them. These tips will help you achieve that. 

Gretchen will join The School Superintendents Association's (AASA) Student & Child Privacy Center and the Public Interest Privacy Center (PIPC) in June for the third session of their three-part webinar series on AI.

Legal Considerations: AI in Special Education
June 18, 2024
10-10:45 a.m. PT
