Board Policy 2112
Artificial Intelligence Systems and Tools Use
The introduction of Artificial Intelligence (“AI”) offers unprecedented opportunities for enhancing teaching methods, expanding learning resources, and fostering innovative educational experiences. However, AI also presents unique risks, challenges, and responsibilities, particularly with respect to ethical use, data privacy and security, and the accuracy and integrity of academic work. This Policy serves to responsibly harness the potential of AI technologies while safeguarding the interests and well-being of our students, instructors, and professional staff. Through this Policy, Northeast Tech (NT) endeavors to (i) prepare our students, instructors, and professional staff for the future and (ii) equip them with the knowledge and skills to use these systems and tools wisely and ethically. NT will continue to support our instructors in incorporating AI into their teaching practices in ways that enrich the teaching and learning experience while upholding the district’s educational standards and values.
AI systems and tools must comply with data privacy and security laws and policies.
AI systems and tools will serve to enhance the district’s commitment to high-quality learning.
Safeguards are essential to the use of AI systems and tools to minimize bias, promote fairness, and preserve the rigor and integrity of learning.
The use of AI systems and tools by students, instructors, and professional staff must account for the context of teaching and learning and should be adopted, implemented, and utilized in ways that maximize equity of access, use, and benefit.
Evaluation of AI tools: Employees must evaluate the security of any AI tool before using it. This includes reviewing the tool’s security features, terms of service, and privacy policy. Employees must also check the reputation of the tool developer and any third-party services used by the tool.
Protection of confidential data: Employees must not upload or share any data that is confidential, proprietary, or protected by regulation without prior approval from the appropriate department. This includes data related to customers, employees, or partners.
Access control: Employees must not grant access to AI tools to anyone outside the district without prior approval from the appropriate department or supervisor and completion of any subsequent processes required to meet security compliance requirements. This includes sharing login credentials or other sensitive information with third parties.
Use of reputable AI tools: Employees should use only reputable AI tools and be cautious when using tools developed by individuals or companies without established reputations. Any AI tool used by employees must meet our security and data protection standards.
Compliance with security policies: Employees must apply the same security best practices we use for all district and student data. This includes using strong passwords, keeping software up-to-date, and following our data retention and disposal policies.
Data privacy: Employees must exercise discretion when sharing information publicly. Before uploading or sharing any data into AI tools, employees must first ask themselves, “Would I be comfortable sharing this information outside the district? Would we be okay with this information being leaked publicly?” Second, employees must follow the Protection of confidential data guideline above.
Student Use Guidelines:
Certain assignments may permit, encourage, or require the use of AI systems and tools. In each case, this will be clearly stated in the assignment or specified by the instructor. Use beyond the guidelines specified by the instructor or assignment should be understood as prohibited. It is each student’s responsibility to assess the validity and applicability of any AI output submitted with an assignment.
Students are allowed to use AI for explanations of concepts, exploration of new topics of interest, and seeking guidance on research directions. However, students should be mindful that AI is prone to “hallucinations” and may produce false or outdated information; it can generate erroneous, misleading, and/or biased output. Students must therefore always verify information provided by AI against reliable sources such as textbooks, scientific papers, and reputable educational websites. Students must verify that any AI response they intend to rely on or use is appropriate, accurate, not a violation of any other individual’s or entity’s property or privacy rights, and consistent with the district’s academic policies.
Students should not upload or input any personal, confidential, proprietary, or sensitive information into any AI tool. Examples include passwords and other personal information such as names, likenesses, Social Security numbers, and credit card or bank account numbers.
Offenses or violations of this Policy will be addressed by the instructor and professional staff. Procedures should be clearly established in the student discipline code or academic integrity policies.
Staff Use Guidelines:
Instructors and professional staff may consult AI for ideas, outlines and to enhance the educational experience, such as supplementing lesson plans, providing differentiated instruction, and aiding in curriculum development.
Instructors and professional staff must ensure that their use of any AI tool complies with applicable laws, such as those governing data and student privacy, and with district policies, including, without limitation, those regarding student information. Any tool may be used in compliance with these requirements provided that no protected information is entered into it.
Instructors and professional staff should not upload or input any confidential, proprietary, or sensitive information, including any such district or student information, into any AI tool. Examples include passwords and other credentials; personal information such as names, likenesses, Social Security numbers, and credit card or bank account numbers; personnel material; information from non-public district documents, including those identified as or understood to be confidential or sensitive based on their nature or context; and any other non-public district information that might be harmful to the district if disclosed.
Instructors and district/site professional staff should guide students in using AI.
Instructors and professional staff should carefully evaluate AI tools for educational purposes on a case-by-case basis, considering their appropriateness for each educational context, accuracy, reliability, and alignment with curriculum standards.
Instructors and professional staff must supervise student use of AI to ensure it is being used appropriately and constructively in the learning process.
Instructors who suspect plagiarism or a use of AI that violates district policy should first have a conversation with the student to ensure that the student understands expectations for acceptable use. Instructors should consult with administration to determine appropriate steps to investigate any possible violation of policy. AI-detection tools will not be relied upon in investigations of suspected policy violations involving student use of AI.
District Level Guidelines:
Approved tools and their uses should be determined by the appropriate school district personnel after consideration of security, privacy, data usage, and academic integrity and quality standards, regulations, and values.
Adopted by the Board on November 9, 2015