Regulations Registry

Browse and explore ethical frameworks and compliance standards for EdTech

93 Total Regulations · 6 Jurisdictions · 9 Categories

General Data Protection Regulation

GDPR
Data Privacy

EU regulation on data protection and privacy

European Union, Global
Version: 2016/679
All users in EU
View Source

California Consumer Privacy Act

CCPA
Data Privacy

California law enhancing privacy rights and consumer protection

United States
Version: AB-375
California residents
View Source

Children's Online Privacy Protection Act

COPPA
Child Protection

US federal law protecting online privacy of children under 13

United States, Global
Version: 15 U.S.C. 6501–6506
Children (under 13)
View Source

Health Insurance Portability and Accountability Act

HIPAA
Healthcare

US law protecting sensitive patient health information

United States
Version: Pub.L. 104–191
Healthcare Patients
View Source

Web Content Accessibility Guidelines

WCAG
Accessibility

International standard for web accessibility

Global
Version: 2.1
Elderly Users (65+), All users
View Source

Americans with Disabilities Act

ADA
Accessibility

US civil rights law prohibiting discrimination based on disability

United States
Version: 42 U.S.C. § 12101
All users
View Source

Family Educational Rights and Privacy Act

FERPA
Education

US federal law protecting student education records

United States
Version: 20 U.S.C. § 1232g
Students, Minors (13-17)
View Source

Payment Card Industry Data Security Standard

PCI DSS
Financial

Security standard for organizations that store, process, or transmit payment card data

Global
Version: 4.0
Financial Services Customers
View Source

Personal Information Protection and Electronic Documents Act

PIPEDA
Data Privacy

Canadian law governing how private sector organizations collect, use and disclose personal information

Canada
Version: S.C. 2000, c. 5
All users
View Source

Family Educational Rights and Privacy Act

FERPA
Data Privacy

A federal law that protects the privacy of student education records at schools receiving U.S. Department of Education funds. FERPA gives parents (and eligible students at 18+) rights to access and request correction of education records, and generally requires schools to obtain written parent consent before sharing a student’s personally identifiable information in those records (with limited exceptions). Schools that fail to comply may risk loss of federal funding.

United States
Version: 1974
Students
View Source

Protection of Pupil Rights Amendment

PPRA
Education

A federal law that governs student surveys and certain data collection in schools. PPRA applies to any school district receiving U.S. Department of Education funds and requires parental notice and consent or opt-out for students to participate in surveys, analyses, or evaluations that ask about sensitive information in eight protected areas (e.g. political beliefs, mental health, sexual behavior/attitudes, illegal behavior, religious practices). It also gives parents the right to inspect instructional materials and opt students out of activities like marketing surveys or certain physical exams.

United States
Version: 1978
Children (under 13), Students
View Source

Children’s Online Privacy Protection Act

COPPA
Data Privacy

COPPA is a federal law (enforced by the FTC) that protects the privacy of children under 13 online. It imposes requirements on operators of websites, apps, or online services directed to children under 13, or those with actual knowledge of collecting personal data from children under 13. Such services must post a clear privacy policy, obtain verifiable parental consent before collecting personal information from children, limit data use, and allow parents to review/delete kids’ data. EdTech companies targeting K-12 students often fall under COPPA and may allow schools to provide consent in place of parents for educational use only.

United States
Version: 1998
Children (under 13), Students
View Source
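
Where an implementation needs to embody the consent rules summarized above, a minimal gate might look like the following sketch. All names and type shapes here (`CollectionRequest`, `canCollectPersonalInfo`, the consent labels) are illustrative assumptions, not statutory terms or a real API.

```typescript
// Hypothetical COPPA consent gate: names and shapes are illustrative only.
type ConsentSource = "verifiable-parental" | "school-authorized" | "none";

interface CollectionRequest {
  userAge: number;        // age asserted at signup
  consent: ConsentSource; // consent on file for this user
  purpose: "educational" | "commercial";
}

/**
 * Returns true only if personal information may be collected under the
 * COPPA rules summarized above: under-13 users need verifiable parental
 * consent, or school-provided consent limited to educational use.
 */
function canCollectPersonalInfo(req: CollectionRequest): boolean {
  if (req.userAge >= 13) return true; // COPPA's collection rules target under-13
  if (req.consent === "verifiable-parental") return true;
  // Schools may consent in place of parents for educational use only.
  return req.consent === "school-authorized" && req.purpose === "educational";
}

// Example: a school-authorized account may not be used for commercial purposes.
console.log(canCollectPersonalInfo({ userAge: 10, consent: "school-authorized", purpose: "commercial" })); // false
```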

Children’s Internet Protection Act

CIPA
Child Protection

Enacted in 2000, CIPA ties federal E-Rate funding to internet safety in schools and libraries. It requires K–12 schools and libraries that receive discounted Internet service (E-Rate) to implement an Internet safety policy with “technology protection measures” (filters) to block access to obscene or harmful content for minors. Schools must also monitor minors’ online activities and educate students about appropriate online behavior and cyberbullying. The policy must address topics like preventing access to inappropriate material, ensuring students’ online safety in email/chat, preventing hacking, and protecting minors’ personal information. Adult users can ask to disable filters for bona fide research or lawful purposes.

United States
Version: 2000
Children (under 13), Minors (13-17), +1 more
View Source
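
The filtering policy described above can be pictured as a small decision function. This is a hedged sketch: the category labels and the adult-override flag are assumptions standing in for whatever taxonomy a district's filter vendor actually uses.

```typescript
// Illustrative CIPA-style filtering decision; categories and names are assumptions.
type ContentRating = "obscene" | "child-sexual-abuse-material" | "harmful-to-minors" | "general";

interface FilterContext {
  userIsMinor: boolean;
  adultOverrideApproved: boolean; // staff-approved disable for bona fide research
}

/** Decide whether the school filter should block a page, per the policy above. */
function shouldBlock(rating: ContentRating, ctx: FilterContext): boolean {
  // Obscene content and child sexual abuse material are blocked for everyone.
  if (rating === "obscene" || rating === "child-sexual-abuse-material") return true;
  // "Harmful to minors" content is blocked for minors; adults may request a disable.
  if (rating === "harmful-to-minors") {
    return ctx.userIsMinor || !ctx.adultOverrideApproved;
  }
  return false;
}
```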

Individuals with Disabilities Education Act

IDEA
Accessibility

IDEA is the main federal law ensuring students with disabilities receive a “Free Appropriate Public Education.” It requires schools to develop an Individualized Education Program (IEP) for each eligible student and to provide the special education services and supports needed. Since the 1990s, IDEA has explicitly included assistive technology (AT) devices and services as part of the special education services schools must provide if a child needs them for a meaningful education. IEP teams are required to consider the need for assistive technology when planning a student’s program. This means EdTech tools must be made available in accessible formats or with accommodations if they are necessary for a student’s learning.

United States
Version: 1975
Children (under 13), Students, +1 more
View Source

DOE Guidance: Student Privacy and Online Educational Services

DGS
Data Privacy

(Official guidance from the U.S. Department of Education) In Feb 2014, the Privacy Technical Assistance Center (PTAC) issued “Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices.” This non-binding guidance helps schools and EdTech providers understand how FERPA and other privacy laws apply to educational apps, cloud services, and other third-party online tools used in classrooms. It emphasizes best practices like avoiding excessive data collection, reviewing vendors’ terms of service for compliance, using contracts to clarify that vendors are “school officials” under FERPA, and ensuring proper security safeguards. It does not cover students’ personal use of social media or general websites (only use as part of school activities).

United States
Version: 2014
Students
View Source

DOE Guidance: Artificial Intelligence in Education

DGA
Data Privacy

(Official guidance from the U.S. Department of Education) In July 2025, the Department of Education issued a Dear Colleague Letter on the use of Artificial Intelligence (AI) in schools, alongside a proposed priority for federal grants. This guidance recognizes AI’s potential in personalized learning and administrative efficiency while emphasizing “responsible use” principles. It advises that any AI tools used with students should align with existing laws (like privacy and civil rights regulations), ensure transparency, protect student privacy, and involve parents and teachers in decision-making. The Department affirms that spending federal education funds on AI is allowable if it meets applicable requirements and ethical considerations. (This guidance is not a law but sets expectations and may influence future regulations or grant criteria.)

United States
Version: 2025
Students
View Source

Section 504 (Rehabilitation Act of 1973) & Title II (Americans with Disabilities Act of 1990)

504/ADA
Accessibility

These civil rights laws require equal access for individuals with disabilities. In the education context, any school receiving federal funds (virtually all public K-12 and colleges) under Section 504, and all public educational institutions under ADA Title II, must ensure that students with disabilities can access the same programs and content as others. This means school-provided technology and digital content (including EdTech software, websites, videos, etc.) must be accessible – or accommodations must be provided – so that students with disabilities are not excluded or at a disadvantage. For example, websites should support screen readers for blind students, videos need captions for deaf students, and interactive content should be usable via assistive devices. The U.S. Department of Education and Department of Justice have made it clear that emerging educational technologies must comply with these accessibility obligations (no new legal standard, but enforcement of existing 504/ADA requirements).

United States
Version: 1973 & 1990
Students, Users with Disabilities
View Source
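
The concrete examples above (alt text, captions, assistive-device support) lend themselves to automated spot-checks. The sketch below assumes a simplified media model; a real 504/ADA review covers far more (keyboard navigation, contrast, focus order, and so on).

```typescript
// A small, assumed-shape accessibility spot-check along the lines described
// above (alt text, captions); real 504/ADA reviews are much broader.
interface MediaElement {
  kind: "image" | "video";
  hasAltText?: boolean;  // images: alt text for screen readers
  hasCaptions?: boolean; // videos: captions for deaf or hard-of-hearing students
}

/** Return human-readable findings for media that would exclude students with disabilities. */
function auditMedia(elements: MediaElement[]): string[] {
  const findings: string[] = [];
  elements.forEach((el, i) => {
    if (el.kind === "image" && !el.hasAltText) findings.push(`element ${i}: image missing alt text`);
    if (el.kind === "video" && !el.hasCaptions) findings.push(`element ${i}: video missing captions`);
  });
  return findings;
}

console.log(auditMedia([{ kind: "image" }, { kind: "video", hasCaptions: true }]));
// ["element 0: image missing alt text"]
```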

Federal Trade Commission Act – Section 5

FTC
Data Privacy

Section 5 of the FTC Act broadly prohibits “unfair or deceptive acts or practices” in commerce. This general law applies to EdTech companies and education service providers (especially for-profit vendors). In practice, the FTC can take action against an EdTech provider that, for example, misleads consumers (schools, parents, or students) about its data practices or security measures, or fails to reasonably safeguard personal data resulting in harm. Even if no specific student-privacy law applies, an EdTech company’s privacy policy and representations must be truthful. Likewise, inadequate data protection that leads to a breach of student data could be deemed an “unfair” practice. Section 5 essentially backs up other laws by ensuring EdTech businesses adhere to their promises and don’t engage in harmful data practices.

United States
Version: 1914
Children (under 13), Students
View Source

Children and Teens’ Online Privacy Protection Act – “COPPA 2.0”

COPPA 2.0
Data Privacy

A bipartisan bill to update COPPA for the modern era (led by Senators Markey and Cassidy). COPPA 2.0 would extend online privacy protections to teens ages 13–16 (not just under 13) and strengthen requirements for all minors. Key provisions include banning targeted advertising to children and teens, introducing an “eraser button” so young users/parents can delete personal data, requiring data minimization (companies can collect only what’s reasonably necessary), and closing loopholes in COPPA’s coverage (e.g. holding platforms accountable even when they claim they lack knowledge of kids on their service). It would also raise the protected age to 16, meaning sites would need consent before collecting personal data from any user under 17.

United States
Version: Proposed 2021/2023
Children (under 13), Minors (13-17)
View Source

Kids Online Safety Act

KOSA
Data Privacy

Known as KOSA, this proposed law aims to impose a duty of care on online platforms to protect minors from harmful content and experiences online. It would apply broadly to “covered platforms” likely to be used by minors (social media, online games, etc.), requiring them to act to prevent and mitigate risks like sexual exploitation, self-harm content, bullying, or other harms to minors. Platforms would have to provide minors with privacy-protective settings by default and give parents new controls to supervise their child’s online experience. KOSA also mandates greater transparency – platforms would need to disclose if and how they use algorithms or targeted ads on minors, prohibit advertising certain adult products to minors, and submit annual reports on foreseeable risks.

United States
Version: Introduced 2022
Children (under 13), Minors (13-17)
View Source

Protecting Student Privacy Act (amending FERPA)

PSP
Data Privacy

Originally introduced by Senators Markey and Hatch, this proposal would modernize FERPA to better regulate education technology vendors and strengthen student data privacy. The bill would require K-12 school service providers to implement data security safeguards for any personal student information they hold, and prohibit companies from using student data for targeted advertising or marketing to students. It would also give parents the right to access and correct information about their children held by private companies, mandate transparency by requiring schools or vendors to list all outside parties with whom student data is shared, and include data-minimization and deletion requirements (companies must delete students’ personal info when it’s no longer needed for the purpose for which it was collected).

United States
Version: Proposed 2014
Children (under 13), Students
View Source

New York Education Law § 2-d – Student Data Privacy

NYE
Data Privacy

This law prohibits the unauthorized release of personally identifiable information (PII) from student records and imposes strict privacy and security obligations on both educational agencies and their third-party contractors (including EdTech vendors). Under Ed Law 2-d, school districts must ensure that any contract with an EdTech provider includes a data privacy agreement and the Parents’ Bill of Rights for Data Privacy and Security (see below). Vendors are barred from selling student data or using it for commercial purposes, and must implement robust data security measures (aligned with standards like NIST) to protect student PII. The law also mandates prompt breach notification procedures and annual privacy training for school staff.

United States
Version: 2014 (regulations effective 2020)
Students
View Source

8 NYCRR Part 121 – NYSED Data Privacy Regulations

Part 121
Data Privacy

Part 121 of the Commissioner’s Regulations implements Education Law § 2-d by detailing specific requirements for schools and EdTech contractors. It requires each educational agency to appoint a Data Protection Officer and adopt a Data Security and Privacy Policy. It also spells out the contract requirements for third-party vendors (such as defining data security safeguards, breach obligations, data retention limits, etc.). Part 121 formally incorporates the Parents’ Bill of Rights and mandates that contractors adhere to supplemental information disclosures (listing what data they collect, for what purpose, how it’s protected and destroyed) on the school’s website. In essence, Part 121 strengthens data privacy and security in NY schools by holding EdTech providers to clear standards.

United States
Version: 2020
Students
View Source

NYSED Parents’ Bill of Rights for Data Privacy and Security

NPB
Data Privacy

The Parents’ Bill of Rights is a document that each New York educational agency must publish and include in contracts with third-party vendors. It informs parents and students of their rights and the vendor’s obligations regarding student data. Key points include: no selling of student data or using it for marketing; the right of parents to inspect and review their child’s educational records; requirements to safeguard data with encryption, firewalls, and other best practices; a duty to notify affected parties of data breaches within 60 days; and transparency about what student data is collected by the State. Schools must also list all their EdTech contractors and the data elements shared, along with retention and deletion policies.

United States
Version: 2014
Children (under 13), Students
View Source

NYSED Model Data Privacy Agreement

NMD
Data Privacy

To assist with Ed Law 2-d compliance, NYSED developed a Model Data Privacy Agreement (DPA) for educational agencies to use in contracts with EdTech providers. This template addendum translates legal requirements into contract language. It covers vendor obligations such as: adhering to the Parents’ Bill of Rights; maintaining confidentiality and using student data only for the agreed educational purposes; implementing administrative, technical, and physical security measures; data breach notification duties; and ensuring subcontractors also abide by these terms. While use of the model DPA is not mandatory, most New York school districts use it or a similar standardized agreement when procuring EdTech services to ensure the vendor’s practices meet state standards.

United States
Version: 2020
Students
View Source

Biometric Identifying Technology Ban in Schools

BIT
Data Privacy

New York put a moratorium on the purchase or use of biometric identifying technology in all K–12 schools (public and private) until a study and guidelines could be issued. This was prompted by concerns over facial recognition and other biometric surveillance in schools. In August 2023, the NYS Office of Information Technology Services released a comprehensive report on biometrics in schools, and in September 2023 the Education Commissioner issued an order under this law. Result: facial recognition technology is banned outright in New York schools. Other types of biometric tech (e.g. fingerprint ID for cafeteria entry, etc.) are allowed only if a local school board carefully considers privacy, civil rights, effectiveness, and seeks parental input. The ban on facial recognition is now effectively permanent unless future action is taken. This law and policy directly affect EdTech firms offering biometric security or student tracking tools.

United States
Version: 2021
Students
View Source

“Phone-Free Schools” Law

PFS
Education

This new statewide policy (an amendment to Education Law) requires “bell-to-bell” restrictions on student use of personal internet-enabled devices on school grounds. During the entire school day (from first to last bell), students in K–12 may not use smartphones, tablets, smartwatches, or any personal device capable of internet access, unless specifically authorized by school officials for an educational or other exempt purpose. Schools must develop local policies and secure storage plans for student devices (e.g. requiring devices to be kept in lockers or Yondr pouches), and they must accommodate ways for parents to contact students during the day. There are limited exemptions – for example, devices can be used if a teacher or principal explicitly permits it for a class activity, for urgent health or safety reasons, translation for English learners, or other case-by-case needs. This law impacts EdTech providers whose services rely on student smartphones or personal devices; those tools will now need to be integrated into teacher-led instruction or on school-provided devices.

United States
Version: 2025
Students
View Source

New York SHIELD Act

SHIELD
Data Privacy

The Stop Hacks and Improve Electronic Data Security (SHIELD) Act is a statewide law that strengthened New York’s general privacy and cybersecurity requirements. It expanded the definition of “private information” (now including biometric data, online account credentials, etc.) and requires any person or business handling private information of NY residents – which includes EdTech companies – to implement a “reasonable” data security program. The law spells out administrative, technical, and physical safeguards (e.g. designate a security coordinator, risk assessments, employee training, network monitoring, access controls, data disposal policies). It also broadened the state’s breach notification law: companies must notify affected New Yorkers and the NY Attorney General of any security breach involving private information, including unauthorized access (not just acquisition) of data. Non-compliance can lead to Attorney General enforcement and civil penalties. For EdTech providers, SHIELD mandates a baseline of cybersecurity practices and incident response protocols beyond the education-specific rules.

United States
Version: 2019
Students
View Source

New York “Web Accessibility” Requirements

NYWA
Accessibility

New York State law now requires that all state agency websites and any websites or software provided under state contracts conform to modern web accessibility standards. Specifically, state agencies and their contractors must ensure web content meets the Web Content Accessibility Guidelines (WCAG) (the most current version, e.g. WCAG 2.1 AA). This legal requirement (in Exec. Law 170-f and State Tech Law 103) led the NYS Office of IT Services to issue Policy NYS-P08-005, establishing minimum accessibility criteria for information and communication technology used by “state entities”. While K–12 public school districts are not state agencies per se, many interpret these rules – along with federal ADA obligations – as requiring accessible technology in education. For EdTech vendors, this means products used in New York should be usable by people with disabilities (e.g. providing alt text for images, keyboard navigation, captioning for videos, etc.). If an EdTech company contracts with any NY state agency or possibly with public schools, it must comply with these accessibility standards.

United States
Version: 2019
Students, Users with Disabilities
View Source

New York Child Data Protection Act

CDPA
Data Privacy

The Child Data Protection Act (CDPA) is one of the nation’s most stringent privacy laws for minors. It extends privacy protections to all online users under 18 (unlike COPPA which stops at age 13). The CDPA prohibits online platforms from collecting, using, or sharing personal data of individuals under 18 unless it is strictly necessary for the service or the minor (13–17) has provided affirmative consent (an “opt-in”). In practice, for teens 13–17, an EdTech provider must either limit data processing to core service needs (listed in the law, e.g. providing the requested educational service, ensuring security, debugging, etc.) or obtain clear, revocable consent from the teen for any additional data processing. For children under 13, the CDPA defers to COPPA’s requirements (verifiable parental consent). The law flatly bans the sale of minors’ personal data and the use of their data for targeted advertising or profiling without consent. It also requires companies to honor browser or device signals that indicate a user is a minor, treating them as opted-out of data processing by default. EdTech businesses must review their data practices: many educational services directed at K–12 were already restrictive, but this law also covers general websites or apps with teen users. The NY Attorney General enforces the CDPA, and has issued guidance to clarify terms and encourage good-faith compliance during initial implementation.

United States
Version: 2024
Children (under 13), Minors (13-17), +1 more
View Source
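
One way to picture the CDPA's default-opt-out treatment of minors is a purpose gate like the sketch below. The signal name, the purpose strings, and the allow-list contents are assumptions for illustration; the statute's own list of permitted processing purposes controls.

```typescript
// Hypothetical handling of a minor-indicating device signal under the CDPA
// summary above; the purpose list below is an assumption, not the statutory list.
const STRICTLY_NECESSARY = new Set([
  "provide-requested-service",
  "security",
  "debugging",
]);

interface ProcessingDecision { allowed: boolean; reason: string }

function mayProcess(purpose: string, minorSignal: boolean, teenOptIn: boolean): ProcessingDecision {
  if (!minorSignal) return { allowed: true, reason: "no minor signal; adult rules apply" };
  // Minors are treated as opted out by default: only core purposes proceed.
  if (STRICTLY_NECESSARY.has(purpose)) return { allowed: true, reason: "strictly necessary purpose" };
  // Teens 13-17 can opt in to additional processing; consent must be revocable.
  if (teenOptIn) return { allowed: true, reason: "affirmative, revocable teen consent" };
  return { allowed: false, reason: "non-essential processing blocked for minor" };
}
```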

Stop Addictive Feeds Exploitation (SAFE) for Kids Act

SAFE
Data Privacy

The SAFE for Kids Act is a first-in-the-nation law aiming to curb social media features deemed addictive for minors. It will require social media platforms to disable algorithmic content feeds and push alerts for users under 18 unless a parent or guardian provides consent. By default, under-18 users in New York will only see content in a chronological or non-personalized feed (e.g. from accounts they follow) rather than endless AI-driven recommendations. The law also prohibits sending nighttime notifications (from midnight to 6AM) to minors without parental consent. To enforce this, platforms must implement age verification mechanisms and obtain verifiable parental consent for under-18 users who wish to opt into algorithmic feeds. The NY Attorney General’s office is currently developing regulations on acceptable age assurance methods and other implementation details. While the primary targets are big social media companies (like TikTok, Instagram, YouTube), any EdTech or online service with user-generated content feeds or algorithmic recommendations might fall under the broad definition of “social media” if minors spend significant time on a feed. The law provides for penalties up to $5,000 per violation once in effect. If an EdTech platform includes scrollable content feeds, recommendation algorithms, or notification systems that could engage students similarly to social media, it may need to either turn off those features for minors by default or build in parent consent workflows. Additionally, robust age verification will be required, but with privacy safeguards (e.g. offering options beyond government IDs, and not retaining personal age data).

United States
Version: 2024 (pending)
Minors (13-17), Students
View Source
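
The feed and notification defaults described above reduce to a pair of checks. This sketch assumes simplified user fields and the midnight-to-6AM window from the summary; final NY AG regulations will determine the precise mechanics.

```typescript
// Illustrative SAFE-for-Kids-style defaults; field names are assumptions.
interface FeedUser { isUnder18: boolean; parentalConsentForAlgoFeed: boolean }

function feedMode(user: FeedUser): "algorithmic" | "chronological" {
  // Under-18 users default to a chronological feed absent parental consent.
  return user.isUnder18 && !user.parentalConsentForAlgoFeed ? "chronological" : "algorithmic";
}

function mayPushNotify(user: FeedUser, hourLocal: number, parentalConsentForNight: boolean): boolean {
  const isNight = hourLocal >= 0 && hourLocal < 6; // midnight-to-6AM window from the summary above
  if (user.isUnder18 && isNight && !parentalConsentForNight) return false;
  return true;
}
```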

New York Biometric Privacy Act

NYB
Data Privacy

This proposal would impose strict rules on private entities that collect or use biometric identifiers (such as fingerprints, iris scans, voiceprints, facial geometry). Key provisions in the bills include: requiring companies to develop a public written policy on their use, retention, and destruction of biometric data; deleting biometric data within a set time (e.g. when the initial purpose is fulfilled or within 3 years of the individual’s last interaction, whichever comes first); obtaining informed written consent from individuals (or their parents, if minors) before collecting or sharing biometric identifiers; and prohibiting businesses from selling or profiting from an individual’s biometrics. Notably, the act would grant individuals a private right of action to sue for violations, with statutory damages (as BIPA does), which could significantly increase liability risks for companies. For EdTech providers, if this act passes, any feature involving biometrics – e.g. facial recognition for proctoring, eye-tracking for engagement analytics, voice recognition, or fingerprint-based login – would require careful compliance or possibly re-engineering to avoid using biometrics. The NYC area already has a local biometric ordinance (for retail establishments), but this state law would be broader.

United States
Version: Proposed 2025
Minors (13-17), Students
View Source

Student Online Personal Information Protection Act (SOPIPA)

SOPIPA
Education

A landmark California law (Business & Professions Code §22584) that directly regulates K–12 online educational services. It prohibits operators of K–12 educational websites/apps from selling student personal data, using it for targeted advertising, or creating a profile on a student for non-educational purposes. It also requires such operators to implement reasonable security measures and to delete student information on school request.

California
Version: 2014
Students
View Source

Early Learning Personal Information Protection Act

ELPIPA
Data Privacy

This law (Assembly Bill 2799, “ELPIPA”) extends SOPIPA’s privacy protections to younger children in preschool and pre-K programs. It applies similar rules to operators of online services used by preschool and pre-K students, ensuring that even before kindergarten, children’s data is protected. In essence, it governs early childhood student data by prohibiting sale, profiling, or misuse of that data and by requiring security measures, just as SOPIPA does for K–12.

California
Version: 2016
Children (under 13), Students
View Source

California Education Code §49073.1 – Contracts with 3rd-Party Providers

CEC
Data Privacy

Added by AB 1584, this law requires that any contracts between K–12 schools (local educational agencies) and third-party digital service providers include specific student data privacy safeguards. Schools must ensure contracts declare the school owns and controls the student records, and vendors must agree to use the data only for the purposes in the contract. The law spells out that contracts should address how students can access or export their work, require vendors to keep data secure and confidential, notify the school/district in case of any data breach, and delete all student records once the contract ends. In short, it sets mandatory privacy clauses for school–vendor agreements.

California
Version: 2014
Students
View Source

California Education Code §49073.6 – Social Media Monitoring

CEC
Data Privacy

AB 1442 addresses school districts’ use of third-party services to monitor students’ social media for safety or bullying concerns. It requires that before starting any social media monitoring program, a school district must notify students and parents and hold a public hearing. The law limits collection to information related to school or student safety, and any social media data collected must be destroyed when no longer needed. Importantly for vendors, if a district hires a company for such monitoring, that company is forbidden from selling or sharing the student information with anyone except the district (or the student or parent). In effect, this law ensures transparency and privacy in any school-sanctioned monitoring of students online.

California
Version: 2014
Students
View Source

Attorney General’s “Ready for School” EdTech Privacy Recommendations

RFS
Data Privacy

While not legally binding, the CA Attorney General’s 36-page guidance “Ready for School” outlines best practices for EdTech companies to protect student privacy. The guidance, developed in consultation with industry and experts, organizes recommendations into six principles: Minimize data collection (only gather what is needed and retain it only as long as necessary); Use data only for educational purposes; Ensure any data shared with third parties carries privacy protections (“make protections stick” beyond the first recipient); Individual control (respect students’ and parents’ rights to access and correct data); Data security safeguards (implement robust security appropriate to student data sensitivity); and Transparency (provide clear privacy policies in understandable language). EdTech businesses are encouraged to follow these as a code of conduct, complementing California’s laws.

California
Version: 2016
Students
View Source

Learning With AI, Learning About AI — California Department of Education (CDE) Guidance

LWA
Data Privacy

CDE’s central K–12 AI page frames AI as a tool to enhance learning while emphasizing human-led instruction, safety, privacy, equity, and accessibility. It states the page is informational rather than mandatory (per Ed. Code §33308.5), and curates resources: an educator-facing webinar series (e.g., safe use, bias and social impacts, accessibility & personalization, multilingual learners, academic integrity), foundational AI literacy material for teachers/students, and practical prompts to help LEAs craft ethical use guidelines and school-level policies. It encourages data-minimizing adoption, transparency about terms of use and data collection, and building capacity (teacher training, student voice). The page also points districts to the state AI in Education Workgroup developing statewide guidance/model policy.

California
Version: 2025
Students, Users with Disabilities
View Source

California Consumer Privacy Act & Privacy Rights Act

CCPA/CPRA
Data Privacy

The CCPA (Civil Code §1798.100 et seq.), as amended by the CPRA, is a comprehensive consumer data privacy law that applies to many businesses in California, including EdTech companies if they meet the applicability thresholds. It grants California residents new rights over personal information: the right to know what data is collected and how it’s used/shared, the right to delete personal data (with some exceptions), the right to opt out of the sale or sharing of personal data, and the right to non-discrimination for exercising privacy rights. For businesses, this means obligations to provide disclosures and honor consumer requests. Notably, CCPA/CPRA has special provisions for minors’ data: selling or sharing personal information of consumers under 16 years old requires opt-in consent (for under 13, consent must come from a parent/guardian; ages 13–15 can consent themselves). EdTech firms handling personal data of California students/parents may need to comply if they fall under CCPA (or if they share data with other commercial services).

California
Version: 2018
Children (under 13), Minors (13-17), +1 more
View Source
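
The tiered minor-consent rule above maps naturally to a small predicate. The sketch below is illustrative only; field names are assumptions and the age bands follow the summary above (parental opt-in under 13, self opt-in at 13–15, opt-out at 16+).

```typescript
// Sketch of the CCPA/CPRA minor opt-in rule summarized above; types are illustrative.
type SaleConsent = "none" | "self" | "parental";

/** May personal information of this consumer be sold or shared? */
function maySellOrShare(age: number, consent: SaleConsent, adultOptedOut: boolean): boolean {
  if (age < 13) return consent === "parental"; // parent/guardian opt-in required
  if (age < 16) return consent !== "none";     // 13-15 may opt in themselves
  return !adultOptedOut;                       // 16+: opt-out model
}
```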

California Age-Appropriate Design Code Act

AADC
Data Privacy

The Age-Appropriate Design Code (AADC) is a sweeping new law aimed at enhancing online privacy and safety for minors under 18. It requires any business offering an online service, site, or app likely to be accessed by children to follow stringent privacy-by-design principles. Key requirements include: setting high privacy defaults for minors, minimizing data collection/use to only what is necessary for the service, and no profiling or precise geolocation of a child by default unless a compelling reason is demonstrated. The Act also bans using “dark patterns” or nudges that encourage children to provide personal data in a way that is detrimental to their well-being. Additionally, covered companies must conduct Data Protection Impact Assessments (DPIAs) for any new online service likely to be accessed by kids, to evaluate and mitigate risks to children’s privacy or health, and must provide privacy information and terms in clear, age-appropriate language. Enforcement is by the California Attorney General, with civil penalties for violations. This law will significantly affect EdTech products used by children by pushing them toward child-centric privacy and safety design.

California
Version: 2022
Children (under 13), Minors (13-17), +1 more
View Source

Digital Accessibility Requirements (Government Code §§11135 & 7405)

DAR
Accessibility

Gov. Code §11135 was originally enacted in 1977 (and amended to address technology in the early 2000s); Gov. Code §7405 was added in 2003; AB 434 (2017) further reinforced requirements for state websites. California law mandates that technology used by government agencies (including public education institutions) must be accessible to people with disabilities. Government Code §11135 prohibits any program or activity that receives state funding from discriminating on the basis of disability, and Government Code §7405 explicitly requires state entities to comply with Section 508 standards for electronic and information technology. In practice, if an EdTech product is used by a public school or provided under a state contract, it should meet accessibility standards (e.g. WCAG 2.0 AA or Section 508) so that students with disabilities can use it. AB 434 further required state agencies to certify their websites for accessibility compliance. For EdTech businesses, this means accessibility isn’t just good practice but often a procurement requirement in California schools.

California
Version: 1977-2017
Students, Users with Disabilities
View Source

Data Security and Breach Notification Laws

DSB
Data Privacy

California’s data breach notification law (SB 1386) took effect July 2003; the data security mandate (AB 1950) took effect January 2005. These have been amended multiple times (e.g. AB 825 in 2021 expanded the definition of personal information). California was the first state to enact a Data Breach Notification statute. Civil Code §1798.82 requires any business (or government agency) that experiences a breach of unencrypted personal information of a California resident to provide prompt notice to the affected individuals (and to the Attorney General if a large number of residents are affected). Complementing this, Civil Code §1798.81.5 requires businesses to implement and maintain reasonable security procedures and practices appropriate to the nature of the personal information they handle. For EdTech companies, this means if you collect personal data on students or teachers in California, you must ensure you have adequate cybersecurity measures in place, and if a data breach occurs (e.g. an unauthorized person accesses student data), you are legally required to notify the affected schools or individuals under California law. Non-compliance can lead to regulatory enforcement and legal liability, including a private right of action for certain breaches.

California
Version: 2002
Students
View Source

Leading Ethical AI Development KIDS Act

AB 1064
Child Protection

Titled the “Leading Ethical AI Development for Kids Act,” AB 1064 is a response to concerns over AI-based conversational agents targeting children. If enacted, this law would regulate “companion” chatbots and generative AI systems designed for children. It defines a “companion chatbot” as an AI system that simulates a human-like relationship with a user (remembering past interactions, engaging emotionally, etc.). The bill bars companies from offering such AI chatbots to minors unless the AI is not foreseeably capable of causing harm – for example, the bot must not be able to encourage self-harm or violence, not provide unsupervised therapy or inappropriate advice, and not facilitate criminal or dangerous activities. In essence, the AI must be built with safeguards so it cannot knowingly drive a child toward harmful outcomes. AB 1064 would empower the state Attorney General to enforce these rules, with civil penalties up to $25,000 per violation.

California
Version: Pending-2025
Children (under 13), Minors (13-17), +1 more
View Source

Texas Education Code, Chapter 32, Subchapter D — Student Information (HB 2087)

TEC
Data Privacy

This law regulates online operators and service providers that collect student information through educational platforms, websites, or applications used primarily for a “school purpose.” It prohibits companies from selling or using student data for targeted advertising or for creating behavioral profiles unrelated to educational functions. The statute defines “covered information” to include any data that identifies a student or is linked to a student’s educational record. Operators are required to implement and maintain reasonable administrative, technical, and physical security procedures to protect such data from unauthorized access, disclosure, or destruction. They must also delete student information upon request from a school district or charter school, and they can disclose information only for specific, authorized educational purposes. The law mirrors federal FERPA principles and serves as the cornerstone of student data privacy compliance in Texas for all K–12 EdTech providers.

Texas
Version: 2017
Students
View Source

TEC §11.175 — District Cybersecurity (SB 820)

SB 820
Education

Senate Bill 820 mandates that every Texas school district adopt a comprehensive cybersecurity policy to protect student and employee information systems from unauthorized access or data breaches. Each district must designate a “Cybersecurity Coordinator” who serves as the primary liaison with the Texas Education Agency (TEA) and the Department of Information Resources (DIR) in the event of a cybersecurity incident. Districts are required to identify risks, implement mitigation measures, and establish an incident response plan. Cybersecurity coordinators must report any breach that compromises district systems or student data as soon as practicable. The statute also encourages districts to provide annual cybersecurity training to employees. For EdTech vendors, this law implies contractual obligations to notify districts of breaches and to cooperate with district cybersecurity coordinators during incident response and remediation.

Texas
Version: 2019
Students
View Source

HB 1481 (89R) — Student Use of Personal Communication Devices

HB 1481
Education

Under Section 38.023, the Texas Education Agency is required to compile and maintain a list of Internet safety resources for school districts. These materials guide educators, students, and parents in understanding issues such as personal data protection, responsible technology use, intellectual property, plagiarism, and online ethics. The TEA’s Internet Safety Resource List provides districts with vetted programs, curricula, and instructional tools to incorporate digital citizenship and online safety into classroom instruction. The goal of this guidance is to promote awareness of data privacy, cybersecurity, and responsible digital behavior among students. Although not prescriptive, TEA’s Internet Safety guidance serves as an authoritative model for district Acceptable Use Policies (AUPs) and student digital literacy programs.

Texas
Version: 2025
Students
View Source

TEA K–12 Cybersecurity Initiative (Official Program Guidance)

TKC
Cybersecurity

The K–12 Cybersecurity Initiative, jointly managed by the TEA and the Department of Information Resources, provides statewide assessments, technical assistance, and cybersecurity improvement programs for public schools. It helps districts evaluate network defenses, implement managed security services, and align practices with state and federal standards. Participating districts may undergo vulnerability assessments, endpoint monitoring, and threat response coordination under state contracts. While participation is voluntary, the initiative sets the operational benchmark for how EdTech vendors must secure integrations with district networks. It also defines expectations for multi-factor authentication (MFA), system patching, and data segregation within cloud environments.

Texas
Version: 2021
Students
View Source

Texas Education Code §26.009 – Parental Consent for Recordings of Students

TEC
Education

This statute establishes that a school employee, contractor, or agent may not record the voice or image of a student without written parental consent, except under specific statutory exemptions. These exceptions include situations such as safety and disciplinary monitoring, regular classroom instruction, extracurricular activities, or media coverage where parental consent is implied. For EdTech companies that provide classroom cameras, video conferencing, or remote proctoring tools, compliance with this section is critical. Vendors must design their platforms to support parental consent workflows and ensure that video or audio recordings collected for legitimate educational purposes are securely stored and accessible only to authorized individuals.

Texas
Version: 1995
Students
View Source

Texas Education Code §44.031 – School District Purchasing and Procurement

TEC
Data Privacy

This section of the Education Code governs how school districts procure goods and services, including educational technology. Districts must use competitive purchasing methods that ensure “best value” for taxpayers. Approved methods include competitive sealed bids, requests for proposals, and interlocal cooperative contracts. The law requires that districts consider not only price but also reputation, quality, and compliance with student privacy and cybersecurity standards when selecting vendors. The TEA Financial Accountability System Resource Guide (FASRG) Module 5 elaborates on procurement practices for technology contracts, emphasizing data protection, accessibility, and interoperability.

Texas
Version: 1995
Students, Users with Disabilities
View Source

Texas Data Privacy and Security Act

TDPSA
Data Privacy

The Texas Data Privacy and Security Act (TDPSA) is the state’s first comprehensive data privacy law, applying to most entities conducting business in Texas and processing personal data of state residents. It grants consumers the right to access, correct, delete, and obtain copies of their personal data. Businesses must provide a clear and accessible privacy notice and implement reasonable security measures appropriate to the sensitivity of the information. The law prohibits the sale of personal data without consent and requires affirmative opt-in consent for processing sensitive data, including information about children. For EdTech companies, the TDPSA reinforces obligations to maintain transparent privacy practices, to enter into written data processing agreements with school districts, and to protect all student and educator information collected through educational platforms.

Texas
Version: 2024
Children (under 13), Students
View Source

Capture or Use of Biometric Identifier Act

CUBI
Biometric Privacy

The Capture or Use of Biometric Identifier Act requires any private entity that collects biometric identifiers—such as fingerprints, facial recognition data, voiceprints, or iris scans—to obtain informed consent before collection. It prohibits companies from selling or disclosing biometric identifiers unless specifically permitted by law or with consent. Entities must use reasonable care to store and protect biometric data and are required to destroy it within a reasonable timeframe once the purpose of collection has been fulfilled. The law directly affects EdTech products using biometrics for attendance verification, proctoring, or identity management. Noncompliance may result in significant civil penalties enforced by the Texas Attorney General.

Texas
Version: 2009
Students
View Source
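
Retention under this law can be modeled as deadline bookkeeping, as in the hedged sketch below; the one-year outer bound and field names are assumptions to confirm against Tex. Bus. & Com. Code ch. 503 for a specific use case.

```typescript
// Hypothetical retention bookkeeping for biometric identifiers per the summary above.
interface BiometricRecord {
  consentObtained: boolean;  // informed consent before capture
  purposeFulfilledAt?: Date; // when the collection purpose ended
}

function mustDestroyBy(rec: BiometricRecord): Date | null {
  if (!rec.consentObtained) throw new Error("capture without informed consent is prohibited");
  if (!rec.purposeFulfilledAt) return null; // purpose still active
  const deadline = new Date(rec.purposeFulfilledAt);
  deadline.setFullYear(deadline.getFullYear() + 1); // assumed outer bound after purpose ends
  return deadline;
}
```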

Texas Data Breach Notification Law

TDB
Data Privacy

Chapter 521 of the Texas Business and Commerce Code governs breach notification and data security requirements. Businesses that experience a breach of personal information affecting Texas residents must notify affected individuals “as quickly as possible” but not later than 60 days following discovery. If a breach affects 250 or more Texas residents, the company must also notify the Texas Attorney General within 30 days. The notice must include a detailed description of the incident, the type of data affected, and the remedial actions taken. The law requires companies to maintain records of breaches for at least five years. For EdTech firms, this law reinforces the duty to promptly report data incidents involving student or educator personal information, whether caused by unauthorized access, accidental disclosure, or cyberattack.

Texas
Version: 2003
Students
View Source
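
The 60-day and 30-day windows above are easy to get wrong under incident pressure, so teams sometimes encode them. The sketch below is planning arithmetic only, with assumed field names, and is not legal advice.

```typescript
// Deadline arithmetic for the Texas notification windows described above.
interface BreachFacts { discoveredAt: Date; texasResidentsAffected: number }

function addDays(d: Date, days: number): Date {
  const out = new Date(d);
  out.setDate(out.getDate() + days);
  return out;
}

function notificationDeadlines(facts: BreachFacts) {
  return {
    // Individuals: "as quickly as possible," outer bound 60 days after discovery.
    individualsBy: addDays(facts.discoveredAt, 60),
    // Attorney General: within 30 days when 250+ Texas residents are affected.
    attorneyGeneralBy: facts.texasResidentsAffected >= 250 ? addDays(facts.discoveredAt, 30) : null,
  };
}
```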

Accessibility of Electronic and Information Resources

AEI
Accessibility

This rule ensures that all electronic and information resources (EIR) developed, procured, maintained, or used by Texas state agencies and higher education institutions are accessible to individuals with disabilities. It aligns with Section 508 of the federal Rehabilitation Act and the Web Content Accessibility Guidelines (WCAG) 2.1 standards. Vendors providing technology to Texas schools or higher education institutions must submit a Voluntary Product Accessibility Template (VPAT) to demonstrate compliance. The rule also requires agencies to appoint an EIR Accessibility Coordinator and establish policies for remediation, testing, and procurement accessibility exceptions. EdTech companies selling into the Texas public sector are expected to meet the same accessibility benchmarks.

Texas
Version: 2004
Students, Users with Disabilities
View Source

Securing Children Online through Parental Empowerment (SCOPE) Act

SCOPE
Child Protection

The SCOPE Act establishes comprehensive obligations for digital platforms that are accessible to minors in Texas. The law’s goal is to limit harmful data practices, reduce addictive design patterns, and empower parents or guardians with control tools. Specifically, Chapter 509 requires online service providers, websites, and applications to implement age estimation or verification mechanisms to identify whether a user is under 18 years old.

Texas
Version: 2023
Children (under 13), Minors (13-17), +1 more
View Source

Information Security Standards

ISS
Cybersecurity

The Information Security Standards under 1 TAC Chapter 202 establish mandatory cybersecurity and risk-management requirements for all Texas state agencies and institutions of higher education. These rules are issued by the Texas Department of Information Resources (DIR) to protect sensitive and confidential information held by public entities and their vendors. Chapter 202 requires agencies and higher-education institutions to: Implement an information-security program aligned with DIR’s Security Control Standards Catalog, covering governance, risk management, access control, system protection, and incident response; Designate an Information Security Officer (ISO) responsible for implementing and enforcing these standards; Maintain an incident-response plan for reporting cybersecurity events to DIR; Conduct annual risk assessments and review third-party vendor security; and Ensure that all contracted technology services—especially cloud-hosted solutions—comply with the Texas Risk and Authorization Management Program (TxRAMP), which verifies that vendors meet state cybersecurity requirements before agencies can use their services.

Texas
Version: 2003
Students
View Source

Cyberbullying Policies (David’s Law)

David’s Law
Education

Commonly known as David’s Law, Texas Education Code §37.0832 was enacted to combat bullying and cyberbullying in public schools. The law was named after David Molak, a San Antonio student who died by suicide following online harassment. It requires every school district to develop and enforce written policies, procedures, and strategies to prevent, identify, and address bullying—including online or electronic forms. Under David’s Law, school districts must: Define “cyberbullying” as bullying that occurs through electronic communication or digital devices; Implement prevention and intervention programs that address both on-campus and off-campus cyberbullying when it disrupts the educational environment; Establish mechanisms for reporting and documenting incidents of bullying and cyberbullying; Notify parents or guardians of both the victim and the alleged perpetrator promptly after an incident is reported; and Outline procedures for disciplinary actions, counseling referrals, or student transfers when warranted.

Texas
Version: 2017
Students
View Source

Personal Information Protection and Electronic Documents Act

PIPEDA
Data Privacy

PIPEDA is Canada’s commercial-activity privacy law. It embeds ten fair-information principles (accountability, identifying purposes, consent, limiting collection/use/retention/disclosure, accuracy, safeguards, openness, individual access, challenging compliance). It requires organizations to obtain meaningful consent, limit data to appropriate purposes, protect it with adequate safeguards proportionate to sensitivity, and provide access/correction. Even in provinces with “substantially similar” private-sector laws, PIPEDA still applies to interprovincial or international transfers. For EdTech, the law touches account creation, learning analytics, proctoring/monitoring tools, classroom collaboration features, parent/guardian portals, and cross-border cloud processing. Contracts with vendors/processors must flow down equivalent protections; privacy management programs should be risk-based, documented, and demonstrable to the Office of the Privacy Commissioner (OPC).

Canada
Version: 2001
All users
View Source

Breach of Security Safeguards Regulations

BSS
Data Privacy

These regulations operationalize PIPEDA’s breach regime. Organizations must assess incidents for a “real risk of significant harm” (RROSH), notify the OPC and affected individuals as soon as feasible, and keep a breach record for 24 months. Notices must include prescribed content (circumstances, timing, personal information involved, steps taken, mitigation options, contact info). For EdTech platforms—often holding children’s and educators’ information—the regime effectively requires an incident response playbook, predefined severity criteria, executive decision-rights, and vendor coordination (since many incidents arise at processors). Schools and districts will expect evidence of drills and logs.

Canada
Version: 2018
Children (under 13)
View Source
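
A breach-response playbook can encode the RROSH triage described above. In this sketch the harm factors are coarse assumptions; real assessments weigh the sensitivity of the information and the probability of misuse case by case.

```typescript
// Simplified RROSH triage per the regime above; factor names are assumptions.
interface Incident {
  dataSensitivity: "low" | "moderate" | "high"; // e.g., children's records are typically high
  probabilityOfMisuse: "low" | "moderate" | "high";
}

/** Does the incident present a "real risk of significant harm"? */
function rrosh(incident: Incident): boolean {
  return incident.dataSensitivity !== "low" || incident.probabilityOfMisuse !== "low";
}

function handleIncident(incident: Incident): string[] {
  const steps = ["record the breach and retain the record for 24 months"];
  if (rrosh(incident)) {
    steps.push("notify the OPC as soon as feasible", "notify affected individuals with the prescribed content");
  }
  return steps;
}
```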

Canada’s Anti-Spam Legislation

CASL
Data Privacy

CASL regulates sending commercial electronic messages (CEMs) and installing software on another person’s device. CEMs require valid consent (express or permitted implied), sender identification, and a functional unsubscribe processed within timelines. Separate rules govern software installation (e.g., if an EdTech client silently installs or updates a desktop agent on school machines). Penalties can be substantial, and record-keeping is essential (consent logs, unsubscribe processing). For EdTech, CASL impacts educator/parent marketing, product update notices, and any device-side agents used for remote testing or classroom management.

Canada
Version: 2014
All users
View Source
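
A pre-send CEM check following the summary above might look like this sketch; the field names are assumptions, and CASL's consent categories (express vs. permitted implied) carry conditions this simplification omits.

```typescript
// A sketch of a CASL pre-send check based on the summary above; illustrative only.
interface Recipient { consent: "express" | "implied" | "none"; unsubscribedAt?: Date }

interface Message { identifiesSender: boolean; hasUnsubscribeLink: boolean }

function maySendCEM(recipient: Recipient, msg: Message): boolean {
  if (recipient.consent === "none") return false;       // valid consent required
  if (recipient.unsubscribedAt) return false;           // honor processed unsubscribes
  return msg.identifiesSender && msg.hasUnsubscribeLink; // required message content
}
```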

Privacy Act

PA
Data Privacy

The Privacy Act governs how federal institutions collect, use, disclose, retain, and provide access to personal information. While it targets government bodies, private vendors processing data for such institutions must meet equivalent standards by contract (purpose limitation, retention/disposal rules, access rights support). EdTech selling to federal agencies (e.g., training portals for federal employees) should expect Privacy Impact Assessments (PIAs) on the client side and contractual flow-down of duties.

Canada
Version: 1985
All users
View Source

Accessible Canada Act (ACA)

ACA
Accessibility

The ACA aims for a barrier-free Canada by 2040 across areas under federal jurisdiction (employment, built environment, information/communication technologies, communication, procurement, program/service delivery, transportation). It requires regulated entities to publish accessibility plans, progress reports, and feedback processes, and empowers oversight bodies (e.g., Accessibility Commissioner). While most K–12 entities are provincial, ACA obligations apply to federally regulated organizations and often shape federal procurement, which EdTech vendors must meet when selling to the Government of Canada or federal Crown corporations.

Canada
Version: 2019
Users with Disabilities
View Source

Accessible Canada Regulations

ACR
Accessibility

These regulations under the ACA specify how accessibility plans, progress reports, and feedback process descriptions must be prepared, published, and provided. Notably, online publication must meet WCAG Level AA; alternate formats must be available on request within defined timelines. For EdTech vendors in scope (or those aligning to federal procurement expectations), this effectively sets a floor for accessibility plan content, web presentation, and ongoing reporting cadence.

Canada
Version: 2021
Users with Disabilities
View Source

Canadian Radio-television and Telecommunications Commission (CRTC) Accessibility Reporting Regulations

CRA
Accessibility

Issued by the CRTC under the ACA, these regulations impose accessibility-planning, reporting, and feedback obligations on broadcasting and telecommunications entities (with schedules and content requirements). While not typical for a pure SaaS EdTech, some platforms intersect with telecom/broadcasting (e.g., live video delivery via owned carriage or broadcasting undertakings). Vendors serving regulated carriers must understand the carriers’ compliance needs.

Canada
Version: 2021
Users with Disabilities
View Source

Directive on Automated Decision-Making

DADM
AI/ML Governance

A mandatory policy instrument for federal departments when automated systems (including ML/AI) make or support administrative decisions. Requires an Algorithmic Impact Assessment (AIA) before production, impact-level-based safeguards (e.g., human-in-the-loop, explanation, testing), transparency, recourse, and at higher levels, peer review and public reporting. Private EdTech is not directly bound, but vendors selling AI into the federal family must comply contractually; in practice, DADM is a respected benchmark for responsible AI in Canada (useful when building proctoring, grading assistance, admissions/eligibility triage, etc.).

Canada
Version: 2019
All users
View Source
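
The Directive's escalating safeguards can be pictured as a mapping from AIA impact level to required measures. The table in this sketch is an approximation of the Directive's appendices, not a reproduction; consult the instrument itself for the binding requirements per level.

```typescript
// Illustrative mapping from an Algorithmic Impact Assessment level to
// escalating safeguards, loosely following the description above.
type ImpactLevel = 1 | 2 | 3 | 4;

function requiredSafeguards(level: ImpactLevel): string[] {
  const base = ["complete AIA before production", "provide notice and explanation"];
  if (level >= 2) base.push("testing and monitoring for unintended outcomes");
  if (level >= 3) base.push("human-in-the-loop for decisions", "peer review");
  if (level === 4) base.push("public reporting and senior approval");
  return base;
}
```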

Digital/Web Accessibility (Standard & guidance)

DWA
Accessibility

The federal Digital Accessibility Toolkit and related guidance recommend aligning to WCAG and the European EN 301 549 ICT accessibility standard. The Standard on Web Accessibility historically required WCAG conformance for federal web content; current TBS guidance recommends adopting EN 301 549 and following the Guideline on Making IT Usable by All. For EdTech, these set procurement-grade targets for UX, multimedia (captions, transcripts, audio descriptions), keyboard interaction, and assistive-tech compatibility.

Canada
Version: 2025
Users with Disabilities
View Source

Baseline Cyber Security Controls for Small and Medium Organizations

BCS
Cybersecurity

The CCCS baseline distills a pragmatic set of 13 control families (e.g., MFA, asset inventories, patching, backups, secure configuration, logging/monitoring, incident response, vendor management) intended to achieve “80/20” risk reduction for SMEs. Many school boards and SMB EdTech vendors adopt this as a government-endorsed minimum. Mapping to NIST/ISO is straightforward, and CCCS publishes how-to briefs for implementation.

Canada
Version: 2022
All users
View Source

IT Security Risk Management: A Lifecycle Approach (CSE)

ITSG-33
Cybersecurity

ITSG-33 provides Canada’s reference model for risk-based security in government systems, including control profiles and integration with the SDLC. Although written for departments, vendors selling into public bodies often align their controls and risk processes to ITSG-33 to ease authority to operate and procurement reviews. For EdTech platforms hosting sensitive learner data for federal programs, ITSG-33 alignment strengthens assurance.

Canada
Version: 2021
All users
View Source

Guidelines for obtaining meaningful consent

GOM
Data Privacy

The OPC's interpretive guidance details how to make consent meaningful: emphasize key elements (what, who, why, risks), provide layered notices, avoid dark patterns and bundling, and present age-appropriate explanations. It stresses context-based expectations for youth, including capacity and parental involvement, and complements separate OPC work on age assurance and a potential children's privacy code. For EdTech, these guidelines are the gold standard for onboarding, in-product prompts, and parental dashboards.

Canada
Version: 2025
Children (under 13)
View Source
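
A minimal sketch of a layered notice structure built around the guidance's key elements; the field names are assumptions, not terms from the guidelines themselves.

interface ConsentLayer {
  summary: string;   // short, plain-language top layer
  detailUrl: string; // second layer: the full explanation or policy section
}

interface ConsentNotice {
  what: ConsentLayer;   // what information is collected
  who: ConsentLayer;    // who it is shared with
  why: ConsentLayer;    // purposes of collection and use
  risks: ConsentLayer;  // meaningful risks of harm
  readingLevel: "child" | "teen" | "adult"; // age-appropriate presentation
  bundled: false; // each distinct purpose gets its own consent, never bundled
}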

An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service

ARM
Child Protection

This federal statute imposes mandatory reporting duties on persons who provide an Internet service to the public when they (a) are advised of an Internet address where child pornography may be available to the public or (b) have reasonable grounds to believe their service is being used to commit a child-pornography offence. It sets out reporting to police, record-keeping, and offences/penalties. Although the statute targets ISPs, modern platforms that host or transmit user content should assess whether they fall within scope or have analogous reporting commitments in contracts, especially platforms with uploads, messaging, or community features used by minors (common in EdTech).

Canada
Version: 2011
Children (under 13), Minors (13-17)
View Source

Children’s Privacy Code

CPC
Data Privacy

The code’s stated aim is to clarify private-sector obligations for services accessed by children, align with international best practice (e.g., age-appropriate design), and set out OPC expectations around capacity/consent, profiling, targeted advertising, default settings, notices, age-assurance, data minimization, and privacy-by-design. The consultation also builds on OPC’s earlier work on age assurance (proportionality, data minimization, avoiding sensitive identifiers, and considering UX impacts). For EdTech, the emergent direction signals that products “likely to be accessed by children” should implement child-centred defaults, comprehensible notices, limited profiling/behavioral targeting, parent/guardian engagement where appropriate, and demonstrably proportionate age-assurance. While not yet binding law, a finalized code would shape enforcement expectations under PIPEDA and become the de facto national baseline for child-focused services—highly consequential for K–12-facing EdTech, classroom apps, learning analytics, proctoring tools, and AI-enabled tutoring features.

Canada
Version: Exploratory consultation (2025)
Children (under 13), Students
View Source
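
A minimal sketch of child-centred defaults for a product likely to be accessed by children, reflecting the consultation's themes; the setting names are assumptions, not requirements quoted from the draft code.

// Defaults a child-facing product might ship with; each can only be loosened
// through an explicit, age-appropriate (or guardian-mediated) choice.
const childDefaults = {
  behaviouralProfiling: false,
  targetedAdvertising: false,
  preciseGeolocation: false,
  dataMinimization: true,
  plainLanguageNotices: true,
  guardianDashboardEnabled: true,
} as const;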

Act respecting the protection of personal information in the private sector

ARP
Data Privacy

Quebec’s private-sector privacy law—overhauled by “Law 25”—imposes privacy-by-default, mandatory privacy incident (breach) assessment and notification, a designated privacy officer, enhanced consent/notice rules (including a heightened standard for profiling/automated decision-making), DPIAs before communicating personal information outside Quebec, and special consent rules for minors under 14 (consent from the person having parental authority). EdTech vendors operating “in the course of an enterprise” in Quebec must structure data flows, SDKs/AI services, and cross-border processing around those rules; non-compliance can trigger significant administrative monetary penalties.

Canada
Version: 2024
Minors (13-17), Students
View Source
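
A minimal sketch of two Law 25 gates a vendor might wire into onboarding and data-flow review; the under-14 threshold reflects the statute, while the function names and the region check are assumptions for illustration.

// Minors under 14 need consent from the person having parental authority.
function requiredConsentSource(age: number): "self" | "parental-authority" {
  return age < 14 ? "parental-authority" : "self";
}

// A privacy impact assessment is needed before communicating personal
// information outside Quebec; this flag can gate new processor integrations.
function crossBorderPiaRequired(processorRegion: string): boolean {
  return processorRegion !== "QC";
}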

Act respecting Access to documents held by public bodies and the Protection of personal information

ARA
Data Privacy

Quebec’s public-sector privacy statute governs school service centres, ministries (e.g., MEQ), and other public bodies that procure or operate EdTech. It sets confidentiality as the default, prescribes lawful collection/minimization, purpose specification, retention limits, and rights of access/correction. Amendments aligned to “Law 25” add duties like breach assessment/notification (via regulation), privacy governance (e.g., committees/policies for many bodies), and accommodations for persons with disabilities when exercising access rights. For vendors, this means contracts, data flows (including hosting outside Quebec on a public body’s behalf), logging, and role-based access must meet public-sector rules; the Commission d’accès à l’information (CAI) can order remedial measures and its orders are enforceable.

Canada
Version: 1982
Users with Disabilities
View Source

Regulation on confidentiality incidents

RCI
Data Privacy

This regulation operationalizes breach management for both public bodies (A-2.1) and enterprises (P-39.1). It defines what a privacy-incident notice to the CAI and to affected persons must contain, requires a confidentiality-incident register, and sets reporting triggers tied to “risk of serious injury.” Public bodies (e.g., school service centres) and their vendors must align incident triage, record-keeping, and notification templates to this rule; vendor contracts should specify who assesses risk, who notifies, and how timeframes are met.

Canada
Version: 2022
All users
View Source
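
A minimal sketch of a confidentiality-incident register entry; the field names paraphrase the regulation's required content and are assumptions, not its exact wording.

interface IncidentRegisterEntry {
  description: string;                  // nature of the incident and of the information involved
  occurredAt?: Date;                    // when known
  detectedAt: Date;
  personsAffected: number | "unknown";
  riskOfSeriousInjury: boolean;         // the trigger for notifying the CAI and affected persons
  caiNotifiedAt?: Date;
  personsNotifiedAt?: Date;
  mitigations: string[];                // measures taken to reduce the risk of injury
}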

Regulation respecting the anonymization of personal information

RRA
Data Privacy

This regulation establishes criteria and methods for anonymizing personal information for both public bodies and enterprises. Anonymization must result from a documented process that considers context, residual re-identification risk, and periodic re-evaluation. For EdTech analytics, research dashboards, or dataset sharing, it defines when data stop being “personal,” how to document tests, and when re-identification risk must be reviewed again.

Canada
Version: 2024
All users
View Source
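
As one concrete way to document residual re-identification risk, a k-anonymity spot check is sketched below; the regulation does not mandate this particular metric, so treat it as an illustrative assumption.

// Returns the size of the smallest group of rows sharing the same
// quasi-identifier values; a small k signals higher re-identification risk.
function kAnonymity(rows: Array<Record<string, string>>, quasiIds: string[]): number {
  if (rows.length === 0) return 0;
  const counts = new Map<string, number>();
  for (const row of rows) {
    const key = quasiIds.map(q => row[q] ?? "").join("|");
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return Math.min(...Array.from(counts.values()));
}

// Example: check a learning-analytics extract before sharing.
// kAnonymity(extract, ["postalPrefix", "birthYear", "grade"]) >= 5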

Regulation respecting the distribution of information and protection of personal information

RRD
Data Privacy

This regulation requires public bodies to publish certain information proactively and to maintain internal measures for protecting personal information. For EdTech vendors integrating with public bodies, it affects what may be proactively disclosed (e.g., certain classes of records), how access requests are handled, and what policy artifacts the body must maintain, with implications for records classification, API exports, and retention tagging.

Canada
Version: 2008
All users
View Source

Act to establish a legal framework for information technology

AEL
Biometric Privacy

Sections 44–45 impose strict controls on biometric identity systems: explicit consent, prior disclosure to the CAI at least 60 days before putting a biometric database into service, and CAI power to order suspension/destruction. This squarely hits EdTech using face recognition (e-proctoring, attendance), voiceprints, or palm/vein ID. It also contains technical rules on integrity, evidentiary value, and lifecycle of digital documents—relevant to audit logs and e-records.

Canada
Version: 2001
Students
View Source
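
A minimal sketch of a launch gate for a biometric feature, assuming the 60-day prior-disclosure clock is tracked in configuration; the shape of the state object is an assumption.

interface BiometricLaunchState {
  explicitConsentFlow: boolean; // express consent collected from each user
  caiDisclosedAt?: Date;        // date the database was disclosed to the CAI
}

// The biometric database may enter service no sooner than 60 days after disclosure.
function mayActivate(state: BiometricLaunchState, today: Date): boolean {
  if (!state.explicitConsentFlow || !state.caiDisclosedAt) return false;
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysSince = (today.getTime() - state.caiDisclosedAt.getTime()) / msPerDay;
  return daysSince >= 60;
}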

Government web accessibility standard

GWA
Accessibility

Quebec’s web accessibility framework requires government sites and services (including school service centres) to conform with WCAG-level criteria. Products procured for K-12 must support accessible UX (keyboard nav, captions, contrast, ARIA semantics) and procurement often references SGQRI-008. Vendors supplying portals/LMS/assessment tools must evidence conformance (e.g., VPAT/ACR-style documentation) and ensure continuous compliance as features evolve.

Canada
Version: 2024
Students, Users with Disabilities
View Source
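
A minimal sketch of VPAT/ACR-style conformance evidence a vendor could maintain per release; criterion identifiers follow WCAG numbering, and the support levels are the customary ACR terms rather than anything prescribed by SGQRI-008.

interface CriterionResult {
  criterion: string; // e.g., "1.4.3 Contrast (Minimum)"
  support: "supports" | "partially-supports" | "does-not-support" | "not-applicable";
  remarks?: string;  // where and how the product meets (or misses) the criterion
}

type ConformanceReport = {
  product: string;
  version: string;   // re-issued as features evolve
  results: CriterionResult[];
};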

Act respecting contracting by public bodies and Regulation respecting contracting by public bodies in the field of information technologies

ARC
Data Privacy

They set the legal framework for how ministries, school service centres, and other public bodies buy IT/EdTech: thresholds, tendering methods, eligibility, evaluation criteria, and contract management. The IT-specific regulation tailors procurement to software/services, enabling requirements around security, interoperability, accessibility, service levels, and data residency to be embedded in tenders and contracts. Vendors must align to evaluation grids (including privacy/security/accessibility) and accept audit/oversight conditions.

Canada
Version: 2006 and 2016
Users with Disabilities
View Source

Utilisation pédagogique, éthique et juridique de l’IA générative (K-12)

UPÉ
AI/ML Governance

The Ministry of Education’s guide frames ethical, pedagogical, and legal use of generative AI in schools—covering academic integrity, bias and equity, privacy, consent, transparency with students/guardians, and teacher supports. While aimed at educators, it strongly influences district policies and classroom tool adoption. EdTech features (e.g., AI writing aids, tutoring, grading support) should enable disclosures, consent options, opt-outs, data minimization, and bias-mitigation aligned to the guide.

Canada
Version: 2024-2025
Students
View Source
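
A minimal sketch of per-feature AI settings aligned to the guide's themes; the flag names are assumptions, not terms taken from the guide itself.

interface GenAiFeaturePolicy {
  disclosureShownToStudents: boolean; // transparency with students/guardians
  guardianConsentRequired: boolean;
  optOutAvailable: boolean;
  inputsMinimized: boolean;           // strip personal data before model calls
  biasReviewCompletedAt?: Date;       // last bias/equity review of the feature
}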

La protection des renseignements personnels à l’école

PDR
Child Protection

This is the official guidance for schools on children’s personal information: purpose limitation, consent/parental authority, secure handling, disclosures, and rights. It helps operationalize provincial law in education settings and is often used by boards to set procedures for third-party tools. Vendors integrating with school systems should map onboarding, permissions, and parent communications to these expectations.

Canada
Version: 2025
Children (under 13), Students
View Source

Education Act

EA
Education

The Education Act requires schools to adopt and apply an anti-bullying/anti-violence plan; boards implement codes of conduct and reporting/disciplinary processes. Where EdTech includes messaging, forums, or classroom social features, those features should enable monitoring/reporting pathways and align with school responsibilities to prevent and address bullying/violence, including online conduct that affects school climate.

Canada
Version: 2000
Minors (13-17), Students
View Source

Government cybersecurity governance

GCG
Cybersecurity

Quebec maintains a government-wide cybersecurity governance stack (directive, frameworks, and architecture guidance) that assigns roles/responsibilities and sets expectations for risk management, incident handling, and cloud adoption for public bodies. School service centres and MEQ are expected to align with these controls and with the government cloud-broker model, which shapes how third-party EdTech is integrated and secured. Vendors should be ready to meet these baselines (asset inventories, IAM controls, vulnerability management, incident coordination).

Canada
Version: 2025
All users
View Source

An Act respecting the national digital identity

ARN
Biometric Privacy

This would create a national digital identity framework for Quebec, empower the Minister of Cybersecurity and Digital to serve as an official source of government digital data for identity, and set governance, authentication, and possibly biometric-related guardrails. Public bodies could be required to use designated services; CAI has recommended strong consent and alternatives for biometrics. If adopted, EdTech integrations with public systems (enrollment, identity proofing, SSO) may need to align with the national digital identity stack.

Canada
Version: Reinstated (2025)
All users
View Source

Freedom of Information and Protection of Privacy Act (FOIPPA)

FIP
Data Privacy

FOIPPA governs collection, use, disclosure, safeguarding and access to personal information held by BC public bodies (e.g., school boards). Amendments now require privacy management programs (policy framework, roles, training), mandatory breach notification to affected individuals and to the regulator when harm is reasonably expected, and privacy impact assessments for “data-linking programs.” FOIPPA also permits storage/processing and disclosures outside Canada in defined circumstances with safeguards (important for cloud/SaaS and AI hosting). These duties flow down to vendors through contracts when they handle personal information for public bodies; districts must ensure vendor arrangements satisfy FOIPPA security, notice, retention and cross-border conditions. EdTech products used by districts therefore need privacy-by-design defaults, breach handling, and transparent data-flows that withstand FOIPPA scrutiny.

Canada
Version: 2025
All users
View Source
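
A minimal sketch of a FOIPPA-style breach triage flag; the two-part test below compresses the statute's conditions and is an assumption for illustration, not legal logic to rely on.

interface BreachAssessment {
  personalInformationInvolved: boolean;
  harmReasonablyExpected: boolean; // the public body's judgment call
}

// Notify affected individuals and the regulator when both conditions hold.
function notificationRequired(a: BreachAssessment): boolean {
  return a.personalInformationInvolved && a.harmReasonablyExpected;
}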

School and Student Data Collection Order (M152/89)

SSD
Data Privacy

M152/89 compels boards to collect and submit specified student and school information to the Ministry (e.g., Form 1701 enrolment data; SADE completions). It sets the legal basis for routine K-12 data reporting and establishes what boards must capture, how, and when—driving the minimum datasets and interfaces EdTech systems must support. For vendors, this means aligning schemas (e.g., PENs, program codes), synchronizing with district SIS (MyEducationBC) or approved data exchanges, and ensuring privacy notices reflect statutory collection purposes. It also ties to Ministry compliance audits; misaligned or excess collection can create funding and compliance risk.

Canada
Version: 2024
Students
View Source
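
A minimal sketch of an enrolment-submission record; the field names are assumptions loosely modelled on 1701-style reporting, not the Ministry's actual schema, which should be taken from the current data collection specifications.

interface EnrolmentRecord {
  pen: string;             // Personal Education Number
  schoolCode: string;
  grade: string;
  programCodes: string[];  // program participation, per Ministry code tables
  collectedFor: "ministry-reporting"; // the statutory purpose surfaced in privacy notices
}

// Collect only what the order requires; excess fields create compliance risk.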

Provincial Standards for Codes of Conduct Order (M276/07)

PSC
Education

The Order requires boards to adopt codes of conduct that, among other things, reference Human Rights Code protections, define unacceptable behaviours (including harassment and bullying), outline progressive discipline and protect students from retaliation. The 2024 amendment (M89/24) requires restrictions on personal digital devices during instructional time (with defined exceptions), which impacts classroom technology use and device management settings. EdTech platforms and device programs should therefore support school-wide restrictions, classroom controls, inclusive practices, and reporting that aligns with a code’s due-process expectations.

Canada
Version: 2024
Students, Users with Disabilities
View Source
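
A minimal sketch of how a device-management setting might model instructional-time restrictions with defined exceptions; the exception names are assumptions, not the order's wording.

type RestrictionException = "medical" | "accessibility" | "teacher-directed";

interface RestrictionWindow {
  start: string; // 24-hour "HH:MM"
  end: string;
  exceptions: RestrictionException[];
}

// Lexicographic comparison works for zero-padded "HH:MM" strings.
function deviceAllowed(now: string, w: RestrictionWindow, claimed?: RestrictionException): boolean {
  const inInstructionalTime = now >= w.start && now <= w.end;
  return !inInstructionalTime || (claimed !== undefined && w.exceptions.includes(claimed));
}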

Accessible British Columbia Act & Regulation (B.C. Reg. 105/2022)

ABC
Accessibility

The Act requires prescribed organizations to establish an accessibility committee, publish an accessibility plan, and maintain a public feedback mechanism about barriers. The Regulation explicitly prescribes school districts, francophone school districts, and independent schools as of Sep 1, 2023—so K-12 bodies must plan and report on removing/preventing barriers, including technological ones (web/app usability, procurement, communications). For EdTech, this elevates accessible design (e.g., WCAG-aligned UIs, assistive technology compatibility) from best-practice to a governance requirement that boards will scrutinize in vendor selection and rollout.

Canada
Version: 2022
Students, Users with Disabilities
View Source

Personal Information Protection Act (PIPA)

PIP
Data Privacy

PIPA requires organizations to designate a privacy officer; limit collection/use/disclosure to purposes a reasonable person would consider appropriate; provide clear notices at or before collection; obtain valid consent; protect information with reasonable security; and provide access/correction rights. For EdTech selling to BC districts or independent schools, PIPA governs their direct relationships with customers and any B2C offerings (e.g., direct-to-student/parent apps). Contracts with public bodies will also incorporate FOIPPA-level requirements, but PIPA remains the baseline for the vendor’s broader BC operations, including employee data and marketing.

Canada
Version: 2025
Students
View Source

Considerations for Using AI Tools in K–12 Schools — Ministry guidance

CUT
AI/ML Governance

This Ministry framework guides districts in evaluating AI tools across privacy/security, appropriateness, equity and bias, academic integrity, records management, and procurement. It encourages risk assessments and local procedures before adoption, and highlights safeguards for students (e.g., minimizing personal data inputs, explainability, opt-outs where feasible). Vendors seeking classroom use should expect districts to map product features to these criteria (e.g., clear data-flows, age-appropriate defaults, human-in-the-loop).

Canada
Version: 2024
Students
View Source
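
A minimal sketch of a district-style evaluation rubric keyed to the framework's criteria; the 0–3 scale and the adoption threshold are assumptions for illustration.

type Criterion =
  | "privacy-security"
  | "appropriateness"
  | "equity-bias"
  | "academic-integrity"
  | "records-management"
  | "procurement";

type Rubric = Record<Criterion, 0 | 1 | 2 | 3>; // 0 = unassessed, 3 = strong

function readyForAdoption(r: Rubric): boolean {
  return Object.values(r).every(score => score >= 2);
}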

Online Learning Policy

OLP
Education

This policy page consolidates requirements that affect digital/online delivery—such as the Students with Disabilities or Diverse Abilities Order, School and Student Data Collection Order (M152/89), and independent school equivalents—providing the policy frame for virtual and blended programs. EdTech supporting online learning should align content standards, accommodations, and data submissions with these referenced instruments.

Canada
Version: 2023
Students, Users with Disabilities
View Source

Independent School Framework — Independent School Act & Regulation (B.C. Reg. 262/89)

ISF
Education

Independent schools are governed under their own statute and regulation, with requirements for program quality, reporting, records, and student eligibility/classification (including distributed learning). Many independent schools procure EdTech directly; vendors must meet recordkeeping and reporting duties compatible with the independent school framework and with PIPA (private-sector privacy law) rather than FOIPPA. Recent amendments underscore ongoing updates to operational details, so products should remain adaptable to classification, reporting, and funding rules.

Canada
Version: 2025
Students
View Source

Information Security Policy (ISP)

ISP
Cybersecurity

The ISP sets high-level requirements for information security governance, risk management, system lifecycle controls, incident response, and supply-chain/cloud controls. While crafted for ministries, districts frequently align procurement and vendor security questionnaires to ISP expectations (e.g., documented roles, vulnerability management, encryption, cloud due diligence, incident management). Vendors should be prepared to evidence controls and support district business continuity and incident processes compatible with these principles.

Canada
Version: 2025
All users
View Source

Accessible Service Delivery and Employment Accessibility

ASD
Accessibility

The next wave of binding standards under the Accessible BC Act is being developed, starting with Employment and Accessible Service Delivery. Once adopted, they are expected to set outcome-focused rules for service accessibility (including technology usability, communications, and procurement). Districts will likely update accessibility plans and purchasing criteria accordingly; vendors should monitor drafts and be ready to gap-remediate (e.g., WCAG conformance evidence, accessible support workflows).

Canada
Version: In development (2025)
Users with Disabilities
View Source