Law 25 and AI: What It Changes for You
You may have heard about Quebec’s Law 25. Yes, it’s a data protection law. Yes, it affects AI. No, it’s not complicated to understand — I’ll explain it the way I’d explain it to my aunt.
What is Law 25, exactly?
Imagine you have a really secret poutine recipe. Law 25 is like telling someone: “You can’t just steal my recipe and sell it without asking.” Except the recipe is your personal data: your address, your preferences, your search history. And the person stealing it is a company training an AI model on it.
The real deal: this law strengthens your rights. It says companies must be transparent about how they use your data, they must get your clear and informed consent, and you have the right to say no. Period.
How does it change things for companies using AI?
If a company wants to train an AI on your data, they can’t just do it quietly. They have to:
1. Tell you they’re doing it
2. Explain why
3. Get your permission
4. Let you change your mind
It’s like someone asking your permission before sharing a photo of you on Facebook. Except now it’s for training machines. Seems reasonable, right?
And for you, the regular person?
In practice, you probably won’t notice anything immediately. But long term, you have more control over your info: companies have to tell you when they’re profiling you, and they can’t quietly use your data to sell you stuff. It’s a small step toward a slightly less creepy internet.
Want more info on how it works? Check out Sherpa (it’s free) or dive deeper at Laeka Research.