Episode 6 | The Future of Student Data: Privacy, Security, and Success
Welcome to EduCast 3000. It's the most transformative time in the history of education. So join us as we break down the fourth wall and reflect on what's happening. The good, the bad, and even the chaotic. Here are your hosts, Melissa Loble and Ryan Lufkin.
Hello and welcome to EduCast 3000. I'm your co-host Ryan Lufkin. And I'm your co-host Melissa Loble. We have, as usual, another exciting guest today, and one from our own teams, which is even more exciting. We are joined by Daisy Bennett, who is Instructure's Data Protection Officer. Welcome, Daisy. Thank you so much for having me. I'm super excited because, you know, I love to talk about privacy. We love it when you talk about privacy, so we are very excited as well. You do. And Daisy, listen, I know you very well. We go way back. But before we jump into the questions, give us a little bit about your background. Tell us a little about who you are. So I've been here with Instructure for, gosh, I just crossed my four-year mark. So that's been super fun. I've been in the legal field for about 20 years. Prior to being a lawyer, I worked in IT and technology. Really, all my jobs were technology related. I'm not a programmer, so I've got to be super clear about that. You know, I didn't do coding, but I was always technology adjacent. I started out doing privacy and security literally right out of law school, working for an insurance company doing security auditing and writing policies and procedures for a healthcare company. And that really pushed my career towards privacy in these highly regulated spaces. And it's just become a passion. Yeah, this May was 20 years of doing this. That's incredible. And I know you've seen tons of changes over all those years, and we'll be chatting about those, which is super cool. Yeah, for sure. But before we go there, we have one more personal question for you. We tend to ask all of our guests this. Do you have a favorite learning moment in your life? It could be about you as a learner. It could be you teaching someone else something. We love to connect with our guests a little bit around learning.
Do you have a favorite learning moment you'd be willing to share? Yeah, I do. I have a bunch of them, but one of my favorites is from way back in 2004, in my last year of law school, when I got to participate in a program through the Center for Computer Assisted Legal Instruction. That's a lot of words, but it's called CALI, and they offer a number of classes and programs in law schools for students. I got to participate in their Teaching Law in the High School Classroom program, which was absolutely amazing. I got to co-teach a history class for a semester at East High in Salt Lake City. I knew I liked teaching, I knew I loved education (and Melissa, I know you were a former educator), but I didn't really understand it until then. That experience of getting to really help students make choices, we had a lot of at-risk students, and we were talking about the law and how to protect themselves. It was just such a great experience. I almost veered off, didn't become a lawyer, and became a teacher instead because of that experience. But instead I came back. Maybe when I retire, I'll get to do that. That's my retirement job: being a full-time high school teacher. But one of the amazing things, and why I came to Instructure, to tie it all back, is that this gave me a chance to be part of the education community, support students, work for a mission-driven company, do all the cool things. Not that my other jobs weren't cool, but this really helped me get back to a place where I can take all these things I've learned and all these things I love and put them into one job that I could be super passionate about. And for our listeners who don't know, East High School, where you taught that class, was also where they filmed High School Musical. So many of our listeners have probably seen it. Yep, absolutely. Yeah. I love that fun fact. And thank you for sharing that learning story.
These are so powerful as we think about and reflect on our own lives. So thank you for that. Yeah, it's funny how many of those paths you're like, I almost did this, but then this caught my eye and I went down that path. It's incredible. We've talked a little bit, as we go out and speak at different events, about how tumultuous this is. There's more transformation going on in education than at literally any point in history. So why is data privacy at the forefront right now, especially in education? Tell us a little bit about that. I mean, it's a big, broad question, so I'll let you steer how deep you go into it. Well, absolutely. I can talk about privacy for hours, and I often tell people we could have an entire course about privacy and why it's important. But broadly speaking, philosophically, data privacy is the right to control our information. It's really about how it's collected and used, right? If you think about maybe 40 years ago, that meant how you got mail or how doctors treated your records. It felt a lot more simple, right? Historically and globally, privacy itself as a concept is recognized as a fundamental right, in the UN's Universal Declaration of Human Rights, and we've even recognized it in the US Constitution. But it's become so important, I think, in the past couple of decades for sure, really in the last decade, really in the last five years, because our lives are online, right? Think about how much time you spend on a computer every day, your phone, my Apple Watch; everything I do is usually captured somewhere. Even for folks who don't use technology a lot, if you think of IoT devices, CCTV cameras, all this data is online, right? It's created just phenomenal opportunities for people.
We think about distance learning, and our ability to communicate with people, organize, and do things we weren't really able to do as effectively before. So it's been great. Such great experiences. Yeah, all of that. But if you think about it, it's also really hard to understand. I do this for a living, and I don't even know where all my data is. Just understanding how advertising works and how you're followed around the internet feels almost mystical, right? You're like, I talked to my cousin about red shoes and now I'm seeing all these red shoes; they must be listening to my phone. Or where you do a Google search and the same thing comes up for the next six months. So the idea of understanding your data, where it goes, where it's moving, who has it, and who's doing what with it, it's really hard to understand. That's why privacy laws are so important: they set the standard for everybody. Communities, people, companies, everybody understands what we can and can't do with data and what rights you have to your data. When I started working with HIPAA, we used to joke, not joke, but talk about how prior to HIPAA, your employer could get information about you, whether you were sick, whether you were having cancer treatments. There weren't really a lot of laws protecting your health data. So HIPAA came forward as a way to say, no, only certain people can have access to this really personal information, right? I heard stories about employers finding out women were pregnant and firing them, because they assumed they were going to have less work ethic. So privacy laws, security laws, data laws, these are all super important to make sure that data is used ethically and responsibly. Also, people are more digitally literate than they used to be.
Even though it's complicated, I think the youngest generation right now understands technology in a way that someone like me, who didn't start using technology until high school, doesn't. They understand it at an almost innate, cellular level. You see that with those digitally native students, right? I see that with my kids. Digital natives, that's the term I was looking for. People say no one cares about privacy, but that's actually not true. There are lots of studies out there. Students care about privacy. College students care about it. Young people do. Just because they're willing to share things online doesn't mean they don't care what people do with their data. So that's why it's so important. And, to bring it back to education, students understand technology, but they may not understand the harms. One of my favorite stories, I love to joke, is that I've had my data breached, but I'm in my 50s. So what, I've got like 25, 30 years to worry about it. But imagine a 10-year-old, right? Their social security number gets stolen, their digital identity gets stolen. They've got 60, 70, 80 years to worry about that, and the harms can just follow them through their life. That is why, when I talk to folks about student data privacy, it's so important: because the harms are so substantial. Well, that's a really great lead-in. I think our listeners are going to come from different backgrounds and different understandings of the basic legislation out there to protect students, in higher ed and, separately, in K-12. So would you mind sharing some basics with us? Like I said, I think there will be people out there who aren't as well versed in this and would love to understand, and to understand in your terms, because you have such a great way of making things really approachable
and accessible as we try to understand what, as you said, is a really complex area. When we come to you with questions, thank goodness, you help us work through it. Right. So what's some of the specific legislation currently in place in higher education, and then in K-12, to protect students? Yeah. So in the U.S., we have federal laws and we have state laws. At the federal level there's FERPA, the Family Educational Rights and Privacy Act; most people have heard of that. We also have the Protection of Pupil Rights Amendment, which governs surveys and things schools can do with research data, among other things. We also have the Children's Online Privacy Protection Act, COPPA, which we're going to talk about a little more because it's a super hot topic right now. But more importantly, we've got a bunch of states that have privacy laws specific to student data, not just general privacy laws. We're going to talk about that too. I think there are probably 40 states right now, and I could have that number wrong, with 150-ish different laws that affect student data privacy. I like to refer folks who are super interested in this to the Student Privacy Compass website by the Future of Privacy Forum. They literally have a list of every single student data privacy law for K-12 and higher ed in the U.S. All higher ed and all K-12 are subject to FERPA. COPPA really only applies to kids under 13, so it really only affects the K-12 space. And then, like I said, there are a lot of state laws. In the past five years, we've seen states like New York, California, and Illinois, among a few others, pass really stringent student data privacy laws, which is great. Those apply to our K-12 space. But I feel for our higher education institutions; the compliance, the laws they have to worry about, is so much broader. Especially if we think about how, with the pandemic, distance learning has grown
and remote learning has grown, and it's provided so much opportunity for people who may not otherwise have had the chance to learn. But that also means that a higher education institution, a state college that may have only had students from the US, or primarily from their state, now has to worry about privacy laws globally that are far more stringent than what we have in the US. That's created a lot of complexity for them: keeping up, understanding when laws apply and when they don't. So I say this over and over again, it's complicated. I call it the complicated web of laws, and we could talk about that for a long time. For people, again, this is a bit of a primer, so I'll throw out GDPR. Just so people recognize the acronyms when they hear them, what are some of those big global privacy laws? So at a high level, what we call the GDPR, the General Data Protection Regulation, is the EU privacy law. Fundamentally, it's totally different from the type of laws we have here in the US. I'm going to jump ahead to something I was going to talk about later, so bear with me if I repeat myself. The GDPR protects something called personal data for everybody. It doesn't matter what context it's in. It doesn't matter what company is using it. It doesn't matter what method, whether it's in the mail, in a letter, or online; it protects anything that's considered personal data. When we're in this space, you hear personal data, personally identifiable information, personal information, all these terms that can have nuanced definitions depending on the law. But at a high level, it's really any data that identifies you, or that in combination with other data can identify you. So when you think about how broad that is, it can mean all your data that's online within a certain organization. Super helpful, thank you.
Yeah, you mentioned a little bit too about the difference between K-12 privacy and accessibility concerns in the US versus higher ed concerns. What's the difference? Why is there a divide between those two? Well, at the fundamental level, I think the concerns are the same, right? With privacy and with accessibility, we want to protect data and we want to ensure equitable access to resources. In the privacy space, I just talked about the difference in the complexity of the laws. With accessibility, the difference is a little more nuanced, because the legal approach to accessibility in K-12 and higher ed is different. If you think about it, in K-12, accessibility is really focused on student success, and in higher ed, it's really focused on access. The laws are also just a little bit different; there are some laws that apply to K-12 that don't apply to higher ed. In K-12, districts are responsible for identifying a disability, determining eligibility for services, implementing accommodations, and writing IEPs, individualized education programs, or 504 plans for students. They also help determine services based on need. It's really about promoting the success of that student, right? Helping figure out how we help the student be successful. It's also way more regimented and monitored for the student, as any parent of a student with an IEP knows, right? But at the higher ed level, we've got the ADA and the Rehabilitation Act guiding colleges and their accessibility policies, and, this is going to get super technical, with an aim towards access for otherwise qualified students based on their admission criteria. So when a school admits a college student, their policies
with respect to accessibility are really focused on identifying services that that student then has to go request, right? There's no automatic discussion. In the K-12 space, teachers often talk among each other; there might be a coordinator who helps organize services for these students, because they're children, right? But legislatively, and sort of philosophically, we think of college students as adults. They should be able to drive their own needs, describe their accessibility requirements, initiate requests for services, and talk about what kind of accommodations they need. And the accommodations that are given at colleges tend to ensure access, right? Not necessarily success. That's really the nuance between the two, and it's not something I think we always think about from an accessibility standpoint. It's not just, make it accessible. For K-12 students, we're really trying to help them be successful, and for college students, we're really making sure that they have access to things. That's interesting, because we talk a lot about mental health programs and support programs for students in higher ed, and often there's a disconnect between those programs being available and students knowing about them and actively leveraging them, because of exactly this, right? Yeah, absolutely. When I was an undergraduate, I worked for the disabled student services group, and I supervised test proctors for alternative testing accommodations. Part of what we did was almost like a guerrilla campaign to make sure students knew that if they had a documented disability, they could request alternative testing accommodations: everything from more time, to having the test on paper or on a computer, to having a reader. Sometimes we read the test, sometimes we transcribed the test. There were a lot of accommodations that students just didn't know about, because they have to initiate that request.
Yeah, that's interesting. And then on the K-12 side, there's even a divide between students under 13 and students between 13 and 18, right? There's some granularity, and some challenges, there. Yeah, because the way a primary or elementary school approaches accessibility is much more intensive than it may be at the high school level. And I get it: when you're in high school, you're developing your own personhood, you want to do your own thing, you don't want someone to tell you these are the things you have to do. So I know it can be hard for our schools to strike that balance. Well, as we continue to think about K-12, it's interesting because we see a lot more in the news about cyber attacks and the vulnerability of K-12. I'm sure they're happening in higher education as well; we've heard about a couple of those. But it's interesting that I feel like the world, or at least the bad actors in the world, I should say, have really targeted K-12. I'm curious what your thoughts are on why that's happening. And is legislation helping, or trying to help protect these K-12 districts that feel vulnerable? Yeah, so there have always been bad actors, and there have always been people looking to exploit vulnerabilities in the most creative ways. We always say the law has a really hard time keeping up with technology. But recently, it breaks your heart to see the ransomware attacks that are happening. The cybersecurity firm Emsisoft, I may be saying that wrong, did a really good report about this last year. Based on what was reported, 109 districts in the US were targeted by ransomware alone. So we're not talking about breaches, we're not talking about hacks, we're talking about just ransomware.
And you think about that and ask, what in the world would a bad actor want with a school's data? Well, there are two main reasons. First, just like the rest of the world, as we talked about earlier, our lives are online, and schools collect a ton of data now: everything from health data to education data to social security numbers, because schools are online, right? And we have all the different reporting requirements and data collection requirements that are federally mandated, and that creates an opportunity. It's really attractive. That's why, I would say in the early 2000s, insurance companies and healthcare companies were the huge targets for bad actors. We're seeing that shift over into education because of the amount of data that's there. And on the flip side, especially in our K-12 space, many of our K-12 schools are just understaffed. This is not a new idea; this is not something folks don't know. But we always think about it in the context of not having enough teachers, and we don't realize it also impacts support staff, right? CoSN did the 2023 State of EdTech Leadership report, and that report said that only a third of ed tech leaders felt their districts had sufficient resources to deal with cybersecurity issues. And many districts, almost 66%, didn't have a full-time cybersecurity position. So think about the strain that puts on a district: you might have one person doing IT, purchasing, and accessibility, wearing all those hats. They might have a specialty in accessibility, but they might not have a cybersecurity specialty, which is super technical. And it's not the school's fault. I think that piece of it is so much a funding issue. If you don't have resources, you can't catch everything.
That is so scary and also so unfortunate, especially as, as you were talking about earlier, we're trying as a collective in education to make sure students are successful, and then we've got these kinds of barriers, right? And by the way, for our listeners, we'll make sure to link to that report. Daisy is mentioning some really great resources throughout this conversation; we'll put all of those in the show notes so y'all have access to them as you follow up on these conversations. Absolutely. And to that point, Melissa, not everybody loves this idea, but I do. I really think, especially with respect to K-12 schools, it's up to us, the tech providers like Instructure, all of us, to work extra hard to make sure our systems are secure, right? Here, that's a key part of our privacy mission. We see ourselves as data stewards, not data owners, of the data we hold. As a company, we have the resources to build good security, so it's up to us to make sure that what we're selling in the marketplace is secure, knowing that schools may not have the resources they need. And I know a lot of other tech companies in the space feel the same way. I know that's a little bit of an odd approach for companies generally, we're going to work really hard to be the best, but as you said, Melissa, we're a community and we're all trying to do the same thing. So into this complex environment, we throw AI. Melissa and I laugh because every show we have to talk about AI at some point; we need a bell or some sort of indicator. You mentioned that it's hard for regulation and these protections to keep up with technology. Now, with AI, we've taken this quantum leap forward. Where do we stand? How are the existing regulations supporting that? For AI, it's been really interesting.
We were all looking at the European Union, because of what we call the Brussels effect: they were the first to implement really comprehensive privacy laws. Everybody was talking about the EU AI Act, which is imminently going to be published and come into effect. Well, Colorado beat everybody to the punch; Colorado here in the US published their AI Act first. So we have a lot of states driving this in the US, and we've got stuff at the federal level happening too, but I think it's the states leading. I think California right now has 11 or 12 different laws regulating different aspects of AI. These are not education focused, but they're going to govern. AI regulation is really big, and it overlaps with privacy and security, but it's got a lot of other stuff in it, like intellectual property, digital rights, and ethics, in a way that's a little bit different from some of the other areas. I think legislators generally learned their lesson and are rapidly trying to regulate as fast as they can. But it's also that balance with fostering innovation and leveraging these tools in good and positive ways, right? We talk a little bit when we present about the idea of eating our vegetables. That's something from Zach Pendleton, and I know you've had a hand in it: this idea that we have FERPA and we have WCAG and we have all these different regulations in place, and we really need to make sure the AI tools we implement align with them. And that's our responsibility. Like you said, we take that really seriously. We're rolling out things like our nutrition facts card to govern our partners and our features, to provide transparency and build trust in AI. But it's going to be complex, especially with individual states throwing out different regulations. And like privacy, right now it looks like it's going to be a patchwork.
So, I mentioned a little bit about this earlier: in the US we have a sectoral approach to regulation. That means we tend to regulate by the context of privacy, data types, or industries. So FERPA is education data, HIPAA is healthcare data, and the Gramm-Leach-Bliley Act is financial data. I always use this as an example: Ryan, you could send me a copy of your medical report, but that doesn't make it subject to HIPAA, because we are not subject to HIPAA as individuals. Now, if I were a doctor, I would be. So that's where it gets complicated. What we've seen in AI legislation is an attempt to avoid that. What they're really focusing on is risk, which I think is the way to do it, right? Because there's a lot of AI out there that's low risk. Yes, and there are questionable players out there in the space, for sure. But it's been such an interesting ride over the last year and a half to see schools go from banning AI, trying to block it on machines on campus, things like that, to now, where we are definitely moving into the more productive phase: how do we leverage this? How do we make sure we're doing it responsibly? How do we bring some of the benefits, time savings for teachers, student success for students, to the forefront while still making sure we're protecting data privacy and security? Yeah, and something we've seen that's been really amazing, I don't even know what the count is, is that a huge number of districts and higher ed institutions here in the US have published guidelines, right? Guidelines for safe AI use, to help give their teachers and their students an understanding of how to use AI safely. There are a lot of AI literacy projects.
I created AI literacy training, professional development, for a junior high school in Illinois so they could train their teachers on AI generally: what is it, what is it not? Schools are really working hard on that, because we're not all just sitting around waiting for folks to legislate it. We're setting up these guidelines right off the bat, because schools can leverage AI in a way that's safe. And there are some federal guidelines, a federal recommendation report that's out now? There is. The U.S. Department of Education published a really good report last year. They did a huge study on it. There are actually two reports: one is the study, and one is their guidelines and recommendations. It brought a lot of those issues and questions to the fore. And I know that you and Melissa have had a lot of conversations with our schools and with our community on understanding where the risk points are, what's important, what's not important, helping to demystify AI. I can't tell you how many conversations I have where I explain that AI right now really is just good at guessing. It's not sentient. It's trying to give you what you want. It's not nefarious, right? Yeah, it's really good at guessing the next best word based on a bunch of training data, right? It's not sentient. It's not thinking. It's not trying to do things. It's a really good guessing algorithm, and that's where we're at. So it's safer than people think it is, if that makes sense. Yeah, absolutely. That safety concern, and even just, as you said earlier, the role that tech vendors play in navigating not only privacy and AI but privacy across the board, is really critical. For districts out there and for higher education institutions, what kinds of tools or approaches should they take to vetting?
I know there are resources out there, and you've got some really good ones. How do they make the right decisions about technologies and practices, both from a larger privacy perspective and then maybe a little more specifically around AI? Yeah. So, I understand there's variability in resources between higher ed and K-12, and big districts might have better resources than smaller districts. What we always recommend when we talk to our schools is to use some of these not-for-profits and consortiums that have a lot of resources to help them. Melissa, you and I have talked a lot about this. 1EdTech has a great app vetting program; districts can join 1EdTech for a very low price. There is badging related to privacy, a rubric for security, a rubric for accessibility, and a rubric for AI. That's a great starting place for schools that don't have a lot of resources to understand the baseline things they should be thinking about with respect to privacy, security, and AI. There's also the HECVAT, published by EDUCAUSE, which is a great technical resource schools can send to their vendors. It's really security focused, but it alleviates the work, so you as a school don't have to come up with your own security questionnaire. And there are a bunch of other security questionnaires out there that schools can leverage; we see a combination of those coming from our schools. And of course, many ed tech vendors like us actually publish packages online, right? We have a compliance package online that covers all of our products. There's a lot of different information, white papers, and documentation to help schools with that vetting and to help them understand what we do and what we don't do. Additionally, our security team helps them a lot. And again, I'm not just trying to promote Instructure here.
This is really common with education technology providers, right? We're really trying to help schools meet their requirements, because we know what they need to do too. We're all trying to get there together. Yeah, that's super helpful. As a reminder, we'll link to those vetting tools for all of our listeners too, because there are a number of them. It's definitely becoming the space to play in for nonprofit organizations and even government organizations, because it is so important that we put tools in the hands of our districts and our higher education institutions, so that they can, with their lean resources, make really strong decisions for their students to drive success and access. And I also tell people, from a fundamental perspective, these aren't companies buying from other companies. I'm not saying things should be easier or harder, but it should be easy for schools, right? Most of them are paid for by taxpayers. We want it to be easy for them to do business; we don't want it to be burdensome for them to make sure that the LMS or the assessment tool they use has appropriate security, has AI controls, has the right terms and conditions. Even PTAC, the Department of Education's Privacy Technical Assistance Center, has a bunch of different questionnaires and documentation that schools can use. And they do seminars in spring and fall that you can register for. We'll provide those; I have a whole bunch of stuff I'll send to Melissa to publish with this episode. One of the seminars they do, in both fall and spring, covers what kinds of questions to ask your vendors, how to vet privacy and security, and how to read terms and conditions. So there are a lot of resources out there, even from the government. That's great. So you've given some really good recommendations, but what else can schools, at both the higher ed and the K-12 level, do to future-proof their learning environments and their students' data?
Yeah, so that's hard because, as I said, it's complex. There's a bunch of laws, standards change. What's considered secure today is obsolete in five years. So, as I said, look to your state organizations, look to these not-for-profits, and develop close relationships with your vendors, right? Like the other ed tech vendors we work with, talk to your vendors, get them to share with you what they do and what they don't do. And that helps, it increases trust and accountability and transparency. But it also helps you, the schools, right, to understand what's coming. Because like I said, they may not have lawyers on staff. And so we always tell people, like, please, please go reach out into your community and ask, you know, and see what's there and see what kind of information you can get. You know, a personal plug, if I may: at InstructureCon this year, and we do it every year that I've been with the company, we do a 2024 update on privacy laws, and we invite outside counsel to come in, and we do these two presentations specifically to help our schools that come to InstructureCon understand what's coming and what they can anticipate. And there's a lot of other, you know, those types of opportunities out there, but we do it too. Those sound like drier sessions, but we've gotten great feedback on the information you all provide. Drier? No, we make them fun. We try to make them fun. We try. We make them fun. You get three lawyers in a room and fun is not the word you think of. However, you do like throwing in the swag to bring people in, right? That's right. That's right. Well, I'd love to wrap this conversation up with one last question. So you've been in this space for a long time, which is amazing. What are educators not talking about, right? Are we missing something? And maybe we're not. Maybe we're talking about all the right things. But do you ever sit back after a full day and think, why isn't anybody raising this?
Or why aren't we putting our energy in this area when we think about privacy for students and teachers and every educator? Well, I think folks understand the issues. But I also think, because of the complexity, people get really bogged down and hyper-focus on one thing that may actually not be the most important thing in the scope of things. And we talk a lot to students who panic about stuff because they don't understand what data their school has; they might be concerned because they don't understand. And so what we're always trying to assure them of is that all we're really talking about, at the simplest level, is protecting students, protecting educators, protecting schools, protecting the data, supporting them to be successful. We're all trying to do that, right, so students can learn in a safe environment. And when I have these conversations, I always try to help people not get bogged down in those details, because it can be so overwhelming, and we don't want people to be overwhelmed. I love that. And if I can add to that, I think I hear educators get lost in "I don't want to be sued." And I think that's where they get into the minutiae, as opposed to, like you said, what we're really trying to do is lift up access and student success in all of this. And if we keep focused on that, it changes the conversation and you don't miss pieces, I would imagine. Yeah, yeah. I mean, granted, I'm a lawyer. So, you know, I'm the killjoy. I'm the harbinger of doom. My job is to think about the worst thing that could possibly happen. But in privacy, you know, I hear that and it breaks my heart that educators are worried about that. Right. And really, we shouldn't be doing things just because we're afraid of bad outcomes. We should be doing things because it's meaningful and it's ethical and it's the right thing to do.
And I know that sounds super cheesy, but keeping that approach and keeping that sort of mission and that thought keeps everybody from being super overwhelmed by everything. And that's why we love you, Daisy, because that's what we're all in this for, right? Yes, absolutely. Whether it sounds cheesy or not, it's so true. And being able to bring this conversation to a place where we can all understand it, no matter where we're coming from, also helps us do that. Yeah, absolutely. We don't tell you enough how much we appreciate everything you do. The information you provide on the website around privacy, security, accessibility, it really is invaluable. And I can't tell you how many times people come to me with those questions and I point them to the information that you've already put out there, and it's incredible. So thank you again. Absolutely. And you know, I always make this offer: if people have questions, especially about what we do here at our company, please reach out. We have an email, privacy@instructure.com. Reach out, ask us the question. Our goal is radical transparency. We want to help you understand. Yeah, and I'll even say it's beyond just what we do, right? Ryan's and my mission, and Daisy is absolutely a great example and part of this, is to just bring knowledge out to the space, or experiences, or even questions or things for you all to be thinking about. Whether they're related to us or not doesn't matter. And I've seen Daisy do this too, just really lift up knowledge in groups and teams and across organizations to help them understand these really complex topics, as you've described, Daisy. So don't hesitate to reach out to us. Whether it's about us or not, we'd be happy to help lift up that knowledge or have really good conversations around how we all move this forward together. And with that, I think this is a wrap, right, Ryan? It is. Thanks again, Daisy. This has been amazing. Thank you both so much.
We'll have you back on. I'm sure there will be… And thanks to everybody who tuned in. And I thank everybody that's interested in even listening about privacy, you know, and cares. It's so meaningful. Thanks so much. Thanks. Thanks for listening to this episode of EduCast 3000. Don't forget to like, subscribe, and drop us a review on your favorite podcast player so you don't miss an episode. If you have a topic you'd like us to explore more, please email us at InstructureCast@instructure.com, or you can drop us a line on any of the socials. You can find more contact info in the show notes. Thanks for listening, and we'll catch you on the next episode of EduCast 3000.