School districts face mounting pressures to boost student achievement and strengthen instructional coherence, all while navigating the accelerating adoption of AI in education.
In this episode, Georgia Bambrick, Senior Director of Product Engagement at Panorama Education, sits down with Sarah Jay, Executive Director of Equitable Multi-Tiered Systems of Support at Boston Public Schools. Together, they explore how Boston Public Schools is leading instructional change, transforming AI integration from a challenge into an opportunity. Sarah shares how Boston is thoughtfully implementing AI tools, such as Panorama Solara, to reinforce strong instructional practices, support educators, and keep student learning at the heart of every decision.
Additionally, Sarah explains how her team prioritized inclusivity and instructional rigor by embedding clear pedagogical guardrails into Boston’s AI-powered tools. She shares how Boston intentionally piloted and refined AI use cases, partnered across departments, and focused professional development on meaningful educator workflows to ensure strong implementation. Through this approach, the district integrated AI into core MTSS and instructional systems while staying focused on long-term learning outcomes, rather than just adopting the latest technology.
Senior Director of Product Engagement | Panorama Education
Executive Director of Equitable Multi-Tiered Systems of Support at Boston Public Schools
This is the podcast where top K-12 education leaders and experts explore how AI is reshaping teaching, learning, and school leadership—one real story at a time. Hosted by Aaron Feuer, CEO and Co-Founder of Panorama Education, each episode offers a roadmap for implementing AI in your school or district, along with tools, lessons learned, and practical strategies you can bring to your team.
You’ll hear directly from leaders applying AI to solve big challenges like chronic absenteeism, literacy gaps, and teacher burnout in ways that are safe and secure, personalized, and anchored in driving student outcomes. Wherever you are in your school or district’s AI journey, this show is your guide to impactful AI in K-12.
Stay connected to the conversations shaping the future of learning.
Subscribe to The Leading & Learning with AI Podcast and join K–12 education leaders, innovators, and practitioners exploring how artificial intelligence is transforming teaching, learning, and school leadership.
🎧 New episodes release every other Monday. Each episode offers practical insights, real-world stories, and frameworks you can bring back to your team as you lead your district’s AI journey.
Subscribe on your favorite platform:
If you’ve been inspired by what you’ve heard, please take a moment to leave a review on Apple Podcasts. Your feedback helps other district and school leaders find the podcast and join the growing community of educators driving meaningful, secure, and student-centered AI innovation.
Georgia Bambrick:
Welcome back to the Leading and Learning with AI podcast. My name is Georgia Bambrick, I'm the Senior Director of Product Engagement and lead our AI strategy at Panorama Education, and I am your guest host for today's episode. Across districts right now, leaders are facing a complex reality. Student achievement pressures are real, instructional coherence matters more than ever, and at the same time, AI tools are showing up in classrooms faster than systems were designed to absorb them. So today I'm joined by Sarah Jay, good friend and Executive Director of Equitable Multi-Tiered Systems of Support at Boston Public Schools. Sarah has played a key role in shaping Boston's approach to AI, curriculum, instructional coherence, and MTSS systems. She brings a clear systems-level perspective on how districts can use AI thoughtfully: not as a replacement for good teaching or judgment, but as a way to support educators, reinforce strong instructional practices, and ultimately keep students at the center.
Georgia Bambrick:
So Sarah, before we talk about AI or strategy, I would love to start with you. Can you share a bit about your role in Boston Public Schools and what originally drew you to teaching and district leadership?
Sarah Jay:
Thanks, Georgia. It's so great to talk to you today. So I started my career as a Spanish teacher. I'm a non-native speaker of Spanish, but being a citizen of the world, I really enjoyed learning a new language, getting to connect with people and places and cultures that were far different from my upbringing on Cape Cod here in Massachusetts. And as part of that work as a Spanish teacher, I really learned a lot about the process of communication, the process of learning: the multiple layers of not just, like, the grammar and the vocabulary, but the implicit meanings and structures and ways people make meaning as they communicate in reading, writing, listening, speaking. So in that work, I really became interested in how we capture what kids know and are able to do. And that led me to think a lot about performance assessments, and then eventually inquiry cycles for continuous improvement of student learning. And in that process, that's kind of how I really became a huge fan of multi-tiered systems of support, which I see as a sort of equity framework: a framework that really ensures that all students have access to high-quality instruction, rigorous expectations, culturally responsive instruction, and that they get what they need, whether as an intervention or an enrichment, dynamically.
Sarah Jay:
And I think that that's all part of, you know, a universal pathway that, that children are able to achieve their goals, but through whatever path or supports they need along the way. I think it's called targeted universalism.
Georgia Bambrick:
Amazing. Thanks. You bring such a unique background and entry point to the work, especially where you are now. And so I think really grounding in what brought you here, and how we're now talking about AI, is really powerful. So if we were to rewind to around 2022, Boston was facing some very real challenges around student achievement. How were you, in your role, and your team experiencing that moment internally?
Sarah Jay:
So at that time, you know, post the big shutdowns of 2020 and 2021, we were really thinking about using this opportunity to restructure as a way of not just, like, accelerating learning to address the instructional loss from the pandemic, but also creating systems and structures that would really situate students to learn and to achieve, you know, the greatness within them, as is our mission. So at that time, we thought about how we would move towards a more inclusive educational environment, bringing students who had perhaps been put in programs that were separate from the general ed curriculum and reintegrating them more robustly within an inclusive educational environment. And that's not something that happens overnight. There's a lot of work that went into that: the adoption of high-quality instructional materials, first in ELA and then in math and then other subject areas, to create a universal Tier 1 experience that would allow for a predictable, thoughtful instructional space for students to be included into, and that created the foundation for inclusion. So really since 2020, we have been working on this goal, and 2022 provided a space where we were more actively naming it and really thinking about, in my work, MTSS as a potential roadmap to create some of those inclusive structures for students regardless of their programmatic definition, I guess, within the system.
Georgia Bambrick:
Yeah. Yeah. And it's interesting to hear you as the leader of MTSS really bring us back to that foundation, right, of the Tier 1 and the high-quality instructional materials at scale as a foundation, and really ensuring that that is happening across a large and diverse city, and what that really means. And I hear you talking about what a critical lever that really was for Boston. Again, not unique to 2022, but as kind of a stake-in-the-ground moment as well. Yes. Okay. So at the same time, Sarah, AI tools are starting to show up all over, right? ChatGPT in teachers' hands.
Georgia Bambrick:
And there's a big moment around 2022 as well. So how did that intersect in that moment with your curriculum implementation efforts? What were you noticing? And what was the impact there?
Sarah Jay:
Well, I will say, you know, the biggest challenge in moving towards curricular adoption for Tier 1 at scale is just that people have to learn new things, and it's hard, right? Like, folks know the standards. We've been working on assessment data to understand where students were performing in relation to standards, but how do you ensure that curriculum, I guess for lack of a better word, becomes the floor and not the ceiling of the student experience? So I was really trying hard in my MTSS implementation to stress that Tier 1 is MTSS, it's the foundation, and that we use universal screeners and curriculum-based measures to understand students' readiness for Tier 1 instruction, as well as the additional nuances they would need in their instruction in order to achieve grade-level standards. So when you ask about AI, I cannot say that I was so aware of AI as a potential pathway to addressing some of these challenges until that field developed more robustly. You know, ChatGPT, again, when it came on the scene, was a really interesting tool to ask questions, to understand the internet. But for data and privacy, you know, as educators, we'd never want to put information in that system that could compromise students' backgrounds, instructional pathways, privacy, period. So it wasn't until later on, as these AI tools really evolved, that we in BPS, I think, started to explore instructional pathways for AI use.
Georgia Bambrick:
So interesting. I remember also working with you all at the time, and actually, you know, we're going to get to how you're leading the way in AI, but I think at the time, AI was actually exacerbating things and, in some ways, making it more challenging to implement with coherence. And I think we're seeing districts, states, and leaders across the country seeing this pop up in different ways when, you know, it's easier to use an AI tool in some cases than to use the curriculum.
Sarah Jay:
Yeah.
Georgia Bambrick:
We hear, like, it's just Teachers Pay Teachers in a kind of different form, almost a faster way, or it can actually exacerbate bias and sort of undermine the cohesive efforts that you're trying to deliver. I remember a specific example that you all talked about, which is that you have a core pillar, right, of daily work with complex text in Boston. And I think it was you or a colleague saying that folks are going to AI and adjusting the reading level of texts. So talk to me a little bit about that, and how AI, even though it was exciting, and there were challenges for sure that you maybe didn't see, was also almost undermining some of your efforts at first. Yeah.
Sarah Jay:
Thank you for bringing that up. That was the real challenge. When you just say "differentiate this" to an AI chatbot, without any acknowledgement of who the students are, you are really running the risk of lowering the rigor or actually watering down, weakening, the intended learning for students. And when you're maybe saying "make this more culturally responsive" without acknowledging who the children in the class are, you are also not actually addressing the instructional need of the students that you're serving. So AI is really incredible in the rapidity and the complexity of the work it can do, but if it doesn't have a person guiding its use who's really being thoughtful about what the chatbot knows and what the intended outcomes are, there is a real risk of lowering expectations and then actually exacerbating opportunity gaps or giving children unequal access.
Georgia Bambrick:
So I think you were in this moment, and again, a very real moment, of implementing high-quality instructional materials, a sort of stake-in-the-ground Tier 1 foundation, and then there's this kind of other thing popping up.
Sarah Jay:
Yeah.
Georgia Bambrick:
And many districts, or leaders, I should say, would probably respond in that moment by trying to restrict or slow down. That's maybe a fairly typical response. You and your team chose a different path, and I remember some conversations about this where you kind of flipped the script a little bit and leaned into AI. Can you talk to me a little bit about some of the leadership conversations that led to that shift and helped you reframe AI from a real risk into a potential asset for Boston and the students of Boston?
Sarah Jay:
Yes. Initially, I saw AI in terms of the things it does well, like finding patterns, seeing information that maybe is a challenge for an individual who's overwhelmed by the quantity of information. I saw this and thought, oh, maybe this is a tool that can help us to deeply understand what the actual needs are within, you know, quantitative data sets. But to do that, you have to have a very secure system where those data sets can exist. So one of the tools we use in BPS is Panorama's Solara tool, and that is, you know, as they say, a walled garden that allows us to put student data in and leverage the data from student success in order to really tightly differentiate instruction, because it knows who the children are and it has multiple data points, both math, ELA, some social-emotional, to guide that thinking. So that was a big piece, sort of having the data element. But I do want to underscore the way that we think about AI in BPS: we have two running metaphors that are really helpful. AI is almost like a really eager student teacher.
Sarah Jay:
A really eager student teacher who has, like, a lot of background information, really wants to do a good job, but really needs a lot of guidance. And so using AI from that lens actually requires the educator to develop their thinking and their precision in how they're going to plan and what they're going to do. I won't overstate it and say it becomes, like, professional development for the educator, but I will say my thinking was it would help codify the thinking patterns that are essential to quality instruction. Like, if you can prompt well in AI, you can really plan well on your own, 'cause you're thinking through what someone would have to know and be able to do to provide really strong instruction. So that student-teacher intern idea is a really strong way of thinking about, okay, I want to educate my children well, so how do I need to hone my thinking? The other metaphor that I think is helpful is the idea of a traditional bike versus an electric bike. A traditional bike, you gotta learn how to do it, you gotta learn how to balance, and it's gonna get you exercise, right? You can see the world as you travel along. That's a really important skill to have.
Sarah Jay:
However, if you want to get somewhere really quickly with limited sweat, an electric bike is a better tool, right? So we've been thinking a lot about, like, okay, we need to ensure that educators can ride the traditional bike, but then the electric bike of AI is the thing that's gonna help them do that work faster and more efficiently, and hopefully reduce the amount of overwhelm. So, kind of bringing this together: I was seeing a real need, in our implementation of high-quality instructional materials and MTSS, to leverage our copious amounts of student data more effectively to meet student needs. And I was seeing AI as a potential pathway to not only clarifying for teachers how they need to think about planning, but also providing them, once they have that clarity, with a tool that would help them get to that goal more quickly and efficiently.
Georgia Bambrick:
I love that. I don't think I've ever heard you use the electric bike metaphor. And not to take the metaphor too far here, but I think something that's notable is that both the electric bike and the manual bike are bikes, right? One thing to layer on that I haven't heard you explicitly mention, that I just want to call out, is that we started by talking about high-quality instructional materials and rigor. And one thing that Boston has done to keep people on the bike, if you will, to continue that metaphor, is to embed your high-quality instructional materials into your AI tool, so that folks are not just moving fast in any direction or on any vehicle, but actually within the pedagogical guardrails and expectations of Boston. We used that text leveling as an example. So talk to me a little bit about the importance of guardrails that are pedagogical in AI. We can go into the text leveling, but I know you've put a number of things in place to, keeping the metaphor going, sort of keep people on a bike, whatever type it is.
Georgia Bambrick:
How have you been thinking about that? We often think about guardrails from a fear-based perspective, and I've been really inspired by how you and the Boston team have been thinking about them from a proactive perspective, to really steer folks toward learning. So I'd love to hear about how you're thinking about that.
Sarah Jay:
Yeah, I appreciate you saying that, because I hope it doesn't come across as super Big Brother, but we do have guardrails in place, because something I'm very aware of as a district leader is that anything I promote or suggest can be misread if someone isn't having a conversation with me about it. If I'm, for example, promoting the Solara tool, and the Solara tool gives advice counter to sort of the BPS way of doing things, educators might assume that, well, that's what the district, quote unquote, is saying now. You have to be super careful that messages are not confused between, you know, our instructional expectations, our core values, and the outputs of an AI chatbot, for example. So that was something that was really on my mind when it came to crafting tools for our educators, for our staff: that they would see alignment between what we were espousing to be true and the actual output and support that educators were receiving. So for example, you know, we had mentioned the text leveling. We don't want to level text, we want to provide kids access to grade-level text, so we were able to ensure that our Solara system doesn't allow for text leveling. We were able to ensure that the language for how we discuss students is asset-based, and that many of our curricula and our strategies for supporting students and whole-child development were included, you know, in terms of the language and the actual practices. And even within these little AI curriculum tools that we've built, we were able to prompt them to always refer back to the standard addressed, so that any output would show educators exactly how that standard was going to show up in the lesson, and how the potential scaffolds or enrichments offered by the AI were in alignment with that standard, just to allow for a gut check, so that we weren't, as you were saying before, able to spiral out and go beyond the scope of what we were hoping to achieve in Tier 1 instruction.
Georgia Bambrick:
Yeah, it's been really inspiring to watch, and I know it doesn't feel Big Brother. I think one word I'm trying to use instead of guardrails is steering, right? Steering towards excellence. And ultimately the process that can be learned, that we're seeing replicated, is starting with defining excellence, then validating, and then continual refinement, because there are so many components to excellence, right? And so one of the beauties of AI, in addition to the accelerant and the quick content generation and everything we've named already, is this ability to do seamless change management. And actually, you know, you framed it as Big Brother, but I think folks are also asking for it. They want their outputs to be aligned. They want the output to meet the expectations being set forward. It's confusing if they're not. And so being able to steer the output is just so important.
Georgia Bambrick:
So yeah, it's really exciting. And I think it's also replicable in other ways beyond just instruction. All right. I'm going to switch a little bit to hear about your professional development. We've talked a bit about the product. We've talked about some of the learning too. How did your professional learning and intentional use cases play a role in the rollout of this across Boston?
Sarah Jay:
Yeah, it's really interesting. So I am kind of mostly interested in how we're using AI, for example, in the MTSS world, but I will briefly talk a little bit about these really exciting ways that my colleagues are thinking about AI. Like, we have colleagues who are thinking about improving operational systems in the district. We have colleagues who are thinking about how they're supporting students in multiple different tools or configurations or pathways. So there's a lot that's happening in the district. For me, since I was working in this MTSS space, I really wanted to think about how I could use AI, in this case the Solara tool, to support not only Tier 1 instruction, but also Tier 2 and Tier 3 interventions, if you will. Because I had an ultimate, huge goal of having a data set that would allow me to think about how BPS was meeting the needs of students at the systems level. So I wanted to make sure that we were really doing the best we could at the individual level in order to think about the systems level, and have a dynamic way of really interrogating our systems and ensuring that we were doing right by our students for whatever they need, individually and broadly.
Sarah Jay:
So I'm fortunate in that, as part of my role, I have stipended educators in every single BPS school to be a coordinator of MTSS. This is not their whole job; this is in addition to their roles as social workers, instructional coaches, classroom teachers. You know, they're just people who are passionate about bringing these systems to their schools. I, with your help, Georgia, used those folks as almost like pilot users of the program. We introduced it to them and gave them an idea of, like, what is AI, in collaboration with my colleague Rhiannon Gutierrez, who, with her team in our instructional technology department, really thinks through how we get educators to understand what AI is and isn't. That built a really great baseline understanding of the tool. And then we thought through what the potential use cases are. So we spent about half a school year last year really having folks try out AI, try to build intervention plans, give feedback, think it through.
Sarah Jay:
And that even led to some folks joining an AI fellows program that my colleagues Rhiannon and Tony were overseeing, as a way of just seeing what educators were going to do with AI, and we got some amazing feedback from folks. We had folks who were using AI to really differentiate mathematical lessons, like my colleague Amelia did. We had folks using it to think through how reading data could be used to create differentiated small groups that are skills-focused but also lead to comprehension. Just a myriad of different ways the tool could benefit teaching and learning. And then from there, we also thought deeply about this curricular adoption and how AI could be supported with the actual curricula that BPS implements across schools. From there, we did some test cases around building these little AI curriculum chatbots, having teachers try them out and give feedback, to the point that by the beginning of this school year, we were able to universally provide access to Solara and have a couple of different pathways for educators to learn about the tool: the demystifying-AI piece, the general how-to-use-Solara piece, and then, from there, we're really refining our professional development to get into not just "this is how you use the chat, this is how you use the tools," but "this is how you deeply plan for your curriculum, this is how you use the intervention planning tool to help you think through what a child might need."
Sarah Jay:
And we're getting to the point where we're really thinking about how we support educators in that prompting, that talking back to the program, thinking through what their desired outcomes are. And that is all really exciting to me, because I see that as our use of the platform improves, the quality of the plans and the supports we put into place for students improves. And then there's my big goal of stepping back and seeing, okay, who are we serving? How are we serving them? Where do we need to allocate resources, time, and further PD? I feel like I'm getting closer to that goal, that systems goal of ensuring that everything is doing what we intend it to do. And if it's not, what can we learn and how can we further improve?
Georgia Bambrick:
Yeah, I think what's really striking about the way that you, your colleagues, and your team have approached professional development around AI in Boston is that, yes, there's some foundational work that Rhiannon and team have done about AI literacy and the basics, but ultimately it's not another thing. You're embedding it into your core goals and outcomes. And I heard you talk about those pilots and those narrow use cases up front that actually allowed you to do that validation. Because without the narrow use cases, you know, we've talked about how it's impossible to validate everything, right? I mean, quite honestly, it is. And so I hear a lot from district leaders: I don't have any more PD time, or how can we fit AI into this already packed calendar? And one of the things that's really powerful that you all have done is make it not another thing. It is part of the workflows, part of the work. And I think that's a really powerful takeaway that you might not see while you're in it, but I want to make sure I call out what you're sharing.
Georgia Bambrick:
And so, you know, you've shared a lot of takeaways here, but what advice would you have for another district leader who kind of feels overwhelmed by AI implementation? What would you want them to take away from Boston's experience?
Sarah Jay:
Well, I think it's important to have a good team, and a cross-functional team: people who maybe see the same challenges, but from different perspectives, and have different potential solution pathways, because having that team come together allows you to really grapple and think things through. I will just briefly say, I was like, Solara is it. It's it. That's it. We're done. And then it's helpful to have other people say, okay, but you know, it has to be the right tool for the right task. Like, how do you narrow and get really clear on intentionality and use case, and make sure that people are equipped to choose the right tool for the problem?
Sarah Jay:
I want to be very aware that I haven't really talked about student use of AI, because we really thought, okay, educators need to understand this deeply, and then that's going to help us understand how students are utilizing this. And we're going to go through the same routine, for lack of a better word, with students as we did with adults: what are the potential uses? How might we use this? What are your recommendations? To kind of co-create with students rather than do to students. And again, that requires a team of people to think it through. So the idea of a cross-functional team guiding the work, trying things out, creating these use case pathways with educators, and getting feedback in sort of a dynamic give and take allows policies and implementation guidance, when they're put forth, to be feasible and responsible and really grounded in the lived experience of the people who are going to be enacting the work. So, like, that is really important. And I think ultimately my big advice would be: what is your goal? Is your goal that all kids learn? Is your goal that every child unlocks the greatness within them? That's ours. If that's your goal, then you have to keep it as your north star and ensure that your AI use is supporting you to move toward it. Because, I mean, I'll admit it, it's sometimes really tempting to use the cool new thing and not have that cool new thing get you closer to your goal. Yeah.
Sarah Jay:
So you have to invite the voices in, you have to encourage critical feedback, and you just have to pressure test all along the way.
Georgia Bambrick:
I love it. I couldn't agree more. And I think it's really inspiring the way you all have set the bar of excellence systematically, really demanded transparency, and had the right people in the room. And that's not easy, right? It's not easy to have the instructional leader leading the instructional AI work; oftentimes it's the technology folks. And so, you know, you want to make sure you have the right people, SPED leaders leading SPED work, right? Getting the right folks to really validate towards outcomes is so powerful. So, there is so much that we could learn from you. We could talk forever.
Georgia Bambrick:
So, thank you. I hate to end here, but thank you so, so much for sharing Boston's story so openly. I think what stands out in this conversation is not just the strategy, and not just some of these technical and adaptive decisions you've made, but really the leadership choices behind them at key moments: the decision to center instruction, to be clear about steering and guardrails, and to trust educators while staying focused on student learning.
Sarah Jay:
Thank you, and I can't thank you enough for your partnership in this, Georgia, and for Caitlin and the team. It is always helpful to have folks who will provide critical feedback as well and not just be cheerleaders, but also, you know, push us to do better thinking. So I'm really excited to see where this work can go.
Georgia Bambrick:
Thank you so much. And I'm really excited too. In our next episode, we're actually gonna continue this story, but from inside schools and classrooms. Sarah talked a bit about some of the incredible work that's happening with educators on the ground, directly with students in classrooms, in coaching cycles. And so we're gonna get the opportunity to hear from Boston educators who have really been a part of this work. They'll share what this shift has meant for their day-to-day practice, how collaboration has changed, and what it actually feels like to use AI in ways that align with curriculum and student needs, and to be on that electric bike, if you will. And so if you're a district or school leader wondering how system-level decisions really show up in real classrooms, and what that impact is, you won't wanna miss that next conversation. So thank you again so much for listening to the Leading and Learning with AI podcast, and make sure to hit that follow, and we'll see you next time.