Education
Empathy AI Team

Edge AI for Education: Hands-On Workshop on AI Governance

Visit Empathy AI's net-zero GPU infrastructure in Asturias. Build self-hosted AI on your own curriculum materials in a one-day workshop. No code required.

AI is already in education. Students reach for it to explain concepts, summarise texts, and work through problems. Teachers use it to prepare materials and explore new approaches.

The question stopped being 'should AI be in schools?' some time ago. The question now is whether the AI being used was built for education or just available to it.

Most commercial AI tools schools use today were designed for scale, not classrooms. They run on infrastructure that no faculty member can inspect, are trained on data no school selected, and are sustained through data-collection models that raise serious consent and privacy concerns for minors. And the problem goes further with algorithmically driven platforms that are optimised for engagement rather than well-being, built to hold attention rather than develop it.

MOLLY vs. THE MACHINES, a documentary Empathy AI co-produced, follows this exact story: how algorithmic systems shaped for commercial purposes can profoundly affect the lives of pre-teens and teenagers who are still in the process of forming their understanding of themselves and the world.

That same architecture, built for engagement, opaque by design, governed by someone else's incentives, is now being offered to schools. Recognising it is the first step. Knowing what the alternative looks like is the next one. You can explore the broader thinking behind this shift, from using AI tools to governing infrastructure, through the Edge AI in Education article.

That gap is what led us here. Not just to write about the problem, but to do something about it. Building private, domain-specific edge AI for education is the goal, but knowing where you want to go isn't the same as knowing how to get there. So we've created this workshop to turn that thinking into practice. It's your 101 on Edge AI for Education.

What is the Edge AI for Education workshop?

This is a one-day, on-site workshop that takes you behind the screens of AI.

At its core, the workshop is about empowering educators through AI and exploring pedagogical possibilities. It's an in-person experience at Empathy AI's headquarters in Gijón, Asturias, in the north of Spain, inside the region's first net-zero-energy bioclimatic building.

The in-person format is deliberate. One of the central arguments for edge AI in education is that AI should be something the educational community can see, understand, and hold to account, and not a black box running somewhere out of reach.

We put that thinking into practice from the moment you arrive at our building, where solar panels on the façade power the GPUs inside. You walk through the actual server infrastructure and can see and touch the hardware itself.

You meet the engineers working on it. You get hands-on with workflows that put educators in control of AI: from feeding it curriculum materials and shaping how it behaves in line with educational values, to reviewing its outputs and learning how to evaluate it.

Being face-to-face with the physical reality of AI, the racks, the machines, the people, the energy infrastructure behind it, is something no slide deck can replicate. It makes the abstract tangible, changing how you think and talk about AI.

And crucially, that understanding is what makes it possible to teach students to think critically about AI too. You can't explain what you haven't seen.

Empathy AI's values of responsible AI, privacy-first design, sustainability, and transparency run through every session. All materials are produced for a private, on-premise environment, and the workshop is designed to comply with AI sovereignty principles from start to finish.

This workshop also aims to serve as a safe space to discuss AI in education honestly: the concerns, failures, possibilities, and the questions that don't yet have answers.

Who is the workshop for?

The workshop is designed for teachers, educators, school administrators, and EdTech coordinators: anyone in the education community who makes decisions about how AI is used, teaches students to engage with it responsibly, or is trying to establish what responsible AI use means at an institutional level.

No technical background is required. The goal isn't to turn you into AI engineers. It's to give you the direct experience, strategies, and working vocabulary to govern AI, rather than simply consume it.

What does the workshop cover?

The workshop runs across one full day, structured in five phases that build on each other.

  1. Demystifying AI
    The day opens with a welcome circle, a structured conversation where everyone shares what they teach or support, their current challenges or concerns about AI at school, or any other questions they want to bring into the room.
    It's a simple format, but it helps us surface the real concerns before any technology is introduced and sets them as the starting point, not an obstacle to get past.
    From there, we work through the foundational distinction that everything else builds on: the difference between cloud AI and edge AI and what that difference specifically means for institutional autonomy in education. Not as an abstract technical comparison, but as a practical question: who controls what, who can see what, and who's accountable when something goes wrong.
  2. Tech tour and case studies
    This is where the abstract becomes physical. You walk through Empathy AI's facilities together with our AI engineers: the server racks, the GPU infrastructure. You see the hardware that a self-hosted AI system actually runs on, and you can ask the questions that rarely get answered in vendor presentations, with direct answers from the engineers who built it.
    From the infrastructure, the session moves into case studies of what that infrastructure makes possible in practice. You see real projects running on Empathy AI's systems and explore how they could translate into educational contexts. Projects like Bring the Book to Life—Project Gutenberg AI or the Book of Books, for instance, can help students engage with literature through meaning, themes, and narrative, a fundamentally different relationship with texts than most discovery tools allow.
    The case studies are designed to open up possibilities, not prescribe solutions: the goal is for you to leave with a clearer sense of what becomes achievable when the infrastructure is yours to shape.
  3. Building
    After lunch, the workshop shifts to practice. You work directly with the system: defining what values an AI should prioritise for a learning context, uploading sample curriculum materials, and testing how the AI responds to student-like questions.
    You can evaluate the outputs, including the failures. Spotting hallucinations, identifying gaps, and recognising bias in a system you've just helped configure is a different experience from reading about those risks in a policy document. It gives you the practical vocabulary to make informed decisions about any AI system you encounter later.
  4. Master prompting
    With that hands-on experience in place, the workshop moves into a conceptual walkthrough of how these systems actually work, with no coding knowledge required: how AI remembers documents, how a query travels through the system, how it signals uncertainty rather than inventing an answer, and how it cites sources.
    This session is about building your capacity to shape AI behaviour through prompting and configuration, without writing a single line of code.
  5. Reflecting
    The day closes with you thinking through what a self-hosted AI system could look like in your own institution. What surprised you? What materials would feed it? What would it prioritise? Who in your educational community would have a role in shaping and governing it over time? What concerns remain? What possibilities do you see?
    This is where the workshop becomes specific to the people in the room, and where the questions that emerged throughout the day become something worth acting on.
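The flow described in the "Master prompting" walkthrough, remembering documents, routing a query, signalling uncertainty, and citing sources, can be illustrated with a deliberately tiny sketch. This toy uses word overlap instead of a real embedding model, and every name in it (the functions, the sample passage, the source id) is illustrative, not part of any actual system:

```python
# A minimal sketch of a retrieval-grounded answer flow, using toy
# word-overlap scoring in place of a real embedding model.
import string

def words(text):
    """Lower-case a text and strip punctuation from each word."""
    return {w.strip(string.punctuation) for w in text.lower().split()}

def chunk(doc_id, text, size=30):
    """Split a document into small passages, keeping the source id."""
    tokens = text.split()
    return [(doc_id, " ".join(tokens[i:i + size]))
            for i in range(0, len(tokens), size)]

def retrieve(query, passages, min_overlap=2):
    """Pick the passage sharing the most words with the query."""
    scored = [(len(words(query) & words(p)), src, p) for src, p in passages]
    best = max(scored, default=(0, None, None))
    return best if best[0] >= min_overlap else None

def answer(query, passages):
    """Return a cited passage, or decline instead of inventing one."""
    hit = retrieve(query, passages)
    if hit is None:
        return "I don't have curriculum material covering that."
    _, src, passage = hit
    return f"{passage} [source: {src}]"

passages = chunk("biology-unit-3",
                 "Photosynthesis converts light energy into chemical "
                 "energy stored in glucose, releasing oxygen.")
print(answer("How does photosynthesis store energy?", passages))
print(answer("Who won the league last night?", passages))
```

The second query deliberately has no match in the material, so the system declines rather than invents, the same "signal uncertainty" behaviour the session examines at full scale.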

What do you take away?

This workshop is hands-on by design, and so are its outcomes.

By the end of the day, you have a working understanding of how self-hosted AI works, how to train an AI on curriculum materials and resources your institution trusts, how to run it locally with no external data transmission, and how to prompt and configure it to reflect your pedagogical values and educational standards, without writing code.

More broadly, you leave with a clearer framework for keeping student data within your institution, reducing reliance on commercial platforms, and making AI governance a real, ongoing practice rather than a procurement decision made once and forgotten. You discover a different model of AI infrastructure, one where the educational institution is the operator, not just the customer or consumer.

The concerns that brought you to this conversation don't disappear after one day. But they become navigable once you've seen what the alternative looks like, talked to the people building it, and tested it with your own materials.

Who leads the workshop?

The workshop is led by Ángel Maldonado, founder and CEO of Empathy Holdings, and Pablo Cañal, Head of Product at Empathy AI, alongside the engineers who design and build Empathy AI's infrastructure day to day.

You're not sitting in a classroom listening to a presentation; you're working alongside the people who built the systems you're learning about, in the building where those systems run.

This is a workshop, not a masterclass. It's deliberately participative.

Your questions, your concerns, and your specific institutional context are what shape the day. The more you bring to the room, the more you leave with.

Interested in bringing this workshop to your institution or network? Get in touch to learn more.

Frequently Asked Questions

What is edge AI, and why does it matter for schools?

Edge AI runs on local or institutional infrastructure rather than remote commercial servers. For schools, this means student and teacher data stays within the institution's own systems, nothing is transmitted externally, and administrators retain full visibility and control over how the AI operates and what it stores.
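In practice, much of that distinction comes down to where a request is sent. A purely illustrative sketch (both host names are hypothetical, not real endpoints):

```python
# Illustrative only: hypothetical endpoints showing where requests go.
CLOUD_ENDPOINT = "https://api.example-vendor.com/v1/chat"  # vendor-operated
EDGE_ENDPOINT = "http://ai.school.internal/v1/chat"        # school-operated

def stays_on_premises(endpoint):
    """True when the request host sits inside institutional infrastructure."""
    host = endpoint.split("//", 1)[1].split("/", 1)[0]
    return host == "localhost" or host.endswith((".internal", ".local"))

print(stays_on_premises(EDGE_ENDPOINT))   # the school can inspect this path
print(stays_on_premises(CLOUD_ENDPOINT))  # this request leaves the institution
```

With the edge endpoint, the institution controls the machine the request reaches; with the cloud one, it does not.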

Who should attend this workshop?

Teachers, educators, school administrators, and EdTech coordinators. The workshop is for anyone who makes decisions about AI in educational settings, teaches students to engage with it critically, or is responsible for establishing institutional AI governance. No technical background is required.

Does the workshop require any technical expertise?

No. All sessions are designed for educators without a technical background. The goal is a working conceptual understanding of how self-hosted AI systems operate: enough to govern them, define their limits, and evaluate their outputs, not to train engineers.

What workshop formats are available?

The workshop is designed as an in-person experience at Empathy AI's headquarters in Gijón, Asturias. Standing in front of the infrastructure and talking to the engineers is central to the experience. For institutions where travel is not feasible, alternative formats may be explored.

Is this only relevant for schools already planning to adopt AI?

No. It is equally relevant for schools trying to understand what they have already adopted, institutions sceptical of AI, and those wanting governance frameworks before any deployment. The most useful outcome is the capacity to evaluate any AI system against educational and ethical criteria.

Can teachers configure AI for specific subjects without coding?

Yes. Defining what an AI system draws from, the tone it uses, and the questions it declines to answer requires clarity about educational priorities, not coding expertise. The master prompting session gives educators exactly that capability, with no code involved.
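As a sketch of what could sit behind such a form, educator choices can be composed into plain-language behaviour rules. The field names and wording here are hypothetical, not Empathy AI's actual interface:

```python
# Hypothetical sketch: composing educator choices into behaviour rules.
# Field names and the template are illustrative, not a real interface.

def build_behaviour_rules(subject, tone, sources, declined_topics):
    """Turn form-style choices into instructions an AI system would follow."""
    return "\n".join([
        f"You are a study assistant for {subject}.",
        f"Answer in a {tone} tone.",
        "Use only these curriculum sources: " + ", ".join(sources) + ".",
        "Decline questions about " + ", ".join(declined_topics) +
        " and suggest asking a teacher instead.",
        "If the sources do not cover a question, say so rather than guess.",
    ])

rules = build_behaviour_rules(
    subject="secondary-school chemistry",
    tone="encouraging, step-by-step",
    sources=["Unit 4 handout", "lab safety guide"],
    declined_topics=["graded exam answers", "medical advice"],
)
print(rules)
```

Every choice here is pedagogical, which sources count, which tone fits, which topics are out of bounds; none of it requires code.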

