Explainable AI is essential in education


Edtech adoption in our schools has increased during the pandemic, and if reports are to be believed, the adoption of Artificial Intelligence (AI) has steadily increased too. This is largely a positive development, but do principals and teachers really understand the technology? AI is often adaptive and self-learning, which means that what it already understands about a learner leads it to draw conclusions about that learner's future needs.

As more schools use AI-powered technologies, it becomes increasingly important that teachers understand how the technology makes decisions. Teachers need to understand not only what a child has learned, but also how they have learned it. For this to be possible, vendors of AI-enabled technology need to explain how it chooses a particular course of action. For example, an AI teaching a foreign language might advise you to revise some words more often than others, based on data about which words you forget the fastest.
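To make the idea concrete, here is a minimal sketch of what an explainable version of that word-revision logic could look like. The exponential forgetting-curve model, the `stability` parameter, and the recall threshold are all illustrative assumptions, not a description of any real product; the point is that every selection comes with a human-readable reason.

```python
import math
from datetime import datetime

def retention(days_since_review: float, stability: float) -> float:
    """Estimated probability of recall under a simple exponential forgetting curve.

    `stability` (in days) is an assumed per-word parameter: higher means the
    word is forgotten more slowly.
    """
    return math.exp(-days_since_review / stability)

def words_to_revise(words, now, threshold=0.5):
    """Return (word, reason) pairs for words likely forgotten.

    Each chosen word carries a plain-language explanation, so a teacher can
    see exactly why the system picked it.
    """
    chosen = []
    for word, last_review, stability in words:
        days = (now - last_review).days
        r = retention(days, stability)
        if r < threshold:
            reason = (f"last reviewed {days} days ago; "
                      f"estimated recall {r:.0%} is below {threshold:.0%}")
            chosen.append((word, reason))
    return chosen

# Hypothetical vocabulary data: (word, last review date, stability in days)
now = datetime(2021, 12, 1)
vocab = [
    ("bonjour", datetime(2021, 11, 28), 10.0),   # reviewed recently, retained slowly forgotten
    ("serrurier", datetime(2021, 11, 20), 3.0),  # forgotten quickly
]
for word, reason in words_to_revise(vocab, now):
    print(f"revise '{word}': {reason}")
```

The design choice worth noting is that the explanation is generated from the same quantities the selection rule actually uses, so the stated reason can never drift away from the real decision logic.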

AI ambiguity

It can be difficult to explain how AI programs make decisions or why they do what they do. Even when an explanation exists, it can still be unclear. A statement like "the AI made this decision because past data suggests it is the optimal choice" will not help a teacher explain to a student why they are now being asked about a topic they believe they have already mastered.

This form of AI, often referred to as "black box AI," can be incredibly powerful and effective. But education is too important to be entrusted to an opaque black box; instead it needs explainable AI.

Explainable AI is not a new concept in AI circles; indeed, the EU is putting forward new rules in this area. But its meaning needs to be understood by school leaders and teachers. Explainable AI means that the outputs of an AI-driven system can be understood by humans. This is in contrast to black box AI, where even the developers of a tool sometimes cannot explain why an AI program made a particular decision. The choice made by the AI need not always be the same as the one a human would make, but a human should be able to understand the process by which the decision was made.

AI should improve teachers, not make them obsolete

For classroom tools like Sparx, this means that a teacher using AI should be able to ask, "Why is this student being asked this question?" and get a clear answer. If a tool can't provide one, I wonder whether it has a place in the classroom. Teachers bring professional judgment and knowledge of their students that AI will always lack. Effective and understandable AI must involve teachers and their expert knowledge.

The growth of AI in schools is to be welcomed; it can save teachers time and provide a level of personalized learning that would be challenging for any teacher. However, we need to make sure it is built on principles that allow it to be trusted over the long term. Teachers make decisions in the best interests of their students, and they need to be confident that any AI tool will reinforce, rather than undermine, those decisions.

I encourage school principals to challenge edtech companies using AI to describe how a teacher can understand, at the level of an individual student, why the AI selected a particular question or resource. It's a simple question, and the answer should be easy to understand. If it is, you can be sure that they are committed to explainable AI.
