
…possibly which factors were influential) when reaching a particular decision. For example, some of today's adaptive classroom products use limited recommendation models that consider only a student's success on the last three mathematics problems and ignore other variables a teacher would know to weigh, such as whether the student has an IEP or other needs.

Our call for attending to equity considerations as we evaluate AI models requires information about how discriminatory bias may arise in particular AI systems and what developers have done to address it. That information can only come with transparency about how a tool uses datasets to reach its outcomes, what data it has available, and what data a teacher could bring to her own judgement but is not available to the system (IEP status, as in the example above).

Teachers will also need the ability to view and form their own judgement about automated decisions, such as which set of mathematics problems a student should work on next. They must be able to intervene and override a decision when they disagree with the logic behind an instructional recommendation, and they need protection against adverse ramifications when they assert human judgement over an AI system's decision (a hypothetical sketch of such a transparent, overridable recommender follows the quotation below).

"These systems sometimes are seen as a black box kind of a situation, where predictions are made based on lots of data. But what we need is to have a clear view: to clearly show how those recommendations or those interactions are made, and what evidence or what data is used to make those recommendations, so teachers and everyone involved know why that kind of system is providing that type of…"
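To make these requirements concrete, here is a minimal sketch in Python, not drawn from any real product: the names (StudentRecord, recommend, teacher_override), the three-score rule, and the 0.6 threshold are all invented for illustration. It shows a narrow recommender of the kind described above that reports which factors it used and which it could not see, plus an override path that records the teacher's judgement rather than hiding it.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """Hypothetical student data. The model below sees only
    recent_scores; has_iep is known to the teacher but is NOT
    available to the system, mirroring the gap described above."""
    name: str
    recent_scores: list[float]  # outcomes on recent problems, 0.0-1.0
    has_iep: bool

@dataclass
class Recommendation:
    problem_set: str
    factors_used: dict              # inputs that drove the decision
    factors_unavailable: list[str]  # data the model could not see
    overridden_by_teacher: bool = False

def recommend(student: StudentRecord) -> Recommendation:
    """A deliberately narrow model: it considers only the last
    three scores, like the limited products described above."""
    window = student.recent_scores[-3:]
    avg = sum(window) / len(window)
    return Recommendation(
        problem_set="review" if avg < 0.6 else "advance",
        factors_used={"mean_of_last_3_scores": round(avg, 2)},
        factors_unavailable=["IEP status", "teacher observations"],
    )

def teacher_override(rec: Recommendation, new_set: str) -> Recommendation:
    """The teacher inspects factors_used and factors_unavailable, then
    substitutes her own judgement; the override is recorded, not hidden."""
    rec.problem_set = new_set
    rec.overridden_by_teacher = True
    return rec

if __name__ == "__main__":
    s = StudentRecord("A.", recent_scores=[0.4, 0.5, 0.7], has_iep=True)
    rec = recommend(s)
    print(rec)                # teacher sees what was (and wasn't) used
    if s.has_iep:             # information the model never had
        rec = teacher_override(rec, "scaffolded_review")
    print(rec)
```

Surfacing factors_unavailable alongside factors_used is the design choice that turns the "black box" of the quotation into something a teacher can inspect, contest, and override without penalty.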


