Should we put cameras in every classroom? At first glance, this may sound like a dystopian nightmare set in a surveillance state. And yet, with careful deployment and strong regulations in place for data use, I argue that such an approach could help us avoid a future where students constantly have their heads buried in a device. At the same time, it could enable teachers to get the feedback they currently lack to make continuous improvements to their teaching and to the classroom environment—and even to do their own action research. Researchers and practitioners in learning science, privacy, and ICT must come together to build frameworks that will protect students and teachers, and shape this positive future. In this article, I discuss why this approach is needed, who benefits from it, and how we're building systems that improve education.
Herb Simon, the godfather of artificial intelligence and a multi-award-winning teacher, observed the following truth about the teaching and learning process: "A teacher can advance learning only by influencing what the student does to learn."
Influencing what the student does is, at heart, a cycle of detection and intervention—observing and interpreting before acting. To be able to provide the right intervention that guides the next steps in the learner's development, an instructor must know quite a lot about what is happening in any given learning situation.
This detection process is hard. In the moment, it's difficult to know what is going on in a learner's mind. As a learner struggles with material, cognitive processes can be opaque. And of course there's much more to learning than just cognition; a bored look may mask frustration, or a lack of confidence in one's own abilities that makes it impossible to move forward.
In the longer term, even when we do perceive the things happening in the learning environment, our memories by nature are limited. For instance, in my lab we have found that, after class, college professors have a hard time accurately remembering how many questions they've asked during a session and how many questions the students actually answered. By the time an instructor has a chance to slow down and reflect—say, while planning activities for the next day—even simple actions that were accurately detected in the moment may have slipped away.
Besides being limited, our memories are biased. Nathan and Koedinger's concept of the "expert blind spot," or the difficulty that experts have in putting themselves in the position of a novice [1], explains one way in which we revise our perspective on a situation as we go. This concept was developed from the finding that mathematics teachers made incorrect judgments about the difficulty of story versus equation problems for young students (contrary to popular opinion, in many cases story problems are easier!). Even more surprisingly, they then found that elementary school teachers were better at predicting how their students would do than high school teachers. The more knowledge the teachers had, the harder it was for them to imagine the cause of the struggle their students were facing.
Having access to accurate, detailed data about the learner could reduce these problems. It would alleviate the heavy burden of doing constant detection for every student in the class, offering instructors the ability to reflect and better intervene based on a meaningful understanding of the situation. If we know that a learner is bored, we can switch up the activity or make the problem they are working on more challenging. If we can sense the exact misconception that the learner is having, we can give them a different explanation of the underlying principle or help them find a new path to discover it themselves. If we know that a student is low in confidence, we can offer assurances and build trust with a mentoring relationship. Teachers are of course continuously processing the scene for cues, including from their students' faces, bodies, verbalizations, and prior work, over time becoming experts at doing this detection for any individual. But can you imagine the challenge of doing this for 25, 40, or 200 learners at a time? Frequent, consistent data has historically been hard to get and store for an entire group of learners in a chaotic environment.
In part, we have a solution to this problem. With the advent of digital technology in the late 20th century, the most practical way we have devised to create this data-rich detection environment has been to put learners on a device. When learners are working on a computer, phone, or tablet, we can collect a plethora of data points about what they are doing on a second-by-second basis. Everything from every attempt at answering a problem, to pauses and hesitations, to the resources they access along the way can be captured in this manner.
With this data, machine learning algorithms can analyze students' learning patterns and, once a phenomenon has been detected, help interpret the situation. For instance, we've successfully been able to detect when a learner is truly struggling with a concept versus when they are attempting to "game the system" and avoid doing the hard mental work (see Baker et al.'s work in this area [2]). We've been able to detect when students are appropriately asking for help and when a teacher or peer should intervene. Researchers in the fields of learning analytics, educational data mining, and others have used these approaches to produce many important results that further our understanding of the learning process.
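To make the idea concrete, here is a loose sketch of the flavor of such detection—not Baker et al.'s actual model, and with an invented log format and made-up thresholds—that flags a burst of rapid, hint-heavy guessing as possible "gaming" rather than genuine struggle:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Attempt:
    timestamp: float   # seconds since the session started (hypothetical log field)
    correct: bool
    used_hint: bool

def looks_like_gaming(attempts: List[Attempt],
                      window: int = 5,
                      max_gap: float = 2.0) -> bool:
    """Heuristic only: a run of `window` consecutive attempts, each arriving
    within `max_gap` seconds of the previous one, where nearly all are wrong
    or hint-assisted, suggests systematic guessing rather than real effort."""
    for i in range(len(attempts) - window + 1):
        burst = attempts[i:i + window]
        # Every gap in the burst is short...
        rapid = all(b.timestamp - a.timestamp <= max_gap
                    for a, b in zip(burst, burst[1:]))
        # ...and almost every attempt is shallow (wrong or hint-assisted).
        shallow = sum(1 for a in burst if not a.correct or a.used_hint)
        if rapid and shallow >= window - 1:
            return True
    return False
```

Real detectors in this literature are trained classifiers over much richer features; the point here is only that second-by-second device data makes such patterns visible at all.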
These positive results might lead one to argue that we should require more and more of the learning process to take place using technology. However, a lot of learning doesn't happen on a device—and in fact, we may not want to relegate learning to a device. For instance, while discussion, collaboration, and other group-based activities can be conducted online when necessary, the bonds that form when learners work together in person and engage with tangible learning materials can be lost.
And yet, valuable interactions in the physical world have been hard or impossible to capture. So what can we do if we want to achieve the outcomes that digital data can give us but still allow learners to explore in freedom, untied to devices?
I argue that the future could lie in off-device sensing, where instead of instrumenting learners we instrument the physical space, leaving learners free to roam about the class. In theory, with the right technological advances, such sensors could detect all kinds of things that we want to observe about the learner—their conversations with other learners, the objects they interact with, their movements around the space, and more.
In fact, this type of work has been happening on an individual scale, in which a learner sits in front of a computer that's doing the detecting. Researchers like Sidney D'Mello and Art Graesser have found that with a simple camera we can accurately detect frustration, confusion, wandering minds, and other facets of the learner's experience [3,4]. Others, such as Louis-Philippe Morency, have built toolkits to allow researchers to explore multimodal sensing, incorporating aspects of the voice as well as facial expression and gesture to paint a more comprehensive picture of what's happening in the physical world [5].
But we're also moving toward the capability to do such detection at room scale. Researchers such as Pierre Dillenbourg have explored instrumenting the teacher rather than the students, hanging cameras and other sensors around the teacher's neck to observe what they see in class, including motion and student gaze. Industry has also taken up the challenge of classroom-level sensing. Companies such as EarShot and TeachFX have focused on audio rather than visual sensing, investigating the questions and other dialogue moves happening in class that are not captured by typing on a device.
My own lab, along with a large team of interdisciplinary collaborators, has now instrumented 40 classrooms at Carnegie Mellon University with classroom-sensing systems that use cameras and microphones to research the detection of behaviors in class. On many of our target classroom behaviors we have already found that we can achieve high accuracy in detection [7]. This is currently one of the largest such systems in the U.S., and it could have enormous benefits for understanding learner behavior and, in turn, providing learners with better support.
So far, I've been painting a rosy picture of what could happen if classrooms move toward a future of technology-aided detection. As I noted above, however, that is not to say that this approach—which has the potential to know so much about the learner—does not at the same time generate legitimate concerns. Several classroom-camera scenarios playing out in the real world right now have raised the visibility of these issues. In Delhi, India, government schools will have 120,000 CCTV cameras deployed across more than a thousand schools by this November. The main goal of this system can be construed as surveillance: It will allow parents to observe what is happening in their child's class throughout the day via an app on their phone, privileging parental rights to security over the rights of teachers and children to privacy.
Figure: View of the instructor's teaching from the rear ClassInsight camera.
Figure: Screen captures of ClassInsight interfaces for exploring detailed views of the data.
In schools in China, cameras have been installed that go beyond allowing observation to actually doing the detection I discussed above. Specifically, they detect facial expressions to process student attention and emotion. One student interviewed after the system was installed at his school told reporters from The Telegraph, "Previously when I had classes that I didn't like very much, I would be lazy and maybe take naps on the desk, or flick through other textbooks, but I don't dare be distracted after the cameras were installed in the classrooms. It's like a pair of mystery eyes are constantly watching me" [8]. One goal of this sensing is to increase attention by eventually docking grades for students who don't appear engaged in class activities.
Neither of these scenarios is currently providing a safe environment in which all of the stakeholders involved feel free to learn, explore, and grow. Without any frameworks in place to guide the use of such a system, what can we expect the future of classroom sensing to look like? Will all mistakes be recorded for eternity, leading to a debilitating fear of failure? Will learners be poked and prodded to concentrate if their mind briefly drifts during class? And remember the difficulty of knowing what is happening in the learner's head: What happens if the technology inaccurately detects what the learner is doing? Will learners who don't look the same as the majority be forced to conform in order to be served equitably?
The alternative paradigm I propose still takes advantage of the data-forward approach of classroom sensing. What if, once we are able to sense a variety of behaviors in the classroom, we instead give power to the teachers, allowing them to make the decisions about when and where to intervene with their students? Having a human in the loop may lead to better interpretations of the data than machines can achieve alone. Instead of reducing students to variables in an algorithm, it could allow us to creatively address the needs even of outlier students who may not fit the typical patterns—or help us find new patterns in the data to better serve all learners.
In fact, we could go a step further. Rather than using such a system to monitor student behavior, even with a teacher making decisions, an inventive new paradigm may be to instead flip the focus of the sensors: putting the lens on the teacher and empowering teachers themselves to use data for improvement of their own practices in the classroom.
Even in a technology-free environment, students turn in tests, react to discussion about a concept, and perform many other actions that give teachers data about their current state. Teachers, on the other hand, get very little data about their own behaviors compared with what they get about their students. It turns out that what appear on the surface to be simple behaviors to observe are actually quite difficult to track amid the "blooming, buzzing confusion of sensory data" that is the classroom. It's hard to improve the cycle of teaching without this data. For this reason, teacher noticing—the study of what teachers observe, where, when, why, and how—has become a major field of study in the past decade, including differences in what novice and expert teachers perceive. This has also led to a plethora of low-tech tools, such as video-viewing software, intended to give teachers a look at what they themselves do in their classroom but may not see in the moment.
And yet, acquiring regular, accurate data on teaching practice is currently not scalable. Although it has major benefits, reviewing hour-long videos of your own class turns out to be incredibly time-consuming for teachers and can't be done frequently, given the many other constraints on their time. A different approach, taken by many schools at every level, is for an observer to come in and watch the class, later providing a digest of what they saw in a professional development session. This extremely valuable, individualized formative feedback currently relies on professional or trained human observers. Unfortunately, the high cost of in situ experts precludes any continuous instructional feedback loop. Instead, perhaps one or two lectures can be observed for any instructor in a year—not nearly frequently enough to track changes in pedagogical practice over time.
Imagine instead that we can collect this data on a daily basis for every class that is taught. You could end up with something like a professional Fitbit for your teaching practice. In our work, this has taken the form of a phone app that offers a glance at what happened that day—an extremely simplified dashboard that gives the teacher an opportunity to see and reflect on what happened, observe the results of their changes, and make plans for their next class while receiving resources for new strategies. As part of this planning, we let teachers set goals for what they would like to focus on in their data, whether for tomorrow or for the long term. The following scenario gives an example of how a college instructor might use such a system:
Anvi is a new professor who teaches a small class on introductory physics. After finding the first few weeks to be difficult, feeling like she doesn't connect with her students, Anvi visits the teaching center at her university and requests a consultation. In particular, she is worried that her students are not engaged with her lectures, and she wants to try to get them to show more interest. At the teaching center she takes a workshop and is recommended to download an app to help her track and develop her teaching skills.
The app, ClassInsight, begins with an introductory survey about the kind of class Anvi is teaching and her perspectives on teaching and learning. She finishes her survey and gets feedback from the system showing that she has an interesting mismatch between her beliefs and actions: She shows a tendency to use a transmission model of teaching, but she believes it would be useful to conduct a more developmental classroom. Anvi recognizes this perspective as similar to her own thought process, and is interested to find out how the app might help her bridge that gap.
Figure: Preliminary classroom data visualization app that could be used by instructors.
At this point a research team installs a set of classroom sensors that will collect and report data to the Data Collector in Anvi's app. The sensors track features in real time while Anvi teaches, such as where she and her students are looking, how many students speak up or raise their hands, and how much time Anvi leaves after asking questions. After class, Anvi gets a ClassInsight notification to review this data on her smartphone. She first takes a very short survey that relates directly to her goals: How well did she think she did today, and what does she think she could do better? She inputs a few words about how she thought she did a good job covering all the material, but that she still did not get much feedback from the students. The app then shows her a visualization of how much of the time she spent actually looking at her students versus the blackboard, and how much time she paused in her talking. Anvi notices that she never paused talking for more than two seconds at any point, and that she only looked at her students for 20 percent of the time she spent lecturing. Anvi is surprised at this discovery, and a bit disappointed that her style of presentation probably does not express her enthusiasm for physics and her students' learning. The app asks her to describe her perspective after seeing the data. She inputs some thoughts about how she did not realize that she looks at her students so seldom, and that she feels awkward waiting for students to answer questions, but she would like to do better.
The app asks Anvi if she would be interested in setting specific goals for the next class session. It recommends trying to increase the amount of time she spends looking at students to 60 percent and to extend her wait time after asking a question to the three seconds recommended in the learning science literature. Anvi takes a deep breath... and taps that she agrees. After the system logs these two new goals, it asks her to describe any strategies she might use while trying to reach them, and how well she expects to perform. It then offers her a chance to read about some other strategy ideas that relate to the given goals. She taps through to these short readings and is surprised to find a couple of simple but effective-sounding strategies: putting explicit markers in her slides for student discussion points, and quietly counting to 10 while she waits for students to respond. After she reads about them, she takes a quick quiz on the details of the strategies, and gets them right. Anvi thinks these are reasonable strategies, and is excited to try them in the next class. The app then helps her write out her learning goals for her next class and find some good places to insert questions to try out the wait-time strategy.
In the next class Anvi focuses on looking at her slides less and at her students more. As she does so, she begins to notice that she can infer when students are confused and when they seem bored. She gains confidence in her question asking, because she gets a clearer idea about what kinds of questions she will be more likely to get answers to. This makes waiting after asking a question more comfortable for her, as she finds that students are willing to answer if given enough time to think about the problem.
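Behind the scenes, the two metrics ClassInsight surfaces in this scenario are simple to derive once the sensors emit timestamped events. The sketch below is purely illustrative—the event formats are invented, not our system's actual output—but it shows the kind of computation involved:

```python
def summarize_lecture(gaze_samples, speech_intervals):
    """Summarize two reflection metrics from hypothetical sensor output.

    gaze_samples: list of (timestamp, target) pairs sampled at a fixed
        rate, where target is e.g. "students" or "board".
    speech_intervals: sorted list of (start, end) times, in seconds,
        during which the instructor was speaking.
    """
    # Fraction of sampled moments spent looking at students.
    at_students = sum(1 for _, target in gaze_samples if target == "students")
    gaze_pct = 100.0 * at_students / len(gaze_samples) if gaze_samples else 0.0

    # Longest silence between consecutive speech intervals: a rough
    # proxy for the instructor's maximum wait time after a question.
    silences = [start - prev_end
                for (_, prev_end), (start, _) in zip(speech_intervals,
                                                     speech_intervals[1:])]
    longest_pause = max(silences, default=0.0)

    return {"gaze_pct": gaze_pct, "longest_pause_s": longest_pause}
```

A dashboard built on summaries like these could then compare each day's numbers against the goals the instructor set (say, 60 percent gaze time, three-second wait time).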
This approach draws heavily on Donald Schön's idea of the reflective practitioner. The premise of Schön's foundational work is that professionals who receive real-time coaching and encouragement to think carefully about what they do while they do it go on to learn and improve their practice in a more profound way. Again, such real-time coaching is currently inaccessible for most teachers. While reflective practice is a principle that many schools of education encourage, as of now it must be done with little support and less information. Injecting data into the conversation could significantly boost this reflection process. In fact, in our work we have found that even novice instructors are able to use their data to look back and reconstruct a teaching session almost minute by minute. While a data-based approach may sound rigid and inflexible, it opens up a space for interpretation. The teacher is the one who knows what conditions were like during class—maybe there was a fire drill that day, or she lost her voice and changed the planned lesson for an alternative activity. Reflection on her own data should take this into account.
Critical reflection of this type is also a hallmark of action research, a paradigm outlined by Kurt Lewin that seeks transformative change through the simultaneous processes of taking action and doing research. In teaching, this takes the form of teachers, alone or as part of a community of practice, acting as experimenters, testing out new techniques in the classroom. Empowering a generation of teachers to better capture the outcomes of their research into what works in their classroom could let us move the science of learning forward much faster and for a broader set of contexts than academics would ever be able to do on their own.
Of course, with these exciting possibilities we still have not sidestepped the serious questions about ethics and fairness that should be examined closely—as with all new technologies. As researchers, our next steps should interrogate who has access to this data, what they might do with it when they have it, and who could possibly benefit from its use.
For instance, if students had greater access to their own data, they might reflect on what actually engages them in class and find productive ways to transform activities they don't find as interesting—or they might instead deflect blame for their own lack of understanding onto a "boring" lecture.
If parents access classroom data, they might use such data to better understand the progress of their child or to find ways to help them develop their confidence. Or they might use it to raise concerns about a teacher whose approach they do not like.
Alternatively, our own investigations have revealed teachers' deep fear that school administrations might access and use such data to evaluate them. On the other hand, administrators might use it to distribute support more equitably across classrooms, or to reward teachers who show regular improvement.
Whether or not concerns over teacher and student privacy and data control are resolved, the future of classroom sensors may arrive organically, without the input of security experts. The Amazon Echo, which collects voice data and sends it to the cloud for analysis, has been making an appearance in classrooms. As shown at the 2019 International Society for Technology in Education conference, the Echo has been trained to do things like announce teacher absences for the day, indicate whether any classrooms need a substitute teacher, and deliver notifications about urgent forms to sign. The Echo can serve administrators by providing a daily report of critical information, or let teachers report from home that illness will keep them from school that day. As EdSurge recently reported, "These demos come just one year after an Amazon representative said—at the same conference—that Alexa should not be used in the classroom due to privacy and compliance issues. But that warning hasn't stopped some educators...who presented at ISTE about how [they have] helped teachers in [their] district bring Alexa into their classrooms" [14].
In the meantime, while we create frameworks to help us make sense of this new world, I propose some guidelines to follow for more ethical use of classroom sensing:
- Redefine classroom sensing as the use of cameras, microphones, and other sensors to collect data about participants in the learning ecosystem, not just students.
- Ensure that the person whose data is being collected is in control of what happens with that data; always anonymize and/or aggregate other participants' data if they are not in control.
- Use data for formative understanding aimed at improvement, never in an evaluative capacity.
- Always give participants the choice to reflect on the particular data captured and to use it or not.
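The second guideline—anonymize and/or aggregate other participants' data—can be made concrete with a small sketch. The function and data shapes below are hypothetical, not part of any shipping system; the point is that identity can be discarded before anything reaches a dashboard:

```python
from collections import Counter

def aggregate_for_dashboard(per_student_events):
    """Reduce per-student detections to anonymous class-level counts.

    per_student_events: dict mapping a student identifier to the list
        of behaviors detected for that student (e.g. "spoke",
        "hand_raised"). Identifiers are dropped during aggregation,
        so only class-level totals survive to be displayed.
    """
    totals = Counter()
    for events in per_student_events.values():
        totals.update(events)
    return dict(totals)
```

A teacher reviewing the output would see that, say, two students spoke and one raised a hand, but never which ones—keeping the individual in control of anything more detailed about themselves.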
Like all powerful technologies, classroom sensing can be used in ways that empower stakeholders or oppress them. Before this technology ends up solely in the hands of for-profit companies, I issue a call to arms for the teaching and HCI researcher and practitioner communities to work together as stewards of the future, providing frameworks for how classroom sensing can be applied ethically and effectively.
1. Nathan, M.J., Koedinger, K.R., and Alibali, M.W. Expert blind spot: When content knowledge eclipses pedagogical content knowledge. Proc. of the 3rd International Conference on Cognitive Science (Vol. 644648), 2001.
2. Baker, R.S., Corbett, A.T., Koedinger, K.R., and Wagner, A.Z. Off-task behavior in the cognitive tutor classroom: When students game the system. Proc. of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, 2004, 383–390.
4. Graesser, A.C., McDaniel, B., Chipman, P., Witherspoon, A., D'Mello, S., and Gholson, B. Detection of emotions during learning with AutoTutor. Proc. of the 28th Annual Meeting of the Cognitive Science Society. 2006, 285–290.
5. Morency, L.P. MultiSense Live; http://multicomp.cs.cmu.edu/resources/multisense/
7. Ahuja, K., Kim, D., Xhakaj, F., Varga, V., Xie, A., Zhang, S., Townsend, J.E., Harrison, C., Ogan, A., and Agarwal, Y. EduSense: Practical classroom sensing at scale. Proc. of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 3, 3 (2019), 71.
8. Connor, N. Chinese school uses facial recognition to monitor student attention in class. The Telegraph. May 17, 2018; https://www.telegraph.co.uk/news/2018/05/17/chinese-school-uses-facial-recognition-monitor-student-attention/
14. Tate, E. Alexa goes to ISTE: Edtech Companies—and Teachers—Debut New Skills for Learning. EdSurge. June 26, 2019; https://www.edsurge.com/news/2019-06-26-alexa-goes-to-iste-edtech-companies-and-teachers-debut-new-skills-for-learning
Amy Ogan is the Thomas and Lydia Moran Assistant Professor of Learning Science in the Human-Computer Interaction Institute at Carnegie Mellon University. She is an educational technologist making learning experiences more engaging, effective, and enjoyable. She has conducted field research on the deployment of educational technology across many international sites. firstname.lastname@example.org
Copyright held by author. Publication rights licensed to ACM.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2019 ACM, Inc.