The State of AI in Higher Education — Campus Technology



Both industry and higher ed experts see opportunities and risks, hype and reality, in AI for teaching and learning.

Matthew Rascoff, associate vice provost for Digital Education and Innovation at Duke University, views the state of artificial intelligence in education as a proxy for the “promise and perils of ed tech writ large.” As he noted in a recent panel discussion during the 2020 ASU+GSV conference, “On the one hand, you see edX getting more engagement using machine learning-driven nudges in courses, which is pretty amazing. But on the other hand, we have all these concerns about surveillance, bias and privacy when it comes to AI-driven proctoring.”

Rascoff identified “something of a conflict between the way this stuff is built and the way it’s implemented.” In his role at Duke, he noted, “It’s really hard to distinguish [in AI] what’s real and what’s not.”

Rascoff joined other panelists in one of two sessions held back-to-back, examining the place of AI in learning: “AI Transforming Higher Learning” and “AI in Education, Hype vs. Reality.”

“Robots Are Going to Rule the World”

Fears abound, noted several of the speakers. “People think that AI agents are coming for teachers’ jobs, that robots are going to rule the world, and they’re going to teach our kids, that the kids will love the robots more than humans. There’s a lot of sci-fi out there,” summed up Bethanie Maples, product manager for Google AI. “In reality, digital agents are a really interesting way to supplement learning. We need more tutors. And if there aren’t human tutors, there’s a place for machine tutors to help progress individual learners. There are really cool things we can do with personalization using machine learning and especially with adaptive assessment.”

Job fear is a biggie, acknowledged Stephanie Butler, product manager at ed tech company Turnitin. “We’ve moved from this world where technology takes basic rote work and automates it to a world in which technology is starting to encroach on intellectual work, on intellectual life and creative work. And you see that in many different places, not just education.”

Likewise, on the teacher side, Butler added, especially in the Western tradition of higher education, there’s a fear that AI could encroach on academic freedom: how educators “spend your time researching, what you publish, how you publish it, who you work with, and, of course, your content, the topics you decide are relevant for your students, how you craft the learning experience for them.”

On the learner side, there are fears that AI systems could “track a learner from a very young age and have this personalized lifelong learning experience for them,” suggested Maples. “The risk there is that they get put into a track that they can’t escape. That’s something that’s very near and dear to all of us, especially to the American spirit, the concept of ‘rebirth.’ We believe that you can start over and that you can fail and then wake up the next day and shrug it off and maybe go found another company or start a new job.” The problem arises, she explained, when learners can’t delete their data “and say, ‘I’m a new person today.'” For that reason, “it’s extremely important that we allow for user data control, because we all change, and we have to build the allowance for change into this assessment.”
