My robot teacher: The challenge of AI in computer science education
Monday, January 6, 2025, 10:00, by InfoWorld
Over the past two years, generative AI has been a force for transformation—and disruption—everywhere it’s landed. Education was no exception; if anything, schools were among the first institutions to grapple with AI’s implications. As students embraced ChatGPT and similar AI technologies for research, test preparation, and academic writing assistance, among other things, educators found themselves at the forefront of a sweeping societal change—and a growing ethical dilemma: Should AI-assisted learning be accepted as a new normal in education, or are students cheating themselves by not learning basic skills?
If anything, these debates are more acute in computer science education than elsewhere. Professional developers are among the most enthusiastic early adopters of generative AI tools. To ask learners—whether they’re in high school, college, or taking a professional course—to go without AI assistance seems almost as quaint as making them use punch cards to input their test responses. But there’s still a reasonable question: How can we ensure students develop the foundational knowledge to be able to understand and evaluate AI suggestions? We spoke to professionals with a foot in computer science education to find out how AI tools are transforming the learning process.

Fundamentals matter more than ever

Seth Geftic, VP of product marketing at the cybersecurity firm Huntress, is heavily involved with mentorship in computer science and cybersecurity. To him and nearly everyone we spoke to, one of the biggest risks of AI in a learning environment is that it can help students bypass the knotty problem-solving exercises crucial to their education.

“AI in the learning experience makes it extremely easy to seek help as soon as you come up against something you find difficult or strange,” he says. Ideally, in a learning environment, “you’ll need to improvise, think outside of the box, and find unusual (and productive) ways that your building blocks of knowledge can interact and help you solve a question. These harder segments where you have to think for yourself are the real teaching moments in a computer science course and are the moments that will set apart students that are okay from students that are fantastic.”

“When AI is right there, it becomes extremely easy to turn to the machine if you come up against these moments that require additional thinking,” he adds. “When people rely too much on artificial intelligence, I worry that there becomes less and less moments that make you develop that creative muscle.
I’m always going to champion teaching people to fish, rather than teaching them how to interact with an AI that can fish.”

Michael Wilson, COO of GenTech, a community tech hub with school- and community-based STEM programs, says it’s largely a matter of how AI is used. “For students, AI makes searching for an answer easier, but when used as a forum rather than a search engine, it becomes harmful,” he says. “Asking ‘Write me a program that prints hello world constantly’ is different from ‘How do I print to the screen?’ followed by ‘How do I repeat a section of code forever?'” Iterative prompts like these can lay the groundwork for fundamentals.

At the university level, students are expected to understand the importance of learning fundamental concepts and not taking shortcuts. Dr. Tirath Ramdas is the founder of the education-focused GenAI company Chamomile.ai, and also teaches a software development undergraduate course at a major university in Melbourne, Australia. In both roles, he has a front-row seat to how generative AI impacts software development teaching and learning.

“The university’s policy is for students to use any tools available as a learning aid but to take responsibility for their own learning, to ensure they really understand the material rather than rely on code generation,” he says. “Overall, I think university students deserve some credit for their attitude towards generative AI. A survey by the Harvard University Undergraduate Association found that the number one reason students abstain from generative AI use is to avoid becoming over-reliant on it, so their instincts are right.”

AI guardrails in the classroom

At a fundamental level, instructors need to know students are completing assignments in ways that help them improve their skills without cutting corners. The experts we spoke with described various approaches to this problem. Some allowed the use of generative AI while others restricted it.

Open book, closed prompts

At Dr.
Ramdas’s Australian university, most exams are open-book and allow almost unlimited Internet access. “Interestingly,” he says, “these exams have always had a policy of barring communications apps so that students couldn’t communicate with others to do their work. This policy has now been extended to include ChatGPT-like systems, and invigilators are instructed to look for cases of cheating with such systems. It may be detected when students are seen writing paragraphs of text as one does when prompting an LLM, but is not the norm for normal coding.”

Keep it focused

Elmer Morales is the founder and CEO at koderAI, and is training his daughter and other software engineers who are learning to code. “I’ve seen professors ask students to ensure their code only uses topics they’ve learned in class, which is something the AI won’t necessarily know,” he says. “This is not entirely bulletproof, but it does force the student to review the code before using it, which is a form of learning and helps noobs improve their coding skills.”

Talk it out

“One lever I use is ‘interactive grading,’ in which students need to explain their solutions to me,” says Greg Benson, a professor of computer science at the University of San Francisco and chief scientist at SnapLogic. “At least in my courses, because they are more advanced, it is often easy to detect machine-generated code or to observe if a student doesn’t understand a solution they have submitted through dialog.”

Huntress’s Geftic points out that this sort of process isn’t just busywork meant to stop students from cheating: Explaining the code you’ve written (with or without AI help) is an important part of a developer’s professional skill set.
“Conversational testing,” as he calls it, “goes hand-in-hand with the building of secondary communication soft skills, which are becoming more and more sought-after in the IT space.”

Go hands-on

Maksym Lushpenko is the founder and CEO of Brokee, which challenges students with devops labs consisting of broken systems they need to fix. “Our labs are designed so that you can’t just copy the description and have AI do all the work for you,” he says. “Important details about what’s broken are hidden in the test environment, so students need to explore—run commands, check logs, and figure out what’s going on.”

He says that such environments allow students to use generative AI “responsibly” in a learning environment. “AI can definitely help, like explaining how a system works or how to run a specific command, but at the end of the day, the student still has to put it all together and fix the problem.”

Appropriate reliance and responsible use

Everyone we spoke to agreed that the “horse is out of the barn” when it comes to generative AI and computer programming: You’re not going to stop people from using it in their careers, so the trick is ensuring people know how to use it responsibly.

As KinderLab Robotics’ director of curriculum, training, and product management, Jason Innes focuses on AI-assisted learning for young children. He told us he very much believes AI needs to be part of the curriculum from early education forward. “Young children need to understand what AI is before they learn to use it. They need to learn that AI is not alive, has no goals of its own, and is not always right—and that it is a tool created by human engineers,” he says. “If we want to prevent kids from taking shortcuts with LLMs, we need to help them develop an understanding of what AI is and what its limitations are.
AI is a tool that can help people think and work better, but we still need to master the fundamental cognitive skills that we want AI to assist us with.”

This lesson isn’t only for kindergartners. Danielle Supkis Cheek is VP, head of AI and analytics, at Caseware, and also a part-time faculty member at Rice University’s Jones School of Business, where she teaches data analytics in the Masters of Accountancy program. In her view, it’s crucial for students to understand AI as part of the landscape of tools and information in which they’ll be operating.

“It’s not a realistic scenario for a student of mine to ever know every scenario and the pace of change of what is changing out there,” she says. Instead, she wants her students to understand tools like AI, focusing on “the metacognition concept of how to learn, how to understand, and how to be skeptical of the responses—and how to then follow up and make sure you can take appropriate reliance. That’s the skill set that I’m teaching.”

In Supkis Cheek’s class, students are meant to learn the processes that real-world accounting professionals would follow, which might include AI. “The answer to me is not as important as the process by which the student got the answer,” she says, “and so I need them to use processes that are available to them in the real world so that they can use this in the university setting.”

The experience is also meant to introduce students to the rigorous world of corporate finance, and help them learn when certain processes and techniques are acceptable. “The course is a safe place to fail so that when they get to the real world, they are more appropriately equipped,” she says.
“Here, the worst thing that happens to you is to get a bad grade—versus the worst thing that happens if an auditor misses something may be a deficient audit report that results in significant fines, lawsuits, and erosion of public trust.”

Asking the right questions

For many educators, AI isn’t something to merely accommodate in their curricula: instead, they’re actively building their courses to teach their students how to use it effectively. After all, as Brokee’s Lushpenko puts it, “To get the most out of AI, you still need to know what questions to ask and how to apply the answers it gives you.”

“Other kinds of training need to be developed with AI tools specifically in mind,” says Risto Miikkulainen, AVP of evolutionary AI at Cognizant AI Labs and a professor of computer science at the University of Texas at Austin. “Such assignments may be larger than current ones, with overall design done by students and detailed low-level implementation done by AI tools. They may include new assignments, such as upgrading software, debugging, and repair, that are currently tedious but where AI tools can help significantly. They may also include designing software so that AI can be most effectively used in the future to upgrade and maintain it as well.”

Benson at the University of San Francisco is also thinking along these lines. “I will be teaching our upper-division operating systems course next semester in a way that allows students to take full advantage of coding assistants on projects,” he says. “I will be making the projects more complicated than my previous projects, and I will also guide the students on how to use assistants both to develop solutions and to learn OS concepts and code more deeply.”

AI is for educators, too

So far, we’ve focused on student use of AI, but generative AI can also help educators. For instance, koderAI’s Morales points out that it can help answer simple student questions that might otherwise eat up class time.
“Experienced human software engineers (professors included) don’t have time to help entry-level coders every time they get stuck or forget something like indenting a line of code in Python,” he says. “This is where generative AI has been a game changer, allowing up-and-coming coders to simply share their code with AI and have the AI make suggestions or provide code that would help them continue their learning journey without having to wait for a human.”

Plus, as GenTech’s Wilson says, it isn’t just students who need a quick answer sometimes. “For teachers, AI gives them the ability to find an answer or a different view on an obscure question that could be asked by a student,” he says.

On the voyage of learning, teachers and students will be discovering how these new tools do (and don’t) work together.
https://www.infoworld.com/article/3631099/my-robot-teacher-tackling-the-challenge-of-ai-in-education...