Should You Teach Your Students to Code Now That AI Can Do It For Them?

The question of whether students should learn to code has seen many ups and downs. It started in the 1970s, when Seymour Papert, a professor at MIT, invented the Logo programming language and the turtle that children programmed with it. Since then, periods of widespread adoption of coding have alternated with periods of near-complete abandonment, with most schools claiming it was difficult to teach and necessary only for those interested in becoming programmers. The issue was raised again recently when Jensen Huang, the CEO of NVIDIA, declared that children should no longer bother to learn coding because in the future AI will do it for them.

A Market Need

As technology pushed the world toward an increasingly digital reality, the need for skilled programmers kept growing, and training students to code became more urgent. Schools’ attitude toward programming began to change when research pointed out that programming is not just a series of commands students need to learn, but also “computational thinking”: the habits of mind programmers use to structure and streamline problem solving. A researcher who fundamentally changed the narrative about programming is Jeannette M. Wing, a professor of computer science at Carnegie Mellon University and later corporate vice president of Microsoft Research, who wrote that “Computational thinking is a fundamental skill for everyone, not just for computer scientists.” She explained that this mode of thinking involves solving problems, designing systems, and understanding human behavior by drawing on concepts fundamental to computer science. It includes the ability to abstract, to engage in logical and symbolic reasoning, and to decompose big problems into many smaller ones: skills everyone can use, whether or not they are sitting at a computer. Because it is such an essential skill, she believes we need to add computational thinking to reading, writing, and arithmetic.
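The decomposition Wing describes can be made concrete with a short sketch. This is purely illustrative (the problem and function names are my own, not from Wing’s work): a “big” problem, finding the most frequent word in a text, is broken into small sub-problems that are easy to reason about and reuse.

```python
def tokenize(text: str) -> list[str]:
    """Sub-problem 1: break the text into lowercase words."""
    return text.lower().split()

def count_words(words: list[str]) -> dict[str, int]:
    """Sub-problem 2: count how often each word appears."""
    counts: dict[str, int] = {}
    for word in words:
        counts[word] = counts.get(word, 0) + 1
    return counts

def most_common(counts: dict[str, int]) -> str:
    """Sub-problem 3: pick the word with the highest count."""
    return max(counts, key=lambda w: counts[w])

# Composing the small solutions solves the original problem.
words = tokenize("To think is to compute and to compute is to think")
print(most_common(count_words(words)))  # → "to"
```

Each function abstracts one step, and the same habit of mind, splitting a problem until each piece is tractable, applies just as well away from the keyboard.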

As a result, schools started adopting coding instruction in growing numbers. This direction, clearly driven by market needs, was challenged for the first time in November 2022, when OpenAI released ChatGPT, which among other things could produce code from a user’s prompt. Since then AI’s coding ability has advanced rapidly; one of its latest manifestations is Cognition AI’s software engineering platform Devin, which is designed to complete entire software projects autonomously, a feat beyond the grasp of most AI coding assistants.

AI Will Do The Coding

Due to this turn of events, the question of whether schools should continue teaching students to code became relevant again, and Jensen Huang’s declaration definitely brought the issue to the forefront. Huang also said that AI can make everyone a programmer, and that in the near future all we will need in order to program is our native language. So, before you pack your bags and discontinue your programming class, let’s look at the implications of Mr. Huang’s claim for students and for society in general.

While searching the internet I stumbled upon Mr. Huang’s declaration and was intrigued. Looking for more information, I found an article by Nathan Anecone titled “Jensen Huang is Wrong About Coding,” in which he lays out a systematic analysis of everything wrong with the declaration. First, he points out that Mr. Huang is an electrical engineer who runs a hardware company, not a software company, which already makes him poorly positioned to make such a sweeping claim. He then argues that relying heavily on AI while neglecting one’s own education and training can lead to an “ignorance explosion” or “knowledge collapse.” As he explains, “the idea that you shouldn’t bother to learn coding because AI can do it fits into a genre of unintentionally disempowering attitudes that promote ignorance and a loss of intellectual autonomy.”

To prove his point, he cites the example of over-relying on GPS to find our way around. Even though GPS makes it easier to reach a destination, losing the ability to apply our own navigation skills leaves us frustrated and helpless whenever the device fails. Research shows that drivers who rely on memory to pick a route have a larger hippocampus, the brain structure implicated in memory. And ceding navigation to GPS is nothing compared to ceding to AI our logical thinking skills and our ability to control computers at the root level, the very skills we acquire through programming. Learning to program teaches us to think critically, helps us “think like a computer,” and lets us understand what is going on inside one.

Should AI Replace Students’ Knowledge?

As an educator, I have learned that students love a challenge: solving a problem is deeply gratifying, especially after overcoming a great deal of frustration to conquer a tough one. If we follow Mr. Huang’s advice, we will deprive children of this whole area of human experience, and if we continue his line of thought, why bother teaching students to paint or write essays when AI can do the job for them? My intention here is not to discourage students from using AI. AI can be a great help in coding and education, but using it as a substitute for students’ own knowledge can be detrimental to their future and to society at large.

When OpenAI released ChatGPT and teachers were panicking that it might upend education as we know it, I pointed out that it could be an opportunity for teachers to shift from a traditional teaching methodology, which AI can easily replicate, to a deep learning methodology, which AI is far less capable of taking over. A big disadvantage of AI is that it cannot think outside the box: it learns over time from the data and past examples it is fed, but it cannot be genuinely creative in its approach. And even if one day it becomes capable of that kind of deep learning, stripping ourselves of the ability to critique what AI suggests we do would be very dangerous for society.
