
Teaching Coding

Fast Track to Success or Failure to Get the Point

Many years ago, I was attending a series of classes on computing. It was intended to introduce librarians, of whom I was one, to the brave new world of “library automation.” The first lecture was a description of the 80-column punch card and how holes in it represented characters. (For those of you lucky enough not to remember the era, we used to write code on coding sheets and then pass the sheets to a punch room, where operators turned them into card decks that were read by the computer. The program executed, and the results were printed out on continuous sheets of paper. You collected the paper and, armed with it, your coding sheet, and your card deck, spent the next day or so trying to work out what had gone wrong.)

Knowing how the 80-column card stored information was interesting but gave no perspective on how computers could manage the library issue system or book catalogue.

This same misunderstanding of what computers are, their uses, and their implications for society and personal life is behind the rush to teach schoolchildren coding. This year, the British government has mandated that all pupils will, from the age of five onwards, be taught to write code. A few weeks ago, President Obama urged all young people to learn to code. Around the world there are similar initiatives. They have been greeted with joy by two groups of people: those who know nothing about computers, and programmers, who see this as a confirmation of their special abilities.

The reasoning behind these initiatives appears to be that computers are so important to everyday life that the entire population has to be able to program them. None of the surrounding debate seems to acknowledge that consumer electronics and PCs are only the tip of the iceberg of processor deployment. Nor do the proponents seem to understand that writing code should be the implementation of a solution to a problem that has already been defined and analysed; it is not the solution itself.

In 2013 there were, around the world, some spectacular, high-profile failures of electronics and computer-based systems. In the US, perhaps the most public problem was the HealthCare.gov website. Its teething problems were not due to poor coding. They appear to have been due to a standard problem with big systems: politics. In this case, it was national politics that drove the system to go live without adequate testing. In Britain, other government systems are facing problems because of the classic project management issues – ill-defined initial objectives, mission creep as people ask for extra features to be included, and pressure to deploy the system before testing is complete.

Testing in this context includes not just debugging code, but also testing the user interface: the developers will have made assumptions about how the user will behave, and those assumptions will, in field use, prove to be inaccurate – normal people don’t always think in the same way as the people currently developing systems. (If you want to learn more about the problem, look for Alan Cooper’s book, The Inmates Are Running the Asylum. Although it is now nearly twenty years old, it contains a lot of good sense.) If the system is a website, then stress testing the site to see how much traffic it will bear is also necessary.

None of these failing projects would have been improved by a wider knowledge of coding. And many of these projects are failing because the focus has been on implementing rather than on solving.

Knowing how to code will not give any understanding of how GCHQ and NSA are using computing power to store and analyse your internet activities and your phone calls.

Another flaw in the argument to teach everyone to write code is that conventional programming will, possibly within the next ten years, become a quaint anachronism. Look at the parallel field of chip design. (If you are familiar with this area, forgive me for this Janet and John description.) Chip design started with engineers drawing circuits using conventional symbols for devices – transistors, resistors and capacitors. These were given to draughtspeople who turned them into the polygons that represented those devices in a silicon layout. These stages were replaced by schematic entry, where the conventional symbols were drawn on a workstation and translated by the computer into the polygons. Chip design had transformed into EDA – Electronic Design Automation. Later, schematic entry was replaced by hardware description languages (HDLs), a higher level of abstraction. Verilog and VHDL described, not transistors, but the logic functions that the circuit was to implement. The engineers using HDLs were effectively writing programs.

HDL is now largely being replaced by Electronic System Level (ESL) tools, which use even more abstract languages, such as SystemC, to model the system that the chip is to implement. In some cases, modelling tools like National Instruments’ LabVIEW or MathWorks’ MATLAB are used to build a graphical representation of the functions the system needs, and a chain of tools converts these into the polygons for manufacture. As these tools have grown more abstract, a range of verification tools has developed to check that what is implemented is what the specification required, along with test tools that measure how well the circuits actually work. Also, given the massive size of current designs, there is now a base of Intellectual Property (IP) – blocks of circuit design that carry out specific tasks. These may be a whole processor, with ARM the clear market leader, or an interface, such as a USB controller. All of this means that the EDA equivalent of coding is less and less relevant.
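To give a flavour of what this level of description looks like, here is a minimal sketch in SystemC (which is a C++ class library). It assumes the Accellera reference simulator is installed, and the module name, port widths, and test values are invented purely for illustration: the engineer states what the block does – here, add two numbers – and leaves the gates and the polygons to the downstream tools.

    // Minimal SystemC sketch (illustrative only; assumes the Accellera
    // reference implementation of SystemC is installed).
    #include <systemc.h>

    // Describe what the block does -- add two 8-bit values -- rather than
    // which transistors or gates implement it.
    SC_MODULE(Adder) {
        sc_in<sc_uint<8> >  a, b;
        sc_out<sc_uint<9> > sum;

        void add() { sum.write(a.read() + b.read()); }

        SC_CTOR(Adder) {
            SC_METHOD(add);
            sensitive << a << b;   // re-evaluate whenever an input changes
        }
    };

    int sc_main(int, char*[]) {
        sc_signal<sc_uint<8> > a, b;
        sc_signal<sc_uint<9> > sum;

        Adder adder("adder");
        adder.a(a);  adder.b(b);  adder.sum(sum);

        a = 42;  b = 17;
        sc_start(1, SC_NS);                    // let the simulation settle
        std::cout << sum.read() << std::endl;  // prints 59
        return 0;
    }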

The real challenge is not code writing; it is analysing the problem and designing a solution, for which the code is merely an implementation.

As was previously mentioned, the majority of the “teach children to code” enthusiasts have not realised that a huge amount of code is written for embedded systems – not for familiar IT-type applications. Here again, there is the rise of model-based approaches to developing applications. The large, complex, and mission-critical systems that the aerospace and defence industries develop were the driver behind developing first UML (the Unified Modelling Language) and then SysML (the Systems Modelling Language). The model-based systems are linked to code generators and produce very high quality code.
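For flavour, here is a hand-written C++ sketch of roughly the shape of code such a generator might emit from a simple two-state state chart. The states, events, and names are invented for this illustration; real generators produce far more elaborate and traceable output.

    // Purely illustrative, hand-written C++: the general shape of code a
    // model-based tool might generate from a two-state state chart.
    // States, events, and names are invented for this example.
    #include <cstdio>

    enum class State { Idle, Running };
    enum class Event { Start, Stop };

    // One transition function for the whole chart: current state plus
    // event in, next state out -- the structure mirrors the model.
    State step(State current, Event event) {
        switch (current) {
            case State::Idle:
                return (event == Event::Start) ? State::Running : State::Idle;
            case State::Running:
                return (event == Event::Stop) ? State::Idle : State::Running;
        }
        return current;  // unreachable; keeps the compiler happy
    }

    int main() {
        State s = State::Idle;
        s = step(s, Event::Start);   // Idle -> Running
        s = step(s, Event::Stop);    // Running -> Idle
        std::printf("final state: %s\n",
                    s == State::Idle ? "Idle" : "Running");
        return 0;
    }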

A positive story in 2013 was the confirmation of the Higgs boson. The boson was detected by smashing two streams of protons into each other and looking at the particles that were created. The energy in these proton streams is enormous, and they have to be very tightly controlled. The software for running the controllers within the Large Hadron Collider, the massive underground facility on the border of France and Switzerland, was developed using NI’s LabVIEW graphical development system, so no coding skills were deployed.

Coding is not going to help people make better use of mobile phones and the apps that run on them. It is not clear how being able to code is going to help people rein in the efforts of security organisations to monitor the communications of all and sundry. For the relatively small percentage of the population that will enter the technology industry, there may be a marginal benefit in knowing a programming language. The religious wars over which language to teach are going to be great fun to watch, but none of them will help anyone do anything very useful.

4 thoughts on “Teaching Coding”

  1. School has a number of goals. At the most basic early level, it teaches us to be “human”, inculcating us with all that was laid down before in terms of language, history, and basic critical skills like arithmetic.

    Once you get to higher grades, then, ideally, you get exposed to all kinds of different things – manual skills like wood and metal shop, more advanced music or art or math(s) or physics or whatever. The idea being that you get a smattering of everything and understand a bit of what’s going on in a wide range of fields and then can decide what really resonates and dive deeper there. Adding some amount of coding in this context would seem to make good sense, especially since there’s a “way of thinking” that’s not bad to appreciate.

    But so often, education policy seems to react knee-jerk to the captains of industry complaining that they don’t have a big enough pool of talented worker candidates with a particular skill-du-jour. While school should definitely prepare students for employment, such imbalances seem to drive over-reactions. At an adult level, in the US, we had a call a few years ago for more nurses. Now there are so many nurses that they can’t find jobs. (Which keeps wages down.) As you describe it, this coding mania has that feel to me.

    That’s not to take away from your more nuanced discussion; it’s just that, in the political meeting rooms where people discuss how to educate children on topics of which they have no clue, I can’t imagine quite such a reasoned debate taking place.

  2. You got it wrong on the chip design tools. Those tools didn’t get replaced. Instead they forked into their own specialties. Analog chip design still uses a preponderance of schematic capture. Mixed-signal designs use both schematics and HDL. And these usages will be around for a long time.

    Back to school for you!!! LOL 🙂

  3. John

    Indeed the naughty corner for me. I did the classic thing that so many people do and equated chip design with digital chip design. I forgot that analog design is difficult and tools are still only just above real men chiseling structures in silicon with their bare hands.

    Sorry
