Sure, we’ve had a few years of comfort where C-based programming could solve the majority of embedded design problems, and we’ve even developed a respectable infrastructure of tools and IP to support that methodology. We also now have a legacy of previously developed C software libraries atop which we can coast along, just stitching together a few convenient APIs when we want to whip up a quick GUI or database. Our laziness and complacency remain secreted safely away while we manage to look like heroes to management by hammering out new applications with record speed. Times are good, eh?
If somebody even suggests the possibility that a technology shift might require us to go back and re-tool, learn a new language or programming paradigm, or familiarize ourselves with a radically different underlying hardware architecture, do we cower in fear? Is our trade so set in long-standing tradition that, like the brakemen and firemen on the coal-fired railroads of the past, we just can’t see our jobs any other way? If so, shame on us! In my day, we programmed barefoot through nine feet of snow to get to our solution. We got up at four in the morning and built our own computers out of sticks and baling wire, and we were happy about it. Okay, maybe not, but my own early real-world experience may help to make my point…
When I was a teenager I was thrilled when I got my first computer, a Radio Shack TRS-80 Model I. That machine had a BASIC interpreter in ROM, so I set about learning to program in BASIC. Of course, before long I wanted some of my algorithms to run in something less than geologic time, so I started to experiment with “poking” and “peeking.” These were BASIC commands that let me write Z80 instruction codes directly into memory and read them back. I then used a special BASIC function call (USR, if I remember correctly) that transferred execution from BASIC directly to my manually assembled machine-language masterpieces. Soon, I was rocketing along, wind in my hair, enjoying the unmatched thrill of raw, unleashed, 1.77 MHz 8-bit processing power. Wow.
One year and nine-million conceptual miles later, I was working with a dual-Cyber CDC system at my university. The language of choice was FORTRAN, injected into the beasts by a pretentious pile of punch cards, which had to be meticulously encoded using ancient machines in the computing lab. Since I wanted to harness the almost incomprehensible computing resources represented by these formidable multi-million-dollar dream machines, I learned FORTRAN and dealt with the maddening inconvenience of punch-card programming. While punch cards seemed a giant step backward from the interactive environment of my trusty TRS-80, they were the price of entry into my new world of extreme-performance computing. I bit the bullet and learned the new rules, an endeavor that wasn’t without its pain.
For example, I distinctly remember running out of blank cards one morning at about 4 AM in the computer lab, with about four bugs remaining in my FORTRAN program. The assignment was due four hours later, inconveniently earlier than the opening time of any of the stores that sold blank cards. OBSOLETE TIP: Did you know that, if the operator isn’t looking, you can steal a stack of password and job cards from the ops desk, turn them upside down, and, with a little creative use of tape, make them work as regular FORTRAN cards? Desperation sometimes gives rise to extreme measures.
When I flip-flopped from the EE department to the CS school to get my programming credits for my dual majors, I also had to context-switch to Pascal. I didn’t mind this inconvenience so much, however, because Pascal people had free run of the programming labs and even got to use interactive editors on terminals. Either the university didn’t think that CS students were patient enough to deal with punch cards, or somebody in the CS department had been around long enough to gain additional dominion over the computing resources. I was happy. At least Pascal programmers didn’t have to deal with Hollerith codes.
Next, we fast-forward to late undergrad days and the biomedical engineering lab. They had a custom (student-built) computer, perhaps with a Motorola 6809 8-bit processor (I don’t remember for certain anymore) and almost no memory whatsoever. It seems like the machine started with 4KB and was upgraded at great expense to 8KB or 16KB. The problem was, of course, that you couldn’t write much useful code in any normal high-level programming language in that memory footprint. In order to get my job done, I learned a fascinating and frustrating threaded interpretive language called FORTH. FORTH was elegant in concept and bare-bones in its memory requirements. For at least the fourth time in my nascent career, I had started over in a new programming language with a completely different development environment and programming paradigm because the hardware system I was targeting required it.
Now, some of you might expect that this was the zenith of my programming career. “How,” you may ask, “could you program in any other language once you were fluent in the world’s premier threaded interpretive language? Wouldn’t that be a hard act to follow?” Nonetheless, my lifestyle required a paycheck, and to earn my next one, I needed to pop open the K&R and learn a little C. C is, of course, a thin abstraction layer on top of a generic von Neumann architecture processor. I was pretty processor-savvy from my hand-assembling, peeking, and poking days, so C came pretty easily. Programming environment number five – learned and ready.
My next dialect overhaul came a full five years later. In 1989, object-oriented design was the wave of the future. It still is, in fact. Our company was standardizing on C++, and everybody went back to Impromptu U in order to forget about functions, subroutines, top-down design, and Ritchie, and to learn about classes, objects, methods, members, and Stroustrup. Through the miracle of Cfront, we were able to debug our newly- (and badly-) crafted C++ code with the full power and convenience of 100-character-long munged variable and function names. Programmer productivity skyrocketed.
You embedded developers stayed saner, of course. The lion’s share of embedded development has continued in C for some healthy double-digit number of years now. You settled in, kicked your shoes off, and made yourselves comfortable. No need to rock the boat, after all.
Tomorrow, however, new embedded hardware architectures may emerge that don’t fit your happy, C-based universe quite so neatly. Ask yourselves now: what will the embedded software development community do? Will you insist that compilers, debuggers, and development environments be sub-optimally stitched together in an effort to cater to your tired and lazy techno-Luddite ways? “Sorry,” you’ll say. “We only program in C. You’ll need to find a way to make that work on these new machines.” Perhaps expert drivers of horse-drawn carriages said similar things at the time of the introduction of the automobile. “Sorry, we drive with reins. If you want us running those things, you’re going to have to put a couple of steering straps on ’em.” If that’s you, I wish you well as you ride your C-based carriage off into the sunset of technical obsolescence. Drop a sad song recorded on vintage vinyl onto your turntable as you cry in your beer when your career takes a turn toward redundancy. You brought it on yourselves. Babies.