I recently posted a list of talks about C and C++ safety. As anyone who works in the safety-critical space knows, this isn’t a new issue. It’s just finally getting attention outside of the software engineering discipline. As Bob Martin has been predicting for something like 50 years, we either get our act together or the government will impose regulations. The latter is happening and is likely to continue.
C and C++ are unsafe languages. This isn’t news and hasn’t changed since the inception of these languages (1972 for C, 1979 for C++). They are not memory safe, they are not type safe, and they contain a long list of undefined behavior. They have foot guns. This doesn’t make them useless.
So what has changed since C became the system programming language for most of the computing universe? Well, pretty much everything, except our primary system programming languages.
Let’s start with the obvious. Software is everywhere today. Just looking around my office, I can count more than 25 devices with software in them. And I believe all of them are running code that was compiled from C. My workstations, the clock on the wall, the smart light bulbs, my watch, my oscilloscope, my power supply, my USB and Thunderbolt devices, my cell phone and my landline phone, my wireless headphones, my UPS, my ethernet switches, my monitors, the thermostat, Raspberry Pis, my e-bike in the hallway, etc.
Another obvious fact… almost all of these devices have Internet connectivity. They present an attack surface. And security is a prerequisite to safety.
And another… the scale of the systems we’ve built is astounding and has happened in a short period of time. Systems we’re using today were unimaginable 20 years ago.
A key change from a safety perspective follows from the first obvious one: software has deeply penetrated many safety-critical systems, and Internet connectivity has given those systems an attack surface. This has happened fairly quickly. Cars in the 1970s had no software at all. That was largely true in the 1980s as well. Today, a typical luxury car has somewhere in the neighborhood of 150 microcontrollers running independently developed software. In the 1970s we didn’t have ATMs. We didn’t have Wi-Fi. We didn’t have the public Internet.
C and C++ simply aren’t the right tools in safety critical systems. I say this as someone who loves these languages and has spent a career writing code in these languages.
There’s been a lot of hand-wringing over this for some time, but much of it has occurred within the enclaves of particular industries: those where safety is required and, in particular, regulated. Medical devices, aviation, automotive, industrial, et al. If you’ve never worked in one of these industries, the odds that you’ve thought at all about software safety are quite low. If you have, you’ve seen all of the process and tooling required to certify C and C++ code. And in many environments, the power of these languages is dramatically reduced by allowing only a subset of the language and/or standard library, not to mention the “no dynamic memory allocation” rule present in many environments.
The reality is that C and C++ are eventually going to disappear from safety-critical systems. They will not become memory safe, type safe, or free of most of their undefined behaviors; if they did, they wouldn’t be what we think of as C and C++. We already have significantly safer languages, some of them quite mature (Swift, for example), and C++’s inertia will prevent it from competing. On cost alone, C++ will eventually lose here.
The other reality is that we don’t yet have a suitable replacement. The recent work to certify Rust at ASIL D (ISO 26262) is encouraging, but we’re still in a state where we can’t use it everywhere (many deployed microcontrollers aren’t supported by the Rust compiler), and of course we can’t just rewrite everything. Is Rust worth a look on green-field projects? Of course. I can say much the same for Swift. But in both cases, we don’t yet have all of the surrounding infrastructure to use either language in a certification-required environment.
C and C++ are going to be present in safety-critical systems for a long time. Well beyond the end of my career and probably yours as well. Victims of their own success.
Which brings us to the real questions…
1. Can we make our existing C++ code safer?
2. When might we see a safe successor?
1: Yes, of course.
2: I think a good successor is probably 10 years away. That happens to match Chandler Carruth’s prediction for a fully usable version of Carbon, one of many potential successors. It seems an absurdly long time, but until we have regulated accountability, it’s going to be difficult, if not impossible, to argue the return on investment in many environments. And without regulations, there’s going to be a much smaller group of us doing anything at all about it.
In the interim, I expect much “business as usual” in safety-critical systems. We’ll use MISRA et al. and all of the processes we see in the industries where safety is already critical. What might change is the adoption of safety processes and standards in more industries. I kind of shudder at the thought of imposing third-party audits on (for example) all IoT devices. At the same time, I’m very aware of the risks of unsafe and insecure software. And of course I worry about government making ignorant legislation. But at this point in civilization, the horse has left the barn. And as Robert Martin has pointed out a zillion times, we brought this attention on ourselves.
If a memory-safe successor to C++ is 10 years away, the hand-wringing will continue, and I would expect at least some liability legislation in the U.S. and the E.U. before those 10 years are up. I expect some of it to be controversial, some of it to be resolved only in the courts, etc.
I’m not a gloom-and-doom sort of person. I’m actually fairly optimistic here. For one, it means those of us who already care deeply about the quality of our work will be valued. Secondly, civilization has always had a tendency to be reactive instead of proactive; crisis is the mother of invention. We have many new ideas, and languages emerging to express and test them: Swift, Rust, Val, Carbon, Circle, and Cpp2, for example. And of course we’ll get C++26 and C++29, but C++ will never be a memory-safe language.
Speaking of that, I encourage others to read “The Meaning of Memory Safety”.
Edit: I just saw Timur Doumler’s “C++ and Safety” talk from CppNorth 2023. He covered much of what I’ve been thinking about for the last 2 to 3 years. I take all of this as a good sign. Most of us know where we are and what the limitations are. Some of us have to care because we work on safety-critical systems, while others (say, a typical HPC application) have no safety requirements. Timur did a nice job of highlighting the tradeoffs and the audiences.