I am truly astounded by the hatred exhibited in some of the comments in this thread. Wow.
A few points:
If your software development experience is limited to web or mobile development, please understand you lack a massive amount of context when it comes to software in the context of complex hardware manufacturing. To put it in simple terms, you really don't know what you are talking about. It's like someone who writes code for embedded systems in clothes washing machines having a strong opinion about how to engineer, deploy and maintain a tech stack for a SaaS.
Next, the cults formed around programming languages are some of the funniest things I read on HN. Except, they are not funny at all. There is nothing wrong with C or C++. Nothing. The problem is people who don't know how to program. At the core, to perform a certain function the microprocessor will, for the most part, do pretty much the same things, regardless of what language might be used to describe that. Blink a light in response to a button being pressed? At some point you are sensing and de-bouncing the input and then setting an output pin. Maybe there's I2C, SPI or CANbus in between, but what you have to do is the same.
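For the sake of illustration, here is a rough C sketch of that button-to-LED path on a bare-metal micro; read_button_raw() and write_led() are hypothetical placeholders for whatever register access your part actually uses, not any vendor's API.

    #include <stdint.h>
    #include <stdbool.h>

    /* Hypothetical hardware accessors; substitute the real register
       reads/writes for your particular part. */
    extern bool read_button_raw(void);   /* raw, bouncy input pin  */
    extern void write_led(bool on);      /* output pin driving LED */

    #define DEBOUNCE_TICKS 20            /* e.g. 20 samples at 1 ms */

    /* Call at a fixed tick. The debounced state only changes after the
       raw input has held the same value for DEBOUNCE_TICKS samples. */
    void button_task(void)
    {
        static bool stable = false;      /* debounced state               */
        static bool last_raw = false;    /* previous raw sample           */
        static uint8_t count = 0;        /* consecutive identical samples */

        bool raw = read_button_raw();

        if (raw == last_raw) {
            if (count < DEBOUNCE_TICKS) {
                count++;
                if (count == DEBOUNCE_TICKS && raw != stable) {
                    stable = raw;        /* input has settled: accept it */
                    write_led(stable);   /* set the output pin           */
                }
            }
        } else {
            count = 0;                   /* input changed: restart count */
        }
        last_raw = raw;
    }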
Languages like C and C++ have been around for longer than most people reading this have been alive. They have been used --successfully and without issues-- for more projects, products and systems than anyone can imagine and at scales from a watch to spacecraft and everything in between. This cult of languages as solutions to problems caused by bad programming is nonsensical. One can write bad code in any language.
Professional certifications? They aren't going to stop people from writing bad code. At all. Designing and implementing complex code for real time systems is far more complex than designing a bridge or a building. Certifications are not magical inoculants against bugs and system design problems. Trying to view and fit software engineering from the context of other engineering disciplines is a mistake. They are not the same thing.
When you are dealing with fault-tolerant hardware + firmware systems things can get exponentially harder. Folks who work on web products have the luxury of being able to make lots of mistakes and fix them as fast as their deployment system allows. Imagine if you got ONE CHANCE to ship a non-trivial codebase and it has to be guaranteed to work to a high degree of certainty for ten or twenty years. Once you ship it, there will be millions of users and you cannot touch it at all. In fact, you have no telemetry data of any kind. You don't know if there are issues, if it has random failures, race conditions or odd failure modes you cannot possibly simulate at scale in the lab.
Imagine this next time you deploy an update. Imagine you are not allowed to touch it at all after that and it has to work. Imagine that, and maybe you can get a sense of what some of this work can be like.
This isn't limited to automotive. This goes for commercial, industrial, aerospace and other domains. Hardware is a million times more difficult than pure software products, particularly web type software. We just shipped a thousand units of one of our new products for installation in a large industrial facility. The firmware had to go through a year of testing before we dared ship even a single unit. That's another thing to imagine; the idea of having to test and qualify your web SaaS --not with clients as test subjects, on your own-- for a year or more before you can launch your business (and you are not allowed to touch it after that).
Anyhow. Don't be language cultists. That isn't the problem. Bad programmers and bad process is far more of an issue. Changing languages isn't going to magically fix that. It isn't a solution. You can build anything with C or C++. If written correctly, it will perform better than just-about anything and be reliable within the bounds of system specifications.
Counterpoint is that C and C++ are increasingly removed from what the hardware is actually doing and are now more of an emulation layer mired in legacy thinking rooted in 1970s-era memory and processing models.
Modern languages are trying to introduce constructs that make software safer and more performant on average, lessening the need for hardware chip makers to hack their way into performant execution of C/C++.
This is not to say C/C++ can’t make good software, it’s that we limit ourselves by clinging to them, and would be far better off with gradual adoption of modern low level languages, just as we shed Assembler for C over time.
Therefore, there are ways to write bad code. Do you think it's reasonable there may be better tools that prevent writing some cases of bad code?
I mean, I think we all can agree that, in general, a codebase with significant test coverage has fewer bugs than a similar codebase without test coverage. So, I don't think it's crazy to require, potentially via a tool, some degree of test coverage. Yes, some people will cheat by constructing some bad tests, but, in general, I think we should expect some level of quality improvement.
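To make that concrete with a toy example, here is a minimal sketch in plain C of the kind of automated check (and coverage target) being described; clamp() is just a hypothetical function under test.

    #include <assert.h>
    #include <stdio.h>

    /* Hypothetical function under test: clamp x into [lo, hi]. */
    static int clamp(int x, int lo, int hi)
    {
        if (x < lo) return lo;
        if (x > hi) return hi;
        return x;
    }

    int main(void)
    {
        /* Hand-picked edge cases; a coverage tool can then tell us
           whether every branch of clamp() was actually exercised. */
        assert(clamp(5, 0, 10) == 5);    /* value already in range */
        assert(clamp(-3, 0, 10) == 0);   /* below the low bound    */
        assert(clamp(42, 0, 10) == 10);  /* above the high bound   */

        puts("all checks passed");
        return 0;
    }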
Rust adds another tool that, on average, should improve quality.
Again, this is not a solution. This is not a silver bullet. Neither is depending on the programmer to write correct code.
> Therefore, there are ways to write bad code. Do you think it's reasonable there may be better tools that prevent writing some cases of bad code?
Bad code is written every day in every language. The root causes for bad code do not include the chosen programming language. If language were a root cause, every codebase using language <x> would have problems. Clearly that isn't the case.
Why do we write bad code then?
- Unintended errors
- Bad design
- Not understanding the problem being solved
- Not understanding the constraints
- Not understanding the hardware/environment
- Lack of skill
- Lack of knowledge
- Limited experience with the tools, frameworks, libraries, domain, etc.
- No understanding of lower-level concepts
- Lack of education (formal or self-taught, does not matter)
- Not having a full grasp of fundamentals
- Not being thorough or detail-oriented
- Etc.
There are a lot of reasons for which bad code is written. Language choice isn't one of them. It's an excuse. If baseline languages like C and C++ were so impossible to use properly Linux would be an unmitigated disaster. It is not.
In the end, the language isn't the problem, at all, it's bad software developers.
Don't take my word for it: Linus Torvalds [0] has been quite vocal about this very issue. In fact, he recoils at C++. And, frankly, having seen what comes out of, as he put it, substandard programmers, I could not agree more. I believe part of the problem is how we've been teaching programming. The first thing anyone coming out of school reaches for is complex class hierarchies, complex data types and layers of unnecessary crud.
I remember a project a long time ago that used Objective-C to implement a genetic solver. It was unusable due to just how slow it was. I re-coded it in C. It ran about 400 times faster on the same iPhone hardware. The first was the result of lazy uninformed programming. The new version was simple, to the point, performant and had no bugs. Not to mention it used massively less memory.
Here's something everyone needs to internalize: The processor --the hardware executing the code-- has no concept of object-oriented anything. No polymorphism, inheritance, complex data types, etc. When all the smoke and bullshit clears out, all it knows is a set of very simple and efficient operations we can weave together to do useful things. That's it.
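To make that concrete, here is a sketch (not any particular compiler's actual output) of what a "virtual method call" amounts to once the smoke clears: a struct of function pointers and an indirect call, written out by hand in C.

    #include <stdio.h>

    /* What "polymorphism" boils down to at the machine level: a table of
       function pointers and an indirect call through it. */
    struct shape;   /* forward declaration */

    struct shape_vtable {
        double (*area)(const struct shape *s);
    };

    struct shape {
        const struct shape_vtable *vt;  /* the "hidden" vtable pointer */
        double a, b;                    /* dimensions                  */
    };

    static double rect_area(const struct shape *s) { return s->a * s->b; }
    static double tri_area(const struct shape *s)  { return 0.5 * s->a * s->b; }

    static const struct shape_vtable rect_vt = { rect_area };
    static const struct shape_vtable tri_vt  = { tri_area };

    int main(void)
    {
        struct shape r = { &rect_vt, 3.0, 4.0 };
        struct shape t = { &tri_vt,  3.0, 4.0 };

        /* The "virtual call": load vt, load the function pointer, call it. */
        printf("%.1f %.1f\n", r.vt->area(&r), t.vt->area(&t));
        return 0;
    }

That indirection is all the hardware ever sees of "polymorphism".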
All the abstractions provided by languages are for the benefit of the programmer and have nothing whatsoever to do with code quality, correctness, bug content or suitability for a purpose. Which means most, if not all of that, isn't necessary.
Modern programmers probably have no clue about the range and scale of projects that were completed without any issues while not having access to things like TDD, objects, massive libraries, complex data types, decorators, etc. I mean, we have written operating systems, sent people to the moon, saved lives with medical equipment, developed consumer and industrial products and more. Funny how all of that was possible and people today go on about needing to use a better language.
In my opinion, there's only one area that can justify a new language: AI. Everything else is well covered with C and, if one must, C++. And no modern language fits the bill yet.
It is hard to define what an AI-first language might look like. Being that I used APL professionally for nearly a decade, I happen to think that a language based on a notation developed for AI might be the best idea. The reasons are similar to the reasons for which musical and mathematical notation allow for rich expression of ideas. In other words, the justification isn't "I need it to write better code", but rather that, as the field advances, we might very well need better ways to describe what we want the computer to do.
> Bad code is written every day in every language. The root causes for bad code do not include the chosen programming language. If language were a root cause, every codebase using language <x> would have problems. Clearly that isn't the case.
Every programming language has its own style of bad code. Java has enterprise OO overengineering. Haskell has monad obsession. C++ has enterprise OO template shenanigans. C has pointer craziness.
You should pick the programming language with an eye to minimizing issues for the task at hand. That means C is good for the things it is good at (obviously), but not for "everything but AI".
Yes, C can cover everything. Now go write a web frontend in C, and you will see that, even though you somewhat can, it is a huge pain.
And don't mistake me: I agree with you on OO huge frameworks being terrible, but I disagree with you at C being the pinnacle of programming for everything but AI.
BTW: the Rust borrow checker just ensures that you have properly made up your mind about what pieces of code own what data (and by owning, I mean who has the responsibility of calling free on it). I think a piece of software ensuring that is a fair improvement. Worst case, the good programmers will just explain it to the compiler via type annotations and continue with their lives. But they were going to do that anyway, weren't they?
Part of this is what I would describe as a self-fulfilling prophecy effect. C is hard. You can't use it without thought or you'll create code that is excellent at crashing entire systems. Given that current CS instruction graduates "OO-first" high-level-of-abstraction developers, it is only natural that the tools and frameworks they create start at that level, rather than a lower level that requires more work, design and planning.
One of my professors used to pound this idea that you should not write one line of code until you have devoted enough time to structural design and, more importantly, data representation. Of course, I am going back to a time when none of the modern languages existed. My choices at the time were assembler, Forth, C or APL. I won't even mention COBOL or FORTRAN as they were never part of my reality.
Today people think nothing of having functions create and pass entire objects, dictionaries and other complex data structures that consume memory, time and energy (in very real terms, Python requires nearly 80 times more energy to do the same thing when compared to C).
As this "contagion" doesn't seem to have an end in sight, CS reality today and in the future will move away from fast, space, time and energy efficient languages like C.
The energy component is something I have been more aware of over time as we start to become more concerned with carbon neutrality and related issues of climate change. While Python is great for what I am going to call "lazy" programming (you don't have to give it any of the thought required in C or C++) it is objectively terrible in terms of execution time, memory footprint and energy consumption. Do we want a world dominated by Python applications? At a large scale you would need more computers --lots more-- energy and resources to do the same things that could be done with more efficient options.
How many CS graduates come out of school with almost a Python-first education? Not to go too far, my own son, MS CS, spent nearly no time at all with assembler, Forth, C and C++ in school. It was crap like Java and utility like Python. Everything is worth learning, of course, but, give me a break. Thankfully he chose to listen to me and invested time and effort getting good at low-level programming. His first job out of school working for a very good company was C/C++ centric and he did great. Most of his school friends could not have landed that job.
To be clear. I am not opposed to reaching for a variety of languages. I have. I still do. What I do not do is blame a language for problems of design, logic and engineering I create. The language isn't the problem. It never is. I am the problem.
To beat it to death: It's like blaming a welding machine for making bad welds. So long as you don't have a truly horrible machine, someone with proper welding skills can make good welds.
I learned this one the hard way actually. I have a Miller 130XP MIG welder. It's a very basic entry-level 120 VAC MIG welder. I could not make a good weld to save my life. Believe me, I tried. I watched videos, practiced, even got some instruction. I sucked.
When I built my solar array I had to weld a number of custom brackets for the ground mount structure. Not trusting my capabilities I asked a professional welder I knew to do the welding. When he came over he saw my 130XP there and said "I'll just use your machine". He had monster machines in his truck. He proceeded to make stunningly good welds with a machine I was convinced was total crap. I could not believe my eyes.
I eventually took college-level classes on welding and got massively better. I can make good welds with that little machine now, because I know what I am doing. And, yes, I later bought a more efficient modern inverter-based ESAB machine. The new machine wasn't going to make me a good welder. Blaming the machine was the wrong mental framework.
> I remember a project a long time ago that used Objective-C to implement a genetic solver. It was unusable due to just how slow it was. I re-coded it in C.
Quick question. Why would you do such a thing? How can you square this with your claim that switching languages is never the answer?
I agree with you that a good development process is essential. The thing is, some languages make for a worse development process, because they inhibit automation of important steps in code review. I expect you wouldn't trust the weaker type systems of Forth or BLISS, or the unstructured control flow of COBOL, for a job where you could use C instead. If there are analysis phases that C can automate and these other languages can't, isn't it obvious (given only a moment's thought) that there could be other languages that improve on C?
Because that was the only way to write iOS apps. It sucks. It's unnecessary. And yet you could not avoid it for a good portion of an iOS app.
> you wouldn't trust the weaker type systems of Forth
I used Forth professionally for many years. I can't think of a single end-product issue caused by Forth. The range of applications I wrote went from device drivers and low level robotics code to full console-based applications, like specialized text editors.
> isn't it obvious (given only a moment's thought) that there could be other languages that improve on C
At the core, as I said before, these things only exist for the benefit of the programmer and generally don't exist in what I am going to call the real world inside the processor.
Here's an example: A bytes() object in Python and MicroPython is not mutable. It also happens to be one of the lightest data structures you can use if doing communications work. In a recent MicroPython project we needed to manipulate the data in bytes() objects without causing reallocation of new objects. I wrote a set of routines in ARM assembler to do just this. For example, take a bytes() buffer, calculate a CRC-16 of the data, add it to the end and modify values in the front based on this CRC calculation.
In other words: There is no such thing as a non-mutable data type.
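For illustration only, a plain C sketch of the kind of in-place sealing described above; the CRC-16/CCITT polynomial and the frame layout are assumptions, not the actual project code (which was ARM assembler).

    #include <stdint.h>
    #include <stddef.h>

    /* CRC-16/CCITT-FALSE: polynomial 0x1021, initial value 0xFFFF.
       (An assumption for illustration; use whatever CRC the protocol needs.) */
    static uint16_t crc16_ccitt(const uint8_t *buf, size_t len)
    {
        uint16_t crc = 0xFFFF;
        for (size_t i = 0; i < len; i++) {
            crc ^= (uint16_t)buf[i] << 8;
            for (int b = 0; b < 8; b++)
                crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                     : (uint16_t)(crc << 1);
        }
        return crc;
    }

    /* Seal a preallocated frame in place: CRC the payload, append the CRC
       big-endian after it, and stamp a header byte derived from the CRC.
       No new buffers are allocated anywhere. The buffer must have room
       for payload_len + 2 bytes. */
    void seal_frame(uint8_t *frame, size_t payload_len)
    {
        uint16_t crc = crc16_ccitt(frame, payload_len);
        frame[payload_len]     = (uint8_t)(crc >> 8);    /* CRC high byte */
        frame[payload_len + 1] = (uint8_t)(crc & 0xFF);  /* CRC low byte  */
        frame[0] ^= (uint8_t)(crc & 0xFF);               /* illustrative header tweak */
    }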
I think the simplest way I can put this is:
Bad code is the result of bad programming, not a consequence of the chosen language.
I should make it clear that I don't have a problem using any language (I have, many) or someone choosing to use whatever they like. My argument is that this idea of blaming solid, reliable, battle-tested languages like C for problems that are, in reality, the consequence of bad programming is dishonest.
> Bad code is the result of bad programming, not a consequence of the chosen language.
You've repeated this many times now. What I haven't seen you talk about is how to fix all this programmer badness. Do you think bad programmers should all be fired and blacklisted from the whole industry? Who will replace them? How do we make sure their replacements aren't just as bad as the ones who were fired? How do we even identify the bad ones before they write bad code? What if they want to become better programmers instead of getting fired? How do they figure out whether they're sufficiently good?
The value proposition of a new language is not measured strictly by how fast it lets you write new code. It also needs to be a medium for communication between programmers. Communicating the intent behind your code helps identify the ways it could be improved. If your intent can be codified in a way that even the computer can understand, that process speeds up dramatically.
Proponents of new languages are winning the debate over how to fix systemic problems in the software industry. The reason they are winning is because their opponents in this debate do not have a coherent solution. If you can suggest one, maybe you'll change everything.
> What I haven't seen you talk about is how to fix all this programmer badness.
Well, that wasn't part of the conversation until now.
In a sense, I have hinted at this. There are two elements, education and experience.
I firmly believe good programmers come from having a solid foundation built on low level code. That means a good progression might be assembler, Forth and then C. If I ask someone to explain how a list is stored and manipulated in memory in a language like Python, I expect at least a plausible explanation rather than a shrug. For me it doesn't even have to be absolutely correct to show me they have gotten their hands dirty with low level code.
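The sort of answer I'd hope for, sketched in C: a Python-style list is, roughly, a growable array of pointers to the element objects, something like the deliberately simplified structure below (field names are illustrative, not CPython's exact internals).

    #include <stdlib.h>

    /* Simplified sketch of a Python-style list: a small header plus a
       heap-allocated array of pointers to the element objects. Appends
       grow the pointer array with spare capacity so they stay amortized
       O(1). */
    typedef struct {
        void   **items;     /* pointers to the element objects */
        size_t   length;    /* slots currently in use          */
        size_t   capacity;  /* slots allocated                 */
    } list_t;

    int list_append(list_t *lst, void *obj)
    {
        if (lst->length == lst->capacity) {
            size_t new_cap = lst->capacity ? lst->capacity + lst->capacity / 2 + 1 : 4;
            void **p = realloc(lst->items, new_cap * sizeof *p);
            if (p == NULL) return -1;          /* out of memory */
            lst->items = p;
            lst->capacity = new_cap;
        }
        lst->items[lst->length++] = obj;       /* store a pointer, not a copy */
        return 0;
    }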
Forth is interesting not only because of the RPN paradigm; one can learn a lot from implementing it from scratch on any microprocessor. From there you go on to actually turning that into a useful console-based computer. For example, implement all the peripheral drivers, a file system, file manager, text/code editor, etc. I would not have anyone touch C until they have completed the prior work.
There's a reason for which companies like Google have seemingly crazy hiring processes for software developers: Our schools seem to be doing a crap job of training them. If that were not the case, there would be no need for such tests. A degree with a decent GPA would be enough.
> Do you think bad programmers should all be fired and blacklisted from the whole industry?
I am assuming that's not a serious question. There are bad doctors, attorneys, cops, teachers and carpenters. There is no such thing as equality of outcomes in anything. Hopefully the natural process in each domain expunges incompetence over time. That's the best we can hope for. Other than that, I can't tell you what we should do.
I have to go back to my premise (flipping it around a bit): A different programming language isn't going to magically turn a bad programmer into a good one; much like a $4000 computerized welder wasn't going to make me a better welder.
At the extremes, if someone doesn't know how to solve problems computationally, there's no language you can throw at them that will turn them into CS problem solvers.
> Proponents of new languages are winning the debate over how to fix systemic problems in the software industry. The reason they are winning is because their opponents in this debate do not have a coherent solution. If you can suggest one, maybe you'll change everything.
No. That's not correct. The reason we keep taking crazy rides up and down a bunch of languages is that developers are coming out of schools with skills that require them to start at that level. As I said in another comment, the first thing every recent grad reaches for is a complex object structure. Because that's all they know. They don't actually know we were doing things like sending people to the moon without any of that stuff. They think it's necessary. And so, they develop tools and frameworks "in their image", if you will. Which means that the entire thing is a self-fulfilling prophecy.
I remember one of the most impactful examples my son (a recent MS CS grad) experienced while working with me on a project. He needed a serial communications library for a robotics system we were building. He reached for a library and used it. The thing did not perform well and was giving us problems. That's when I became involved. The library consisted of, I don't know, two to four pages of classes, methods, etc. After understanding what it was doing I re-wrote what we needed in something like ten lines of procedural code.
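For context, "ten lines of procedural code" for a POSIX serial port looks roughly like the sketch below; the baud rate is a placeholder and error handling is trimmed to the basics.

    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    /* Open and configure a serial port: raw mode, 115200 baud, blocking
       reads of at least one byte. Returns the fd, or -1 on error. */
    int serial_open(const char *dev)
    {
        int fd = open(dev, O_RDWR | O_NOCTTY);
        if (fd < 0) return -1;

        struct termios tio;
        if (tcgetattr(fd, &tio) != 0) { close(fd); return -1; }

        cfmakeraw(&tio);              /* no line editing, no translation */
        cfsetispeed(&tio, B115200);   /* baud rate is a placeholder      */
        cfsetospeed(&tio, B115200);
        tio.c_cc[VMIN]  = 1;          /* read() blocks for >= 1 byte     */
        tio.c_cc[VTIME] = 0;

        if (tcsetattr(fd, TCSANOW, &tio) != 0) { close(fd); return -1; }
        return fd;                    /* plain read()/write() from here  */
    }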
The unnecessary bloat in various programming language ecosystems is something that should give anyone pause. I mean, you see things like someone creating an entire object hierarchy with methods and properties for what amounts to managing a few thousand bytes of data in an array in memory. Instead of a raw close-to-the-machine for loop iterating through the data you end up with a dozen objects instantiated, copious properties, layers of methods and... well, you get the point (I hope).
> Well, that wasn't part of the conversation until now.
Sure it was. You kept saying "this isn't a solution." All I did was point out the converse.
I experienced the kind of "modern" CS education that you consider a failure, and believe it or not, I have about as much scorn for it as you do. I did get some good exposure to serious analysis and low-level programming, but the other half of what I learned was pretty much a waste and I had to unlearn it the hard way after leaving school. So, I'm not here to defend the Java idiom of mile-high towers of superclasses, runtime polymorphism that nobody will ever need, or wild pointer goose chases that accomplish nothing but stalling the pipeline. All that stuff is a waste of everyone's time. I like my compiled code to stay lightweight and close to the metal.
I am here to defend expressive type systems that permit detailed annotations of what should or shouldn't be done with a particular piece of data. I shouldn't have to rely on comments alone to say "the pointer returned by this function must be freed by the caller" or "the pointer returned by this function must NEVER be freed by the caller." I want the compiler to understand me when I say these things, and I want it to enforce my rules.
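To spell out what I mean, here is the comment-only version of that contract sketched in C; the function names are hypothetical.

    #include <stdlib.h>
    #include <string.h>

    /* Returns a newly allocated copy of s.
       CONTRACT (comments only; nothing enforces it): the CALLER owns the
       result and must free() it. */
    char *dup_string(const char *s)
    {
        size_t n = strlen(s) + 1;
        char *p = malloc(n);
        if (p != NULL) memcpy(p, s, n);
        return p;
    }

    /* Returns a pointer into a static internal table.
       CONTRACT: the caller must NEVER free() the result. */
    const char *error_name(int code)
    {
        static const char *names[] = { "OK", "TIMEOUT", "CRC_ERROR" };
        return (code >= 0 && code < 3) ? names[code] : "UNKNOWN";
    }

Calling free() on error_name()'s result, or never freeing what dup_string() returns, both compile without a peep; that is exactly the class of rule an ownership-aware type system can check mechanically.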
C doesn't have that, obviously, but I'm sure you already know that it was a huge change from its immediate predecessors, which didn't have types. Even early C didn't have structs. It added these features because they made programming less error-prone. Nobody wanted to manually calculate field offsets and risk getting it wrong.
That was 50 years ago. There have been missteps since then (I think every PL theorist counts OOP among these) but that doesn't mean C has to be the absolute last systems language ever in the history of computing.
It isn't a failure. It simply isn't optimal. Just my opinion, of course. Which means I could be very wrong.
> "the pointer returned by this function must be freed by the caller" or "the pointer returned by this function must NEVER be freed by the caller."
Those two are examples of bad programming. Both constraints are almost guaranteed to create, at a minimum, an unmanageable mess and, at worst, very dangerous software (think embedded controller for a robot or a rocket).
While I have not looked, I would be very surprised if something like that existed in the Linux codebase, which is C.
> that doesn't mean C has to be the absolute last systems language ever in the history of computing.
I don't think I have suggested this at all in this conversation.
My position is very simple: Don't blame the language for bad programming. This is rarely the problem.
The fact that someone can cause a mess using pointers does not mean pointers are the problem. They simply don't know what they are doing. The Linux codebase uses pointers everywhere, right? Is it a mess? No. Maybe that's because they are using the language correctly.
I am also not saying that all work in evolving programming languages should stop because C is perfect. Not the case I have made at all. What I will say is that --again, my opinion-- quite a few of the modern paradigms are complete nonsense in support of lazy programmers.
The question to ask might be something like:
Would you have been able to create the software without bugs using C?
If the answer is "yes", then it is likely the newfangled language was not needed or justified.
As an observation, pretty much all of these languages look like C. Outside of assembler, the only three languages I have used in my career that explored other paradigms were Forth, LISP and APL. I used these languages professionally for about 10 years. Not one of the modern languages people rave about has done anything to truly push the mark forward. In many ways APL was, in my opinion, the pinnacle. Its problem was that it surfaced way ahead of hardware being able to embrace it. For example, this code would take half a page of crud to create with any of the C-derivative modern languages:
+/⍉10 10 ⍴ ⍳100
And we were able to do this FORTY years ago.
What does it do?
Reading right to left: ⍳100 creates a vector of 100 consecutive numbers, 1 to 100; 10 10 ⍴ reshapes that vector into a 10x10 matrix; ⍉ transposes the matrix; and +/ sums along each row of the transpose, which gives you the ten column sums of the original matrix.
Of course, this is a super-simple example of something that isn't necessarily ground-breaking on first inspection. Anyone should be able to reproduce this result with reasonable efficiency using C or derivatives. Not one line, but, who cares, that's not the important metric.
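For comparison, a rough C sketch of the same computation (sum the rows of the transpose, i.e. the column sums of the original 10x10 matrix of 1 to 100):

    #include <stdio.h>

    int main(void)
    {
        int m[10][10];

        /* 10 10 ⍴ ⍳100 : fill a 10x10 matrix with 1..100, row-major */
        for (int i = 0; i < 10; i++)
            for (int j = 0; j < 10; j++)
                m[i][j] = i * 10 + j + 1;

        /* +/⍉ : summing the rows of the transpose is the same as summing
           the columns of the original, so skip the explicit transpose
           and just sum down each column. */
        for (int j = 0; j < 10; j++) {
            int sum = 0;
            for (int i = 0; i < 10; i++)
                sum += m[i][j];
            printf("%d ", sum);       /* prints 460 470 ... 550 */
        }
        putchar('\n');
        return 0;
    }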
One of the most interesting demonstrations of APL for those who have never seen it in action is this implementation of Conway's Game of Life. Very much worth watching:
Once you internalize APL (which does not happen through casual use) it changes the way you think about how to solve problems computationally. You, quite literally, think at an entirely different level. Your brain visualizes data structures and implementation in a very different form. The closest equivalent I can reach for is using musical notation to describe music; which, of course, requires someone to have internalized the connection between notation and music.
This is where I was going when I said that we likely need a better programming paradigm for AI. That, in my opinion, can easily justify a new language. And, yes, I am biased --ten years of APL use will do that to you-- I think it has to be symbolic. Quite literally, a new language --just like musical notation is a language for music.
Going back to my core premise and the title of this thread: Throwing Rust at software development isn't going to fix bad programming. I don't see the point. And, yes, I could be wrong. If all hope of having capable programmers is gone, then, yes, of course, we need to make sure they don't make a mess just 'cause they have no clue.
Just like health inspections and business licenses don't stop nasty food joints from opening their doors, or folks from selling stuff on the sidewalk, yet those places do eventually get shut down.
Not sure I know what that means. To clarify, the cultist behavior comes in when there are tools that are perfectly fine and people insist on using something else.
> i want modern, secure languages with good paradigms and good package managers which have nice syntactic sugars and actual string types, memory checking and garbage collection.
Sorry, you lost me at garbage collection. Nobody I know who does serious real time embedded wants to touch garbage collection with a ten foot pole.
Everything you mentioned above is for the benefit of the developer. None of these things are necessary to build good software, at all.
The proof is in history. Not to go too far, Linux: C.
BTW, that's not to say we don't use other languages here. I have personally worked with everything from multiple assembly languages through Forth, C, C++, Visual Basic, LISP, APL, JS, Objective-C, PHP, Python, MicroPython and who knows what else. We just haven't turned any of the above into a religious belief.
In that context, with that level of experience, from consumer to aerospace, time and time again I find myself making the same observation about C and C++ being more than adequate for just about everything. We are reinventing the wheel to accommodate less capable or less knowledgeable software developers (or both).