anorlunda said:
I presume that buffer overflow, heap management, and pointer validation are the shortcomings of C that lead to insecurity. But the broader implications make me curious.
For me, C was not the first programming language I learned, but it was by far the most influential and helpful language I have ever met. I have no doubt that the above quoted are shortcomings of C, but to be fair, the advantages and drawbacks of any programming language cannot be studied without reference to a specific period of time, to the specific hardware architectures and systems it runs or ran on - be it an application, a compiler or, more importantly, an OS - and, last but not least, to the networking and internetworking scene of the time considered. I think it does no justice to a language like C, which has directly and indirectly influenced, and in fact paved the way for, most modern software, to just call it an insecure language - I don't say that the original post does this, I just say it for clarification purposes.
As is widely known, C was originally designed for and implemented on the UNIX operating system on the DEC PDP-11 by Dennis Ritchie, and it is a descendant of B, which was in turn influenced by BCPL. At the time it was created, C fulfilled specific goals: a general-purpose programming language featuring economy of expression, modern control flow and data structures, and a rich set of operators. As a language near the "bare metal" but with a lot of high-level features, it was inevitable that much of its power could some day be used in innocently inappropriate but potentially dangerous, or purely malicious, ways.
This became evident much later, but in any case I don't think it is due to intrinsic weaknesses - at least not so much - as to the way software was developed, including libraries, each of which, to be fair, also needs an appropriate time frame to be judged fairly. There has been a lot of improvement through the ANSI standards over the years, but again, especially in the case of OSes like UNIX and its descendants and Windows, all of which have a form of C "ticking" at their heart, the complexity of the software itself, combined with high demands for speed and flexibility, does not leave a programmer much room to be absolutely strict about security standards. On the other hand, real gurus of C - and C++ for that matter - are, at least as far as I know, very few even nowadays, compared to the total number of professional programmers.
The point for me is that C has done a tremendous job in the software industry for decades, while hardware technology and networks / communications have, during this same time, improved so much. It is really difficult for me to think of another programming language that could have had such a long-standing positive impact on machines and OSes in widespread use.
anorlunda said:
- What other features of a programming language directly aid security of the products?
- Are the security implications of the language different for OS compared to other software?
- My bias leans toward KISS. I suspect compiler/library vulnerabilities in very high level languages that lead to insecurities in the infrastructure. Are there studies that quantify complexity versus security? I mean statistically, not anecdotally. Perhaps DOD studies on Ada.
For the first, I would just say that features that aid security may be at odds with speed, flexibility and simplicity for the programmer, at least when taken to the extreme. There is no silver bullet here, in my opinion. Absolute security can only come from hardware - as has already been noted - but this is not a silver bullet either, since we are not talking about dedicated OSes or software constructs / systems here, but about general-purpose ones.
For the second, I think that the security implications are definitely different for OSes compared to other software, given the operations / tasks the OS performs regarding the hardware, itself, and the software applications that run on it. The impact of security holes in an OS is compounded and magnified in many cases, and as the complexity of the OS increases, so does the difficulty of even identifying a potential hole. On the other hand, a security hole in an application is, in general and on average, more easily identified.
For the third, I think that KISS is a good general principle, but unfortunately it may mean different things to different people at times, and in any form it is not always feasible to follow. Software vulnerabilities are indeed, in my opinion, responsible for insecurities, but this is inevitable: hardware must be fast, cheap and in many cases non-dedicated - at least for systems in widespread use - so software must do the job under the pressure of fast development, quick-and-dirty solutions in many cases, flexibility and simplicity. I think the only thing anyone sees at the end of this tunnel is "Welcome to the vulnerabilities realm".
anorlunda said:
In a von-Neumann architecture, we compile and link programs to create executable code. That program is data to the compiler and linker. But then we sprinkle it with pixie dust and say that it is now code. Isn't that the mechanism for almost all modern malware?
It may lead to potential vulnerabilities, but with a lot of factors in between. The von Neumann architecture was a real innovation in my opinion, as it solved a whole lot of problems that allowed computers to get into widespread use. Now, a whole lot of factors, including but not limited to the evolution of hardware itself, programming and (inter)networking, come into play and mediate the final result. It isn't feasible to create cheap and secure hardware for widespread use, just as it is not feasible to create ideally secure software, for another multitude of reasons, nor to have complete control of a network and its potential internetworking. On the other hand, can non-von Neumann architectures be practically utilized to develop systems for widespread use? I think that, at least for the form of such systems we have today, there is no such case.