Security Versus Programming Language


anorlunda

Mentor
Insights Author
Gold Member
The following quote caught my eye.

http://catless.ncl.ac.uk/Risks/31/40 said:
It seems clear that trying to write secure operating systems in C does not
work. Very smart people have tried for 50 years, and the solution to the
problem is not reduced to practice.
I presume that buffer overflow, heap management, and pointer validation are the shortcomings of C that lead to insecurity. But the broader implications make me curious.

  1. What other features of a programming language directly aid security of the products?
  2. Are the security implications of the language different for OS compared to other software?
  3. My bias leans toward KISS. I suspect that very high level languages carry compiler/library vulnerabilities that lead to insecurities in the infrastructure. Are there studies that quantify complexity versus security? I mean statistically, not anecdotally. Perhaps the DoD studies on Ada.

I realize that clarity and structure influence program quality, and thus indirectly influence security. I am asking about direct factors, not indirect.
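
For concreteness, here is a minimal C sketch (a made-up illustration, not from the article) of the buffer-overflow class of bug: nothing in the language stops the copy from running past the end of the buffer.

Code:
#include <stdio.h>
#include <string.h>

/* Classic C buffer overflow: strcpy performs no bounds checking, so a
   long enough input silently overwrites whatever sits next to buf. */
void greet(const char *name) {
    char buf[8];
    strcpy(buf, name);              /* no length check at all */
    printf("hello %s\n", buf);
}

int main(void) {
    greet("a-very-much-too-long-name");   /* undefined behaviour; in a real
                                             program this is exploitable */
    return 0;
}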
 
10,811
4,349
One common problem in the early days of the web was form entry fields where the web application would take whatever was typed and store it. The problem was that folks would try crazy stuff like entering strings containing backticks, which is one way shell scripts execute commands:

In bash, the line:

echo date is: `date`

would first run the date command and stuff its output into the echo statement for it to print. (The backtick is the character on the leftmost key next to the 1 key on a US keyboard; your keyboard may vary.)

They call these kinds of attacks injection attacks. Consider now what happens if the entry field is placed inside a SQL INSERT statement: with clever quoting you could run a SQL statement inside the SQL statement, or any valid Linux/Unix command for that matter. Alternatively you could mess with the SQL statement itself, as shown in the example below.
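
A minimal sketch of what that can look like, assuming a made-up comments table and a query built by naive string concatenation:

Code:
#include <stdio.h>

int main(void) {
    char query[256];
    /* Imagine this string arrived straight from a web form field. */
    const char *user_input = "x'); DROP TABLE users; --";

    /* Naive approach: paste the input directly into the statement. */
    snprintf(query, sizeof query,
             "INSERT INTO comments (body) VALUES ('%s');", user_input);
    printf("%s\n", query);
    /* Prints:
         INSERT INTO comments (body) VALUES ('x'); DROP TABLE users; --');
       i.e. the attacker's DROP TABLE now runs as a statement of its own,
       and the leftover quote is hidden behind the SQL comment marker. */
    return 0;
}

The usual fix today is a parameterized query (for example sqlite3_prepare_v2 plus sqlite3_bind_text in SQLite), so the input can never be parsed as SQL at all.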

So now the prudent thing to do is to validate the user input and escape anything that is not alphanumeric; hence the escape codes (such as &gt; in HTML, or %3E in URLs) you see today.
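
In that spirit, here is a small hypothetical C helper that keeps alphanumerics and percent-encodes everything else, the way URLs do (the function name is made up for illustration):

Code:
#include <ctype.h>
#include <stdio.h>

/* Hypothetical helper: keep alphanumerics, percent-encode everything else. */
static void url_encode(const char *in, char *out, size_t outsize) {
    size_t o = 0;
    while (*in != '\0' && o + 4 < outsize) {
        unsigned char c = (unsigned char)*in++;
        if (isalnum(c))
            out[o++] = (char)c;
        else
            o += (size_t)snprintf(out + o, outsize - o, "%%%02X", c);
    }
    out[o] = '\0';
}

int main(void) {
    char buf[128];
    url_encode("date is: `date`", buf, sizeof buf);
    printf("%s\n", buf);    /* the backtick comes out as %60, the space as %20 */
    return 0;
}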


One of the most famous examples of hacking was the Cliff Stoll story, where a hacker entered a system through a compromised user account, created a shell script called bin, changed the shell's field separator so that / was treated like a space, started editing a file, and then hung up.

The vi editor, thinking the phone line had dropped (a common occurrence with phone modems), would try to preserve the file for the user. Keeping privacy in mind, it entered superuser mode, saved the file to the /tmp directory so that only the user could access it, and lastly launched the /bin/mail command in superuser mode to notify the user that the file had been saved.

Because the hacker had changed the separator, the system parsed /bin/mail as 'bin mail' and so ran the hacker's bin script in superuser mode, allowing him to do anything on that system. He used this to create a new account, stay superuser, and hide from the normal security scans.

Cliff uncovered the deception when he was given the assignment of tracking down a $0.75 billing discrepancy that would pop up from time to time. He noticed that one professor's account was being accessed even though the professor was on sabbatical at another university and hadn't signed on for over a year, and that each time the account was active for only a few seconds before disappearing.


The full story is told in Cliff Stoll's book The Cuckoo's Egg, and there is also a NOVA show about it from the 1990s.

 

anorlunda

Mentor
Insights Author
Gold Member
It might be nitpicking, but I could write a secure function SECURE_TEXT_FIELD to eliminate the injection problem. But then I depend on the programmer to use the function everywhere, and third parties will be forever suspicious that he did not.

A language feature does not depend on the programmer's discipline. For example, automatic garbage collection, or automatic real-time bounds checking.
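
To make that concrete, here is a hypothetical sketch of such a wrapper in C; the name is invented, and the weakness is exactly the one described above: nothing in the language forces every call site to go through it.

Code:
#include <ctype.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical SECURE_TEXT_FIELD: returns a newly allocated copy of the
   input with everything except letters, digits and spaces stripped out.
   C does nothing to force every call site to use this function
   (or to free the result). */
char *secure_text_field(const char *raw) {
    char *clean = malloc(strlen(raw) + 1);
    if (clean == NULL)
        return NULL;
    size_t n = 0;
    for (; *raw != '\0'; raw++) {
        unsigned char c = (unsigned char)*raw;
        if (isalnum(c) || c == ' ')
            clean[n++] = (char)c;
    }
    clean[n] = '\0';
    return clean;
}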
 
1,091
193
Firstly, a correction if I may: the article took a bit of finding, as the fragment part of the URL is incorrect. It should be http://catless.ncl.ac.uk/Risks/31/40#subj6.

IMHO this is an intractable problem (and it seems that the author may agree with this, as he refers to some failed alternatives). The fundamental requirement of any language for writing an operating system is that it can do anything with the hardware, and with this comes the possibility of doing bad things. Adding the unnecessary but highly desirable requirement that the OS should do everything as quickly as possible makes the situation worse. Abstracting this ability away under a higher-level language does not help: the problem then becomes 'how do we create a secure language for creating operating systems', with exactly the same challenges as 'how do we create a secure operating system', only now we have made it even harder to satisfy the performance constraint.

KISS is great, but unfortunately a further requirement for a (general purpose) operating system is to work on a range of hardware, both core and peripheral, and implement the latest standards for communication with other systems.

Hardened OSs turn these requirements upside down, placing security at the top of the pile. They only work on specific hardware, only communicate with other systems under the same level of control, and any performance issues are dealt with by throwing $$$ at the problem.
 
1,091
193
A language feature does not depend on the programmer's discipline. For example, automatic garbage collection, or automatic real-time bounds checking.
These language features do not generally work well at the OS level - what happens when the automatic garbage collection interrupts a time-critical IO operation?
 

Klystron

Gold Member
These language features do not generally work well at the OS level - what happens when the automatic garbage collection interrupts a time-critical IO operation?
Before the job title was denatured, a programming team might consist of a 'systems programmer' working hand-in-glove with groups of application programmers under the general supervision of a software engineer. The systems person worked hard to avoid conflicts and optimize efficient I/O streams, freeing application programmers to concentrate on apps. The software engineer conducted periodic software walk-throughs, paying special attention to timing and synchronization. Multi-threaded parallel processing enhanced I/O throughput and computation efficiency at the expense of increased complexity.

Later, with network engineers, database engineers and configuration management (CM) build engineers collaborating on large projects, systems programmers might install OS patches and updates, configure browsers and email, and perform jobs too technical for MIS (management information specialists), but the role rarely required original code.
 

anorlunda

Mentor
Insights Author
Gold Member
Abstracting this ability away under a higher level language does not help - the problem then becomes 'how do we create a secure language for creating operating systems' with exactly the same challenges as 'how do we create a secure operating system', only now we have made it even harder to satisfy the performance constraint.
I think we are in agreement that we can't blame insecurity on the choice of programming language. The comment referenced in the OP, however, implies that insecurity is the fault of the C language.

Performance is a different subject. The desire for performance is hardly unique to the OS. Also, no matter how much faster CPUs become over the years, speed never seems to provide a cure for bugs or insecurity. That makes me skeptical of the suggestion that security and performance are tightly linked.
 

Klystron

Gold Member
Exactly like that, Peter. Systems programmers once worked directly on operating systems -- though I am referring to the time after they had to physically reroute wiring -- coding in assembler and machine specific job control languages. A new device meant writing or modifying a device driver. Like welding on an assembly line, the job has been automated.

Definition: denatured: to deprive (an object) of its natural character, properties, status, etc.
 
