How many instructions are there ?

  • Thread starter oldtobor
In summary, the thread starter claims there are 100 billion equivalent IBM PCs of computing power in the world, that 90% of software problems were already solved in 1981, and that BASIC was a good language.
  • #1
oldtobor
I was trying to figure out how many instructions there are in the world. Now cobol is said to have 100 billion lines of code and I guess all the other languages combined could reach maybe 100 billion lines of code. All other languages include fortran, c, c++, java, basic, and all the microcontroller programs written in assembler.

Now that is without repetition. I mean that is considering just one sample of each distinct program. So maybe 200 billion lines of code which, translated into assembler, may then reach A TRILLION LINES OF ASSEMBLY LANGUAGE IN ALL!

Now if you consider repetition you can reach 10 to the 18 lines of assembler code in the whole world. Wow, that is a lot of code floating around.

Now who on Earth is going to maintain and take care of it all ?

I like big numbers, so sometimes I try to calculate how many equivalent IBM PCs of computing power are currently installed in the world. If you consider that the 1981 model could do 300 thousand instructions per second, today you could maybe estimate at least 100 times that for each PC. So there are a billion computers in the world today, so you get 100 billion equivalent IBM PCs of computing power floating around in the world. WOW! that is a lot!

If you consider that the first BASIC programs that just opened a file and printed out all the lines containing a string with the famous INSTR($Target,$Pattern) instruction could already handle a few thousand records in 1981, THAT IS TO SAY THAT 90% OF ALL REAL SOFTWARE PROBLEMS WERE ALREADY SOLVED IN 1981 ON THE SIMPLE IBM PC WITH THAT GREAT LANGUAGE CALLED BASIC, you can see how much excess capacity is just hanging around. All software problems had basically been solved already by the early 1980s.

Today the same problem is solved in DOS by executing this simple program in perl:


Code:
c:\>perl -ane"print if/put your pattern in here/" inputfile

You can also naturally run it off a Unix prompt. So most software has been done in one line...
 
  • #2
:uhh: Most software problems were solved in BASIC in 1981... Right. Because 90% of all software problems are just pattern matching in text. Riiiight. Who are you trying to kid, oldtobor?

- Warren
 
  • #3
Hmm, I'd have to disagree with you, Oldtobor. New problems are emerging in computing every day, and BASIC is probably one of the worst languages ever devised; things started getting good when C was developed.

What does it matter how much code is out there? None of it will be maintained forever; it will all be discarded and replaced by new, more capable code. People work that way too, you know?

You remind me of those people that say modern medicine hasn't done anything positive for humanity. But here we are, living on average twice as long as we used to.

Good grief.

- Old MacDonald
 
  • #4
Actually it is the exact opposite: Basic was (and still is; there are a lot of versions available on the internet to download) a very good language, which is why people used it to program on PCs during the 1980s and until the early 1990s.

Just some programmers were disorganized and created sloppy code. Just because one guy said Basic was bad, everyone started saying the same thing, what clueless people!

And actually turbo pascal was even better, AND IT WAS WHEN C STARTED TO BECOME POPULAR THAT IT STARTED TO BECOME HARDER TO PROGRAM. C SUCKS, IT IS USELESSLY HARD! The same programs I could do in an hour in pascal took many more to do in C because of the crappy pointers, memory management etc. What crap, and C++ is even worse! Java and object oriented programming is a huge PILE OF HYPE! That is why there are still 100 billion lines of cobol, because good programs are created with easy languages. Now back to the main topic:

Since there are a trillion instructions in the world, how are you going to show Ballmer that you didn't copy some of them to create Linux ? So he wins and Linux becomes just another Microsoft product.

Aside from the fact that Ballmer probably wrote Linux too back in 1990, and Torvalds (the Finnish communist) just stole his code.

Now let's bring this up to another level. How many transistors are there in the world ? So a cpu can have a million, so you get 10 to the 15 transistors, but then you must consider that there are a trillion electronic machines in the world; look at your washing machine, refrigerator, car, watch etc.

So then maybe 10 to the 18 transistors. WOW! The same number as assembler instructions ? No, something must be wrong.

THIS IS ALL CALLED EXCESS CAPACITY.
 
  • #5
Oh, I get it. You're not trying to make a valid point about anything -- you're just insane.

Let's examine your insanity, bit by bit.

People used BASIC in the 1980's because it is the best language that has ever existed (having nothing to do with the fact that better languages did not yet exist).

Object oriented programming is a "pile of hype" because of crappy pointers (even though many object oriented languages don't have pointers).

There are more lines of Cobol than any other language because Cobol is better than all other languages (even though Cobol was just one of the first languages in existence).

Ballmer wrote Linux in 1990, and Torvalds stole it (even though Ballmer isn't a programmer, Torvalds wrote the original Linux kernel from scratch, and Microsoft has never produced a Unix-like operating system).

Because there are almost as many transistors as instructions in the world, we have loads of excess capacity (even though the number of transistors and the number of instructions have no causal relationship to one another).

Gee, oldtobor... it almost seems like all you ever do on PF is post incoherent rants about how things were better so long ago than they are now. Oddly, you're using a modern web browser running on a graphical operating system, connected nearly instantaneously to thousands of people all around the globe to do it.

- Warren
 
  • #6
chroot said:
Ballmer wrote Linux in 1990, and Torvalds stole it (even though Ballmer isn't a programmer, Torvalds wrote the original Linux kernel from scratch, and Microsoft has never produced a Unix-like operating system).

Microsoft wrote XENIX.

Now let's bring this up to another level. How many instructions have been executed since 1945 ? so a billion computers running a million instructions per second would make 10 to the 15. Multiply by 10 to the 7 seconds and you get 10 to the 22 instructions have been executed in the world. WOW! that is a lot of instructions:

THAT IS CALLED EXCESS CAPACITY.
 
  • #7
FYI, a processor always needs to be running an instruction, all the time. When there's nothing to run, it runs an empty instruction (a NOP).

What would be a waste is to have it run empty instructions instead of running something useful.
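
To picture it, here's a hedged bare-metal sketch in C (GCC-style inline assembly and an x86-style NOP assumed; a real OS idle loop would HLT or wait for an interrupt rather than spin):

Code:
/* Hedged sketch of an idle loop: when there is nothing useful to run,
   the CPU still executes instructions, here empty NOPs. */
int main(void)
{
    for (;;)
        __asm__ volatile ("nop");   /* one "empty instruction" per pass */
}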
 
  • #8
oldtobor said:
Microsoft wrote XENIX.

No, it didn't. Microsoft bought it from AT&T. Your facts are always wrong.

- Warren
 
  • #9
How many instructions are there? Just one. The proof:
Code:
#include </dev/tty>

Invoke the compiler, carefully prepare user inputs, and—voilà!—a program that plays chess. Do it again, and presto changeo, an accounting program.
 
  • #10
oldtobor said:
And actually turbo pascal was even better, AND IT WAS WHEN C STARTED TO BECOME POPULAR THAT IT STARTED TO BECOME HARDER TO PROGRAM. C SUCKS, IT IS USELESSLY HARD! The same programs I could do in an hour in pascal took many more to do in C because of the crappy pointers, memory management etc. What crap, and C++ is even worse! Java and object oriented programming is a huge PILE OF HYPE! That is why there are still 100 billion lines of cobol, because good programs are created with easy languages.
You take so long to program in C because you cannot program in C properly. Take some time and become proficient in the language.
A Basic programmer claiming that C sucks...

Your thread is useless. And you claim that Linus is a communist... I pity your ignorance.

oldtobor said:
Now let's bring this up to another level. How many instructions have been executed since 1945 ? so a billion computers running a million instructions per second would make 10 to the 15. Multiply by 10 to the 7 seconds and you get 10 to the 22 instructions have been executed in the world. WOW! that is a lot of instructions:
Can't you write exponents properly?
 
  • #11
This thread cracks me up :tongue2:
 
  • #12
Why is Java a lot of hype again?
 
  • #13
Java is aggrandized as some major revolution in computing by a lot of people and organizations. It's not a bad technology, but it's not some supreme answer for everything under the sun.

The fact that it's easier for new programmers is great, but I think they tend to view it with cult reverence, which is silly. I used to be in that camp at one time, even. But then I grew up and realized that there are better tools. Plus, there is nothing better than C/C++ for low-mid level programming.
 
  • #14
eieio said:
The fact that it's easier for new programmers is great, but I think they tend to view it with cult reverence, which is silly. I used to be in that camp at one time, even. But then I grew up and realized that there are better tools. Plus, there is nothing better than C/C++ for low-mid level programming.

In other words, he was a Java cheerleader until I introduced him to Python. Now he's a Python cheerleader. :tongue:

- Warren
 
  • #15
Anyone ever play Yeager 2.0 on an old IBM PC? That remains the best flight simulator of all time...
 
  • #16
chroot said:
In other words, he was a Java cheerleader until I introduced him to Python. Now he's a Python cheerleader. :tongue:

- Warren

Touché Warren, even if it is a bit of an anachronism.
 
  • #17
Hey SNOBS take a look at the Java bytecode:

Code:
13: if_icmpge 31
16: iload_1
17: iload_2
18: irem              # remainder
19: ifne 25
22: goto 38
25: iinc 2, 1
28: goto 11
31: getstatic #84;     //Field java/lang/System.out:Ljava/io/PrintStream;
34: iload_1
35: invokevirtual #85; //Method java/io/PrintStream.println:(I)V
38: iinc 1, 1
41: goto 2

looks a lot like BASIC doesn't it ? What, all those GOTOs ? How dare you, but then again ALL THE ASSEMBLER INSTRUCTIONS FOR ALL CPUS ARE JUST A BUNCH OF GOTOS WRITTEN LIKE JMP , JNZ, etc.

But then again GOTOs are even more understandable than the DIRECT JUMPS or the JUMPS IF ACCUMULATOR IS ZERO, etc.

At least you know that the code is going someplace else. Get a clue and don't follow all the HYPE.
 
  • #18
oldtobor said:
Aside from the fact that Ballmer probably wrote Linux too back in 1990, and Torvalds (the Finnish communist) just stole his code.

hahaha... Ballmer is a business manager, I doubt he has written a line of code in his life. You are totally mad.

oldtobor said:
At least you know that the code is going someplace else. Get a clue and don't follow all the HYPE.

Crackpot
 
  • #19
oldtobor said:
looks a lot like BASIC doesn't it ? What, all those GOTOs ? How dare you, but then again ALL THE ASSEMBLER INSTRUCTIONS FOR ALL CPUS ARE JUST A BUNCH OF GOTOS WRITTEN LIKE JMP , JNZ, etc.

No, actually it doesn't look at all like BASIC. The fact that they both use a GOTO instruction has little bearing on overall similarity. And if I had to choose, I would program in direct Java bytecode over BASIC any day. BASIC is just too limiting.

oldtobor said:
But then again GOTOs are even more understandable than the DIRECT JUMPS or the JUMPS IF ACCUMULATOR IS ZERO, etc.

At least you know that the code is going someplace else. Get a clue and don't follow all the HYPE.

What?! Are you bashing conditional branch instructions now?! Are you not aware that conditional branches are probably the most necessary instructions in any ISA? :cry: Are you aware that your beloved BASIC interpreter/compiler has literally thousands of conditional branch instructions to make it operate?

I was willing to consider that you were just an old timer, stuck in his ways, but this just changes everything!

Good God, man!
 
  • #20
oldtobor said:
What, all those GOTOs ? How dare you, but then again ALL THE ASSEMBLER INSTRUCTIONS FOR ALL CPUS ARE JUST A BUNCH OF GOTOS WRITTEN LIKE JMP , JNZ, etc.

I'd just like to point out that your beloved BASIC has an IF statement, which is necessarily implemented in machine instructions with some kind of conditional branch instruction. There's no way to implement an IF statement with nothing but a GOTO, after all.
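
To make that concrete, here is a hedged sketch in C (used as portable pseudo-assembly; the variable and labels are invented) of how a compiler lowers an IF into a conditional branch:

Code:
#include <stdio.h>

/* Sketch: an IF is just a conditional branch. The goto here
   plays the role of the machine's JZ/JNZ instruction. */
int main(void)
{
    int x = 5;                    /* hypothetical input */
    if (!(x > 0)) goto skip;      /* branch taken when the test fails */
    printf("x is positive\n");    /* the body of the IF */
skip:
    return 0;
}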

BASIC must really suck too, then, eh? It's so new-fangled and hyped up! I hear every major programming problem was solved back in the 1960's when they developed the GOTO instruction on punched-card computers the size of houses.

- Warren
 
  • #21
Most all BASICs let you call and use ASM instructions. Someone told me :rolleyes: that it really doesn't matter; with enough skill you can go anywhere with an old copy of DOS and DEBUG, not that you need to reinvent the wheel.
 
  • #22
Microsoft wrote Xenix

SCO - the Santa Cruz Operation - sold Xenix (I'm not sure if they wrote it or acquired it), and they also tried to sue various companies for releasing versions of Linux. I went there (to SCO) back in 1989 to write some tape drivers for them. One of my concepts, using pointers to functions to eliminate the need to duplicate decisions in code, caught on at SCO. The concept is that a decision is made about what to do on the next step (an interrupt, in the case of drivers), and a pointer to function is set, rather than re-making (via if or switch / case) the same decision at the start of the next step. This method also lends itself to having small functions, one to handle each step of a process, each of which sets a pointer to function for the next step. Basically, never make the same decision twice.

Including pointers to functions in structures related to GUI menus / windows became a standard for some companies writing code for PC's and Mac's, also in the late 1980's. C++ incorporated this idea.
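
A minimal sketch of that driver pattern in C (the step names and the toy loop are invented for illustration; real driver code would be interrupt driven):

Code:
/* "Never make the same decision twice": each step does its work and
   then sets the handler for the next step, so nothing re-decides the
   state with an if/switch at the top of every pass. */
#include <stdio.h>

static void step_read(void);
static void step_write(void);
static void step_done(void);

static void (*next_step)(void) = step_read;  /* pointer to function */

static void step_read(void)  { printf("read\n");  next_step = step_write; }
static void step_write(void) { printf("write\n"); next_step = step_done;  }
static void step_done(void)  { printf("done\n");  next_step = NULL;       }

int main(void)
{
    while (next_step)   /* the toy stand-in for the interrupt loop */
        next_step();
    return 0;
}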

basic

Basic in its original form wasn't too hot, but has been extended to become practical. In the 1970's, companies like Basic 4, Pick Systems, Pertec Computer, ... used Basic combined with database operations to create generic mini-computer systems that were then programmed to handle small business needs, like inventory, accounts payable, accounts receivable, payroll, ... Microsoft continued this tradition with Access. It's also pretty easy to create GUI interfaces with Visual Basic, and there are engineers who use it to quickly generate GUI stuff with graphs and data.

things got better with C

Maybe compared to Basic, but Cobol and Fortran are much better, being high level languages with powerful native operatives. C is considered a mid-level language (between assembly and a high level language). Pascal was intended as a teaching tool, not as a practical programming language. Fortran and Cobol are very good for specific types of applications.

NASA and other scientific institutes still use Fortran, and there's a huge code base. Fortran is good for implementation of mathematical problems, especially since some current versions are enhanced to include vector oriented operatives for super computers.

Main frame type applications (data processing) are still based on Cobol, a combination of code base, and features in the language that other languages just don't have (try implementing "move corresponding" in another language).
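
To see the point, here is a hedged C approximation (struct and field names invented) of what MOVE CORRESPONDING does in a single Cobol statement: every matching field has to be copied by hand.

Code:
/* Cobol's MOVE CORRESPONDING copies all same-named fields at once;
   in C each matching field is a separate hand-written assignment. */
#include <stdio.h>
#include <string.h>

struct input_rec  { char name[20]; int qty; int price; char note[8]; };
struct output_rec { char name[20]; int qty; int price; int total;    };

static void move_corresponding(struct output_rec *out,
                               const struct input_rec *in)
{
    memcpy(out->name, in->name, sizeof out->name);  /* field by field */
    out->qty   = in->qty;
    out->price = in->price;    /* "total" has no match, so it's skipped */
}

int main(void)
{
    struct input_rec  in  = { "widget", 3, 250, "" };
    struct output_rec out = { "", 0, 0, 0 };
    move_corresponding(&out, &in);
    printf("%s x%d @ %d\n", out.name, out.qty, out.price);
    return 0;
}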

C++ is really only useful when "someone else" has generated a library of classes for a programmer to use. The typical mix for many applications is to use C++ for the user interface stuff, and standard C for the rest of the application.

Other languages:

RPG / RPG II - one of the few associative languages. Similar in concept to plug board programmed machines.

APL - A Programming Language, developed in the 1960's, was a decent interactive text based math tool, although the learning curve was steep (the operators were special symbols, some of them Greek letters).

PL/1 - A language with a mix of Cobol / Fortran-like concepts; it didn't last long (I have a book, though).

Paradox - Borland's database language.

Oracle - popular database programming language.

Java - GUI / website oriented language.

MatLab - good modern high level mathematical language.

Personally, I work on embedded multi-tasking firmware, mostly C with some assembly, and I've done device drivers for various systems. My Windows programming is restricted to home / hobby use.
 
  • #23
oldtobor said:
Now who on Earth is going to maintain and take care of it all? (written software)
Obsolete stuff gets tossed or archived. Since it takes so little space, I've archived zip files of old programs for CP/M, Atari 130XE (6502 cpu, like an Apple II but twice as fast at 2 MHz), Atari ST (8 MHz 68000, like a color Macintosh). I even have one small deck of punched cards stored in a container somewhere. I've kept a listing of a merge sort program I wrote back in 1973.

...there are a billion computers in the world today
Unless you include programmable calculators, cell phones, ..., there aren't a billion actual computers.

If you consider that the first BASIC programs that just opened a file and printed out all the lines containing a string with ... 90% of all real software problems were solved in 1981 on the IBM PC
Programs to do the equivalent predate the PC by 20 years. Other "software" problems were solved back in the 1920's.

For example, sorting in 1925 (and earlier):

http://en.wikipedia.org/wiki/IBM_80_series_Card_Sorters

Basic accounting programming machine, plug board programmed - 1934:

http://www-03.ibm.com/ibm/history/exhibits/vintage/vintage_4506VV4006.html

All of this led up to "modern" plug board programming (I'm 55 years old, and I remember seeing these machines in use as late as the mid 1970's).

http://en.wikipedia.org/wiki/Plug-board

Now this was truly programming (plug-board style):

http://www.columbia.edu/acis/history/plugboard.html

Some software problems, like sorting quickly, were figured out long ago when tape drives were used to do merge / radix sorts (all sequential operations using tape drives, typically with 4 drives; 3 could be used, but it doubled the number of passes).
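
The heart of those tape sorts is the sequential two-run merge. A hedged toy version in C (arrays stand in for the tapes; names invented):

Code:
/* Sketch: the core of a tape merge sort. Two sorted input "tapes" are
   read strictly sequentially and merged onto an output "tape". */
#include <stdio.h>

static void merge_runs(const int *a, int na, const int *b, int nb, int *out)
{
    int i = 0, j = 0, k = 0;
    while (i < na && j < nb)               /* sequential reads only */
        out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
    while (i < na) out[k++] = a[i++];      /* drain tape A */
    while (j < nb) out[k++] = b[j++];      /* drain tape B */
}

int main(void)
{
    int a[] = {1, 4, 7}, b[] = {2, 3, 9}, out[6];
    merge_runs(a, 3, b, 3, out);
    for (int k = 0; k < 6; k++) printf("%d ", out[k]);
    printf("\n");
    return 0;
}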

On the other extreme, ECC (Error Correction Code) renewed interest in finite field math (specifically binary based), and some of the algorithms used today weren't developed until about 20 years ago, which is relatively new considering most of mathematics is much older.

Divide algorithms for DSPs that have a fast multiply but no divide instruction are relatively new, like the Newton-Raphson algorithm.
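
A hedged sketch of the idea in C (double precision for readability, invented function name, and it assumes d > 0; a real DSP version would be fixed point with a table-based seed):

Code:
/* Newton-Raphson division: compute 1/d with only multiplies and
   subtracts, then multiply by the numerator. */
#include <math.h>
#include <stdio.h>

static double nr_divide(double n, double d)
{
    int e;
    double m = frexp(d, &e);        /* d = m * 2^e, with 0.5 <= m < 1 */
    double x = 2.9142 - 2.0 * m;    /* linear seed, good to a few percent */
    for (int i = 0; i < 5; i++)
        x = x * (2.0 - m * x);      /* each step roughly squares the error */
    return n * ldexp(x, -e);        /* n * (1/m) * 2^-e  ==  n / d */
}

int main(void)
{
    printf("355/113 = %.12f\n", nr_divide(355.0, 113.0));
    return 0;
}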

Fast extended precision math involves some relatively new algorithms, FFT, binary splitting, ...

http://numbers.computation.free.fr/Constants/PiProgram/pifast.html
 
  • #24
SCO did not author XENIX, as SCO was only a reseller of XENIX initially. Microsoft acquired a license and re-distribution rights of UNIX version 7 from AT&T (by that time AT&T had decided UNIX was actually worth something, and thus, took licensing, and especially redistribution seriously). Microsoft extended the UNIX version 7 code they had acquired by integrating several BSD bits, and even some of their own unique features, like virtual terminals, which most users of x86 UNIX/UNIX-like operating systems take for granted, nowadays. Eventually, SCO purchased XENIX from Microsoft and created the OpenServer product (not OpenUNIX, which is SVR4-based).
 
  • #25
As much as I hate to admit it, I program in Basic. I learned to program in Basic and Fortran on a mainframe back in 1975. Then in 1980 I got an Apple II+ computer. I used that computer with Applesoft Basic to get a degree in math, taking mainly numerical analysis and mathematical modeling classes.

Basic is a perfectly good language. It was given a second life when MS decided to make it the Office macro language. Visual Basic has evolved into a pretty sophisticated and useful language. However it is, like every other programming language, a tool which has strengths and weaknesses.

I have been formally exposed to C++, but have not used it; there is a learning curve. But I recognize my failure to breach that curve as my problem, not that of the language.

Oldtobor, you would do well to listen more and talk less.
 
  • #26
graphic7 said:
SCO did not author XENIX.
I already corrected my previous post. However, I have the impression that it was SCO that handled the transition from the 286 (16 bit extended environment) to the 386 (32 bit mapped environment), since the 386 PC's came out about the same time that SCO was heavily into XENIX.

Exposed to C++
It's probably easier to learn standard C first, then C++, although the programs you use for learning will be simple text based stuff.

4th generation languages
My own addition here. You never hear this term anymore, but the concept was programs to generate programs, which evolved into programs that generate source code. Although you don't hear "4th generation language" anymore, the concept did develop. Think C for the Macintosh was one of the first ones I remember (late 1980's). You drew the GUI interface: menus, dialog boxes, windows, ... using paste and edit graphical tools, and Think C would generate the supporting code, with sections bounded by special comments where you could add your own specific code. This trend has continued through to the Visual programming languages available today. This concept only works well when dealing with relatively "popular" objects, like GUI interfaces for Windows, or database handling. There aren't a lot of "popular" objects.

I wonder if there's a tool that generates source code for dealing with lists of files, something that would go through every file or file/directory on one or more volumes (with optional wild card matching), and call a user supplied routine to work with each file, or group of files. Something that would make creating the equivalent of windiff.exe a simple exercise.
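
For what it's worth, a hedged C sketch of the core of such a tool, assuming the POSIX dirent API (wildcard matching and error handling omitted; the user routine comes in as a pointer to function):

Code:
/* Sketch: walk a directory tree and hand every file to a user-supplied
   routine via a pointer to function. */
#include <dirent.h>
#include <stdio.h>
#include <string.h>
#include <sys/stat.h>

typedef void (*file_fn)(const char *path);

static void walk(const char *dir, file_fn fn)
{
    DIR *d = opendir(dir);
    if (!d) return;
    struct dirent *e;
    while ((e = readdir(d)) != NULL) {
        if (!strcmp(e->d_name, ".") || !strcmp(e->d_name, "..")) continue;
        char path[4096];
        snprintf(path, sizeof path, "%s/%s", dir, e->d_name);
        struct stat st;
        if (stat(path, &st) == 0 && S_ISDIR(st.st_mode))
            walk(path, fn);          /* recurse into subdirectory */
        else
            fn(path);                /* user routine gets each file */
    }
    closedir(d);
}

static void print_name(const char *path) { printf("%s\n", path); }

int main(void) { walk(".", print_name); return 0; }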
 
  • #27
Jeff Reid said:
I wonder if there's a tool that generates source code for dealing with lists of files, something that would go through every file or file/directory on one or more volumes (with optional wild card matching), and call a user supplied routine to work with each file, or group of files. Something that would make creating the equivalent of windiff.exe a simple exercise.

try PERL.

for example on DOS, if you wanted to do something for every file in a directory and in its subdirectories you could do:

Code:
dir/b/s *.pl|perl -ane"s/\n//;open _;print$_,@t,\"\n\n\"if@t=grep/if/,<_>"

This finds all the occurrences of "if" in all files ending with .pl in the directory. This (and the language) naturally also works for Unix.

Generally you can do:

Code:
for(qx/dir\/b\/s/)
{
    s/\n//;
    open _;
    # ... work with <_> here ...
}

Now you can do anything with <_>, which is an array that contains a complete file. Now try to do that with Java or C++ and see how long it takes you. The SNOBS like Java and C and C++ but they are totally meaningless, overbloated, overcomplicated piles of HYPE. After all what is all the fuss about ? I mean assembler language is just a bunch of gotos and the most complex thing you can get is INDIRECT ADDRESSING MODE. That means that the memory location of a byte is the contents of another memory location. Example:

Code:
memory  byte

0040    43
0041    65
  .
  .
  .
4365    77

So with indirect addressing, if you want to get 77 you just say for example:

load in accumulator indirect, 0040

and the accumulator will have 77. END OF STORY. These languages like C++ and C have a bunch of funny symbols for this simple concept and have confused untold programmers for years. What a bunch of crap C and C++ is.
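
For comparison, here is the same indirect load spelled with C's "funny symbols" (a hedged toy: an array stands in for raw memory so it actually runs, and the two pointer bytes are taken as big-endian):

Code:
/* Indirect addressing in C: mem[0x40..0x41] holds the address 0x4365,
   and mem[0x4365] holds the value 0x77. */
#include <stdio.h>

static unsigned char mem[0x10000];

int main(void)
{
    mem[0x0040] = 0x43;              /* high byte of the pointer */
    mem[0x0041] = 0x65;              /* low byte of the pointer  */
    mem[0x4365] = 0x77;              /* the value we want        */

    unsigned addr = (mem[0x0040] << 8) | mem[0x0041];  /* the indirect step */
    unsigned char acc = mem[addr];                     /* "load indirect"   */

    printf("accumulator = %02X\n", acc);               /* prints 77 */
    return 0;
}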
 
  • #28
Jeff Reid said:
Basic in its original form wasn't too hot, but has been extended to become practical. In the 1970's, companies like Basic 4, Pick Systems, Pertec Computer, ... used Basic combined with database operations to create generic mini-computer systems that were then programmed to handle small business needs, like inventory, accounts payable, accounts receivable, payroll, ... Microsoft continued this tradition with Access. It's also pretty easy to create GUI interfaces with Visual Basic, and there are engineers who use it to quickly generate GUI stuff with graphs and data.

And that is the point. If BASIC with those puny computers was capable of creating these popular business programs back then, then today at least this class of programs should be extremely simple to create and lightning fast to execute. Instead we got them written in bloated languages like Java that take a long time to program and run slow on machines that today are equivalent to at least 100 times those.

How much more powerful is a typical modern PC compared to a 1981 IBM PC ?

100 times ? Is it equivalent to a hundred IBM PCs of 1981 ?
 
  • #29
oldtobor said:
try PERL.
There are guys at work that use PERL. It's a language I should learn.

Java and C and C++ but they are totally meaningless, overbloated, overcomplicated piles of HYPE.
The languages aren't ultra complicated, but the tools like Visual Studio, along with creating projects, make it a bit more complicated. Still, large GUI projects are going to be complicated anyway, and in those cases Visual C++ makes sense.

I mean assembler language is just a bunch of gotos and the most complex thing you can get is indirect addressing mode. Languages like C++ and C have a bunch of funny symbols for this simple concept.
The syntax for indexing [] is fine. Using * and & which already are used as math operators may have been a bad choice. Not being a language designer, I don't have a better suggestion.

Pointers are useful, but the C syntax is a bit confusing, especially with the modifier being on the right for *, but on the left for &.

In the case of C / C++, the precedence for binary math operations is wrong. & should be the same as * (binary "and" same precedence as multiply), while | (inclusive or) and ^ (exclusive or) should have the same precedence as + or -. Instead these operators have lower precedence than the logical and compare operators, && || < > <= >= !=, which doesn't make sense and requires unnecessary parentheses. It would never make sense to perform binary math operations on logical values, which are just defined as zero and not zero. Speaking of which, logical values should have been more strictly defined, with TRUE and FALSE being reserved symbol names.
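
Concretely, the parenthesis problem Jeff means (FLAG is an invented status bit): since == binds tighter than &, the unparenthesized test parses as status & (FLAG == FLAG).

Code:
/* The precedence trap: status & FLAG == FLAG means status & 1,
   which is almost never what was intended. */
#include <stdio.h>

#define FLAG 0x04

int main(void)
{
    unsigned status = 0x04;
    if ((status & FLAG) == FLAG)    /* parentheses are mandatory here */
        printf("flag set\n");
    if (status & FLAG)              /* the common idiom avoids == entirely */
        printf("flag set, idiomatically\n");
    return 0;
}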

I mean assembler language is just a bunch of gotos and the most complex thing you can get is indirect addressing mode.
You also get pre and post increment / decrement, and scaling (included in C, seems they modeled it to optimize well with some CPU types, like the 68000.)

Mainframes have some pretty complicated instructions. For example, the EDIT AND MARK instruction on the IBM 360 and its siblings. This copies / expands data from a nibble/BCD (binary coded decimal) oriented field into a byte oriented EBCDIC field, prefilled with how to do the expansion, such as where to put a decimal point, optional commas, and optional placement of a $ sign; and note, this is a single instruction. There are also built-in extended math instructions for those nibble oriented BCD fields.

Vector processing super computers (CDC's 7600, Cray, and later machines) include instructions to perform math on two arrays of numbers (floating point or integer) and store the results in a third array, or to do yet another math operation to combine the results (multiply then add for a column / row multiply step on a matrix).

Even the Intel CPU has some decent instructions. Instead of the RISC process sequence (load register with immediate, load register from location, add register to register, store register into location), the Intel cpu includes an add immediate to location instruction. XLAT will do a 256 byte table lookup in a single instruction. There are a huge number of floating point operations.
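
For instance, a hedged C rendering of what XLAT does in one instruction (the table contents, an uppercase map, are invented for the demo):

Code:
/* XLAT in C: a 256-entry byte table lookup, indexed by AL. */
#include <stdio.h>

int main(void)
{
    unsigned char table[256];
    for (int i = 0; i < 256; i++)   /* build a toy "uppercase" table */
        table[i] = (i >= 'a' && i <= 'z') ? (unsigned char)(i - 32)
                                          : (unsigned char)i;

    unsigned char al = 'q';
    al = table[al];                 /* the entire XLAT instruction, in C */
    printf("%c\n", al);             /* prints Q */
    return 0;
}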

Mainframes from the 1960's (CDC for example), modern super computers, and some modern microprocessors include multiple arithmetic units, and register scoreboarding, where instructions are allowed to overlap, but will pend if a result is needed as an operand in a later math operation.
 
  • #30
oldtobor said:
If BASIC with those puny computers was capable of creating these popular business programs back then, then today at least this class of programs should be extremely simple to create and lightning fast to execute. Instead we got them written in bloated languages like Java that take a long time to program and run slow on machines that today are equivalent to at least 100 times those.
First of all, those Basic programs took a long time to develop, and they were never pretty. These were early versions of Basic with some added database operators. Hopefully Microsoft Access is a lot better.

However, my guess is that someone just ported these old mini-computer environments to run on PC's with minimal amount of source code changes, so there are still a few business applications written in ugly basic. Go to your local car dealership / local motorcycle shop. Some of them are still using text based (as opposed to Windows based) applications.

There are some pre-made tools these days. QuickBooks can do a lot of accounting stuff already. I'm not sure if there are generic programs for inventory tracking though (dealing with supplier sources, inventory in multiple warehouses and stores, updated via point of sale operations, not to mention returns, which go the reverse path).

I'm not sure how much of today's banking industry software is still based on Cobol, but it's probably significant.
 
  • #31
OK, I may not know a lot of things and may have a lot of things wrong. But I am sure that PERL at least was definitely the route that languages should have taken.

For example to extract the next to the last field of an ASCII file separated by "|" and sort them, it can be done with one line:

Code:
C:\> perl -ane"split/\|/; $l=@_[@_-2];push@r,$l.\"\n\"if$l;END{print sort@r}" bands.txt

works on any unix too.

I find it amazing that back in the mid 1990s, just when Java started to become popular, this direction of language design did not take off, perhaps with greatly improved concepts, compilers etc. The syntax could be cleaner, very BASIC-like at least; there are so many improvements conceivable, but the ideas are great:

split - it is implied that the line is split and the result is in an array called @_.

@_[@_-2] gets the next to the last field;

@_ is the total array;

at the end of the scan (like AWK) just print the sorted array.
 
  • #32
Jeff Reid said:
In the case of C / C++, the precedence for binary math operations is wrong. & should be the same as * (binary "and" same precedence as multiply), while | (inclusive or) and ^ (exclusive or) should have the same precedence as + or -. Instead these operators have lower precedence than the logical and compare operators, && || < > <= >= !=, which doesn't make sense and requires unnecessary parentheses. It would never make sense to perform binary math operations on logical values, which are just defined as zero and not zero.

You are incorrect here. The bitwise and/or/xor are higher precedence than the logical and/or. They are, however, lower precedence than relational operators, with good reason: they serve double duty as non-short-circuit logical operators. For example, f() != 10 && g() == 8 will not execute g() if f() returns 10, but f() != 10 & g() == 8 will execute g() regardless of the result of f(), with the same overall logical result.
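
A runnable illustration of that double duty (f and g are invented stand-ins that print when called):

Code:
/* && short-circuits; & evaluates both sides, with the same logical result. */
#include <stdio.h>

static int f(void) { puts("f called"); return 10; }
static int g(void) { puts("g called"); return 8;  }

int main(void)
{
    puts("-- && (short-circuit) --");
    if (f() != 10 && g() == 8) puts("both true");   /* g() is skipped */

    puts("-- & (both sides evaluated) --");
    if (f() != 10 & g() == 8) puts("both true");    /* g() still runs */
    return 0;
}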
 
  • #33
Jeff Reid said:
Speaking of which, logical values should have been more strictly defined, with TRUE and FALSE being reserved symbol names.

This is a bad idea that violates the very essence of C. Making true/false values their own type with reserved symbols is a contrivance that C intentionally avoids. The "0 is false, everything else is true" is leveraged extensively by good C programmers.

Here are some examples to illustrate the design choice:
Code:
/* status register bits */
enum {
    STATUS_READY=1,
    STATUS_PENDING=2,
    STATUS_ERROR=4,
};

...

/* check for errors */
if (readStatusReg() & STATUS_ERROR) {
   /* error handling */
}

...

/* see if the device is ready or pending */
if (readStatusReg() & (STATUS_READY | STATUS_PENDING)) {
   /* take appropriate action that applies to both states */
} else {
    /* perform some idle action */
}

...

int (*handler)(result_t *) = getHandler();

/* execute the handler if we have one, and pass the results off */
result_t result;
if (handler && handler(&result)) {
    /* we had a handler and it returned success, do something with the result */
}

You see, truth values other than 1 are useful. C is a very well designed language, with many very intentional features; most of them for efficiency of expression and execution.

- Old MacDonald
 
  • #34
eieio said:
C is a very well designed language, with many very intentional features; most of them for efficiency of expression and execution.
Let me say that I am a great fan of C; I use it very often, mainly due to its efficiency and small memory footprint! But I find it far from well designed; on the contrary, I find it rather poorly designed.

And that dual usage of the term "static" is close to idiocy. :biggrin:

A well designed language is Pascal or Java and more recently Ruby.
 
  • #35
MeJennifer said:
And that dual usage of the term "static" is close to idiocy. :biggrin:

Which dual usage do you speak of?
 
