
How many instructions are there?

  1. May 14, 2007 #1
    I was trying to figure out how many instructions there are in the world. Now COBOL is said to have 100 billion lines of code, and I guess all the other languages combined could reach maybe another 100 billion lines. All the other languages include Fortran, C, C++, Java, BASIC, and all the microcontroller programs written in assembler.

    Now that is without repetition; I mean that is counting just one copy of each distinct program. So maybe 200 billion lines of code, which translated into assembler may reach A TRILLION LINES OF ASSEMBLY LANGUAGE IN ALL!

    Now if you consider repetition you can reach 10 to the 18 lines of assembler code in the whole world. Wow, that is a lot of code floating around.

    Now who on earth is going to maintain and take care of it all?

    I like big numbers, so sometimes I try to calculate how many equivalent IBM PCs of computing power are currently installed in the world. If you consider that the 1981 model could do 300 thousand instructions per second, today you could maybe estimate at least 100 times that for each PC. So with a billion computers in the world today you get 100 billion equivalent IBM PCs of computing power floating around. WOW! That is a lot!

    If you consider that the first BASIC programs that just opened a file and printed out all the lines containing a string with the famous INSTR($Target,$Pattern) instruction could already handle a few thousand records in 1981, THAT IS TO SAY THAT 90% OF ALL REAL SOFTWARE PROBLEMS WERE ALREADY SOLVED IN 1981 ON THE SIMPLE IBM PC WITH THAT GREAT LANGUAGE CALLED BASIC, you can see how much excess capacity is just hanging around. Basically all software problems had already been solved by the early 1980s.

    Today the same problem is solved in DOS by executing this simple one-line program in Perl:


    c:\>perl -ane"print if/put your pattern in here/" inputfile

    You can naturally also run it from a Unix prompt. So most software has been reduced to one line...
     
    Last edited: May 14, 2007
  3. May 14, 2007 #2

    chroot (Staff Emeritus, Science Advisor, Gold Member)

    :uhh: Most software problems were solved in BASIC in 1981... Right. Because 90% of all software problems are just pattern matching in text. Riiiight. Who are you trying to kid, oldtobor?

    - Warren
     
  4. May 15, 2007 #3
    Hmm, I'd have to disagree with you, Oldtobor. New problems are emerging in computing every day, and BASIC is probably one of the worst languages ever devised; things started getting good when C was developed.

    What does it matter how much code is out there? None of it will be maintained forever; it will all be discarded and replaced by new, more capable code. People work that way too, you know?

    You remind me of those people that say modern medicine hasn't done anything positive for humanity. But here we are, living on average twice as long as we used to.

    Good grief.

    - Old MacDonald
     
    Last edited: May 15, 2007
  5. May 16, 2007 #4
    Actually it is the exact opposite: Basic was (and still is, there are a lot of versions available on the internet to download) a very good language; that is why people used it to program PCs during the 1980s and into the early 1990s.

    It's just that some programmers were disorganized and created sloppy code. Just because one guy said Basic was bad, everyone started saying the same thing. What clueless people!

    And actually Turbo Pascal was even better, AND IT WAS WHEN C BECAME POPULAR THAT PROGRAMMING STARTED TO GET HARDER. C SUCKS, IT IS USELESSLY HARD! The same programs I could do in an hour in Pascal took many more hours in C because of the crappy pointers, memory management, etc. What crap, and C++ is even worse! Java and object oriented programming are a huge PILE OF HYPE! That is why there are still 100 billion lines of COBOL: good programs are created with easy languages.


    Now back to the main topic:

    Since there are a trillion instructions in the world, how are you going to show Ballmer that you didn't copy some of them to create Linux? So he wins and Linux becomes just another Microsoft product.

    Aside from the fact that Ballmer probably wrote Linux too back in 1990, and Torvalds (the Finnish communist) just stole his code.

    Now let's bring this up to another level. How many transistors are there in the world? A CPU can have a million transistors, and with a billion computers that gets you 10 to the 15 transistors, but then you must consider that there are a trillion electronic machines in the world; look at your washing machine, refrigerator, car, watch, etc.

    So then maybe 10 to the 18 transistors. WOW! The same number as assembler instructions? No, something must be wrong.

    THIS IS ALL CALLED EXCESS CAPACITY.
     
  6. May 16, 2007 #5

    chroot (Staff Emeritus, Science Advisor, Gold Member)

    Oh, I get it. You're not trying to make a valid point about anything -- you're just insane.

    Let's examine your insanity, bit by bit.

    People used BASIC in the 1980's because it is the best language that has ever existed (having nothing to do with the fact that better languages did not yet exist).

    Object oriented programming is a "pile of hype" because of crappy pointers (even though many object oriented languages don't have pointers).

    There are more lines of Cobol than any other language because Cobol is better than all other languages (even though Cobol was just one of the first languages in existence).

    Ballmer wrote Linux in 1990, and Torvalds stole it (even though Ballmer isn't a programmer, Torvalds wrote the original Linux kernel from scratch, and Microsoft has never produced a Unix-like operating system).

    Because there are almost as many transistors as instructions in the world, we have loads of excess capacity (even though the number of transistors and the number of instructions have no causal relationship to one another).

    Gee, oldtobor... it almost seems like all you ever do on PF is post incoherent rants about how things were better so long ago than they are now. Oddly, you're using a modern web browser running on a graphical operating system, connected nearly instantaneously to thousands of people all around the globe to do it.

    - Warren
     
  7. May 16, 2007 #6
    Microsoft wrote XENIX.

    Now let's bring this up to another level. How many instructions have been executed since 1945? A billion computers running a million instructions per second would make 10 to the 15 per second. Multiply by 10 to the 7 seconds and you get 10 to the 22 instructions executed in the world. WOW! That is a lot of instructions:

    THAT IS CALLED EXCESS CAPACITY.
     
    Last edited: May 16, 2007
  8. May 16, 2007 #7

    -Job- (Science Advisor)

    FYI, a processor always has to be running an instruction, all the time. When there's nothing to run, it runs an empty instruction.

    What would be a waste is to have it run empty instructions instead of running something useful.
     
  9. May 16, 2007 #8

    chroot (Staff Emeritus, Science Advisor, Gold Member)

    No, it didn't. Microsoft bought it from AT&T. Your facts are always wrong.

    - Warren
     
  10. May 16, 2007 #9

    D H (Staff Emeritus, Science Advisor)

    How many instructions are there? Just one. The proof:
    Code (Text):
    #include </dev/tty>
    Invoke the compiler, carefully prepare the user inputs, and, voilà, a program that plays chess. Do it again and, presto chango, an accounting program.
     
  11. May 16, 2007 #10

    ranger (Gold Member)

    You take so long to program in C because you cannot program in C properly. Take some time and become proficient in the language.
    A Basic programmer claiming that C sucks...

    Your thread is useless. And you claim that Linus is a communist... I pity your ignorance.
    Can't you write exponents properly?
     
  12. May 16, 2007 #11

    Mech_Engineer (Science Advisor, Gold Member)

    This thread cracks me up :tongue2:
     
  13. May 16, 2007 #12

    -Job- (Science Advisor)

    Why is Java a lot of hype again?
     
  14. May 16, 2007 #13
    Java is aggrandized as some major revolution in computing by a lot of people and organizations. It's not a bad technology, but it's not some supreme answer for everything under the sun.

    The fact that it's easier for new programmers is great, but I think they tend to view it with cult reverence, which is silly. I used to be in that camp at one time, even. But then I grew up and realized that there are better tools. Plus, there is nothing better than C/C++ for low- to mid-level programming.
     
    Last edited: May 16, 2007
  15. May 16, 2007 #14

    chroot (Staff Emeritus, Science Advisor, Gold Member)

    In other words, he was a Java cheerleader until I introduced him to Python. Now he's a Python cheerleader. :tongue:

    - Warren
     
  16. May 16, 2007 #15

    russ_watters (Staff: Mentor)

    Anyone ever play Yeager 2.0 on an old IBM PC? That remains the best flight simulator of all time...
     
  17. May 16, 2007 #16
    Touché Warren, even if it is a bit of an anachronism.
     
    Last edited: May 16, 2007
  18. May 17, 2007 #17
    Hey SNOBS, take a look at this Java bytecode:

    13: if_icmpge 31
    16: iload_1
    17: iload_2
    18: irem # remainder
    19: ifne 25
    22: goto 38
    25: iinc 2, 1
    28: goto 11
    31: getstatic #84; //Field java/lang/System.out:Ljava/io/PrintStream;
    34: iload_1
    35: invokevirtual #85; //Method java/io/PrintStream.println:(I)V
    38: iinc 1, 1
    41: goto 2

    Looks a lot like BASIC, doesn't it? What, all those GOTOs? How dare you! But then again ALL THE ASSEMBLER INSTRUCTIONS FOR ALL CPUS ARE JUST A BUNCH OF GOTOS WRITTEN AS JMP, JNZ, etc.

    If anything, GOTOs are even more understandable than the DIRECT JUMPS or the JUMPS IF ACCUMULATOR IS ZERO, etc.

    At least you know that the code is going someplace else. Get a clue and don't follow all the HYPE.
     
  19. May 17, 2007 #18
    Hahaha... Ballmer is a business manager; I doubt he has written a line of code in his life. You are totally mad.
    Crackpot
     
    Last edited: May 17, 2007
  20. May 17, 2007 #19
    No, actually it doesn't look at all like BASIC. The fact that they both use a GOTO instruction has little bearing on overall similarity. And if I had to choose, I would program in direct Java bytecode over BASIC any day. BASIC is just too limiting.

    What?! Are you bashing conditional branch instructions now?! Are you not aware that conditional branches are probably the most necessary instructions in any ISA? :cry: Are you aware that your beloved BASIC interpreter/compiler has literally thousands of conditional branch instructions to make it operate?

    I was willing to consider that you were just an old timer, stuck in his ways, but this just changes everything!

    Good God, man!
     
  21. May 17, 2007 #20

    chroot (Staff Emeritus, Science Advisor, Gold Member)

    I'd just like to point out that your beloved BASIC has an IF statement, which is necessarily implemented in machine instructions with some kind of conditional branch instruction. There's no way to implement an IF statement with nothing but a GOTO, after all.
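
    For illustration, here's my own sketch in C (not actual compiler output; the machine instructions in the comments are paraphrased) of how even a trivial IF ends up as a compare plus a conditional branch:

    Code (Text):
    /* Sketch: a trivial high-level IF and, in the comments, roughly the
       compare-and-conditional-branch sequence a compiler turns it into. */
    #include <stdio.h>

    int main(void)
    {
        int x = 5;

        if (x == 0) {          /*   cmp  x, 0                         */
            puts("zero");      /*   jne  after   ; conditional branch */
        }                      /*   call puts("zero")                 */
        puts("done");          /* after: call puts("done")            */
        return 0;
    }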

    BASIC must really suck too, then, eh? It's so new-fangled and hyped up! I hear every major programming problem was solved back in the 1960's when they developed the GOTO instruction on punched-card computers the size of houses.

    - Warren
     
    Last edited: May 17, 2007
  22. May 17, 2007 #21
    Most all Basics let you call and use asm instructions. Someone told me :rolleyes: that it really doesn't matter; with enough skill you can go anywhere with an old copy of DOS and DEBUG, not that you need to reinvent the wheel.
     
  23. May 18, 2007 #22

    rcgldr (Homework Helper)

    SCO (Santa Cruz Operation) sold Xenix (I'm not sure if they wrote it or acquired it), and they also tried to sue various companies for releasing versions of Linux. I went there (to SCO) back in 1989 to write some tape drivers for them. One of my concepts, using pointers to functions to eliminate the need to duplicate decisions in code, caught on at SCO. The concept is that a decision about what to do on the next step (an interrupt, in the case of drivers) is made once and a pointer to function is set, rather than re-making the same decision (if or switch/case) at the start of the next step. This method also lends itself to small functions, one to handle each step of a process, each of which sets the pointer to function for the next step. Basically, never make the same decision twice.
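
    A minimal sketch of the idea in C (the step names and the loop standing in for interrupts are invented here purely for illustration; this isn't the SCO code):

    Code (Text):
    /* "Decide once, set a pointer to function": each step does its work and
       sets next_step, so the next interrupt never re-makes the same decision. */
    #include <stdio.h>

    static void step_start(void);
    static void step_transfer(void);
    static void step_done(void);

    static void (*next_step)(void) = step_start;   /* what to do on the next step */

    static void step_start(void)
    {
        printf("start: issue the command\n");
        next_step = step_transfer;     /* decision made here, not re-tested later */
    }

    static void step_transfer(void)
    {
        printf("transfer: move the data\n");
        next_step = step_done;
    }

    static void step_done(void)
    {
        printf("done\n");
        next_step = NULL;              /* nothing left to do */
    }

    int main(void)
    {
        /* each pass of this loop stands in for the next interrupt */
        while (next_step != NULL)
            next_step();
        return 0;
    }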

    Including pointers to functions in structures related to GUI menus / windows became a standard for some companies writing code for PCs and Macs, also in the late 1980's. C++ incorporated this idea.

    Basic in its original form wasn't too hot, but it has been extended to become practical. In the 1970's, companies like Basic 4, Pick Systems, Pertec Computer, ... used Basic combined with database operations to create generic mini-computer systems that were then programmed to handle small business needs, like inventory, accounts payable, accounts receivable, payroll, ... Microsoft continued this tradition with Access. It's also pretty easy to create GUI interfaces with Visual Basic, and there are engineers who use it to quickly generate GUI stuff with graphs and data.

    Maybe compared to Basic, but Cobol and Fortran are much better, being high level languages with powerful native operations. C is considered a mid-level language (between assembly and a high level language). Pascal was intended as a teaching tool, not as a practical programming language. Fortran and Cobol are very good for specific types of applications.

    NASA and other scientific institutes still use Fortran, and there's a huge code base. Fortran is good for implementation of mathematical problems, especially since some current versions are enhanced to include vector oriented operations for supercomputers.

    Mainframe type applications (data processing) are still based on Cobol, a combination of existing code base and features in the language that other languages just don't have (try implementing "move corresponding" in another language; see the sketch below).
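
    For anyone who hasn't run into it: MOVE CORRESPONDING copies every field whose name matches between two group items in a single statement. A rough C equivalent (record layouts and field names invented just for illustration) has to spell out each copy by hand:

    Code (Text):
    /* Hand-written equivalent of COBOL's MOVE CORRESPONDING for one pair of
       invented record layouts; C has no way to match fields by name. */
    #include <stdio.h>
    #include <string.h>

    struct order_in  { long cust_id; char name[31]; double amount; char batch[9]; };
    struct order_out { long cust_id; char name[31]; double amount; };

    static void move_corresponding(const struct order_in *src, struct order_out *dst)
    {
        dst->cust_id = src->cust_id;
        memcpy(dst->name, src->name, sizeof dst->name);
        dst->amount  = src->amount;
        /* batch has no matching field in order_out, so it simply isn't moved */
    }

    int main(void)
    {
        struct order_in  in  = { 42, "ACME", 19.95, "B0001" };
        struct order_out out;
        move_corresponding(&in, &out);
        printf("%ld %s %.2f\n", out.cust_id, out.name, out.amount);
        return 0;
    }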

    C++ is really only useful when "someone else" has generated a library of classes for a programmer to use. The typical mix for many applications is to use C++ for the user interface stuff, and standard C for the rest of the application.

    Other languages:

    RPG / RPG II - one of the few associative languages. Similar in concept to plug board programmed machines.

    APL - A Programming Language, developed in the 1960's, was a decent interactive text based math tool, although the learning curve was steep (the operators were Greek symbols).

    PL/1 - a language with a mix of Cobol / Fortran like concepts; it didn't last long (I have a book, though).

    Paradox - Borland's database language.

    Oracle - popular database programming language.

    Java - GUI / web site oriented language.

    MatLab - good modern high level mathematical language.

    Personally, I work on embedded multi-tasking firmware, mostly C with some assembly, and I've done device drivers for various systems. My Windows programming is restricted to home / hobby use.
     
    Last edited: May 18, 2007
  24. May 18, 2007 #23

    rcgldr (Homework Helper)

    Obsolete stuff gets tossed or archived. Since it takes so little space, I've archived zip files of old programs for CP/M, the Atari 130XE (6502 CPU, like an Apple II but twice as fast at 2 MHz), and the Atari ST (8 MHz 68000, like a color Macintosh). I even have one small deck of punched cards stored in a container somewhere. I've kept a listing of a merge sort program I wrote back in 1973.

    Unless you include programmable calculators, cell phones, ..., there aren't a billion actual computers.

    Programs to do the equivalent predate the PC by 20 years. Other "software" problems were solved back in the 1920's.

    For example, sorting in 1925 (and earlier):

    http://en.wikipedia.org/wiki/IBM_80_series_Card_Sorters

    Basic accounting machine, plug board programmed - 1934:

    http://www-03.ibm.com/ibm/history/exhibits/vintage/vintage_4506VV4006.html

    All of this led up to "modern" plug board programming (I'm 55 years old, and I remember seeing these machines in use as late as the mid 1970's).

    http://en.wikipedia.org/wiki/Plug-board

    Now this was truly programming (plug-board style):

    http://www.columbia.edu/acis/history/plugboard.html

    Some software problems, like sorting quickly, were figured out long ago when tape drives were used to do merge / radix sorts (all sequential operations, typically using 4 tape drives; 3 could be used, but it doubled the number of passes).

    On the other extreme, ECC (Error Correction Code) renewed interest in finite field math (specifically binary based), and some of the algorithms used today weren't developed until about 20 years ago, which is relatively new considering most of mathematics is much older.

    Divide algorithms for DSPs that have a fast multiply but no divide instruction are relatively new, like the Newton-Raphson algorithm sketched below.
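
    A sketch of the idea in C (floating point and the sample values are just for illustration; a real DSP version would use fixed point and a better initial guess):

    Code (Text):
    /* Newton-Raphson divide for a machine with a fast multiply but no divide:
       approximate 1/d with x = x*(2 - d*x), then multiply by the numerator. */
    #include <stdio.h>

    static double recip(double d)      /* assumes d has been scaled into (0, 2) */
    {
        double x = 1.0;                /* crude initial guess */
        int i;
        for (i = 0; i < 6; i++)
            x = x * (2.0 - d * x);     /* converges quadratically toward 1/d */
        return x;
    }

    int main(void)
    {
        double n = 10.0, d = 1.25;
        printf("%.6f\n", n * recip(d));    /* prints 8.000000 with no divide */
        return 0;
    }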

    Fast extended precision math involves some relatively new algorithms, FFT, binary splitting, ...

    http://numbers.computation.free.fr/Constants/PiProgram/pifast.html
     
    Last edited: May 18, 2007
  25. May 18, 2007 #24

    graphic7 (Gold Member)

    SCO did not author XENIX, as SCO was only a reseller of XENIX initially. Microsoft acquired a license and redistribution rights to UNIX version 7 from AT&T (by that time AT&T had decided UNIX was actually worth something, and thus took licensing, and especially redistribution, seriously). Microsoft extended the UNIX version 7 code they had acquired by integrating several BSD bits, and even some of their own unique features, like virtual terminals, which most users of x86 UNIX/UNIX-like operating systems take for granted nowadays. Eventually, SCO purchased XENIX from Microsoft and created the OpenServer product (not OpenUNIX, which is SVR4-based).
     
  26. May 18, 2007 #25

    Integral (Staff Emeritus, Science Advisor, Gold Member)

    As much as I hate to admit it, I program in Basic. I learned to program in Basic and Fortran on a mainframe back in 1975. Then in 1980 I got an Apple II+ computer. I used that computer with Applesoft Basic to get a degree in math, taking mainly numerical analysis and mathematical modeling classes.

    Basic is a perfectly good language. It was given a second life when MS decided to make it the Office macro language. Visual Basic has evolved into a pretty sophisticated and useful language. However it is, like every other programming language, a tool which has strengths and weaknesses.

    I have been formally exposed to C++, but have not used it; there is a learning curve. But I recognize my failure to breach that curve as my problem, not that of the language.

    Oldtobor, you would do well to listen more and talk less.
     