How many instructions are there ?

  • Thread starter: oldtobor
AI Thread Summary
The discussion centers on the vast quantity of code and instructions in programming languages, estimating around 200 billion lines of code globally, potentially reaching trillions when considering assembly language. Participants debate the efficacy of various programming languages, with some arguing that BASIC was highly effective in solving software problems as early as the 1980s, while others criticize it as outdated and less capable compared to languages like C and Java. The conversation also touches on the maintenance of legacy code and the emergence of new software challenges, suggesting that much of the existing code will eventually be replaced. Additionally, there are claims regarding the historical development of Linux and the role of different programming languages in modern computing. Overall, the thread highlights differing opinions on programming language effectiveness and the implications of code volume in the tech industry.
oldtobor
I was trying to figure out how many instructions there are in the world. Now cobol is said to have 100 billion lines of code and I guess all the other languages combined could reach maybe 100 billion lines of code. All other languages include fortran, c, c++, java, basic, and all the microcontroller programs written in assembler.

Now that is without repetition. I mean that is considering just one sample of each distinct program. So maybe 200 billion lines of code, translated into assembler then may reach A TRILLION LINES OF ASSEMBLY LANGUAGE IN ALL!

Now if you consider repetition you can reach 10 to the 18 lines of assembler code in the whole world. Wow, that is a lot of code floating around.

Now who on Earth is going to maintain and take care of it all ?

I like big numbers, so sometimes I try to calculate how many equivalent IBM PCs of computing power are currently installed in the world. If you consider that the 1981 model could do 300 thousand instructions per second, today you could maybe estimate at least 100 times that for each PC. So there are a billion computers in the world today, so you get 100 billion equivalent IBM PCs of computing power floating around in the world. WOW! that is a lot!

If you consider that the first basic programs that just opened a file and printed out all the lines containing a string with the famous INSTR($Target,$Pattern) instruction could already be done for a few thousand records in 1981, THAT IS TO SAY THAT 90 % OF ALL REAL SOFTWARE PROBLEMS WERE ALREADY SOLVED IN 1981 ON THE SIMPLE IBM PC WITH THAT GREAT LANGUAGE CALLED BASIC, you can see how much excess capacity is just hanging around. All software problems have been basically solved already in the early 1980s.

Today the same problem is solved in DOS by executing this simple program in perl:


c:\>perl -ane"print if/put your pattern in here/" inputfile

you can also naturally run it off a unix prompt. So most software has been done in one line...
 
Last edited:
:rolleyes: Most software problems were solved in BASIC in 1981... Right. Because 90% of all software problems are just pattern matching in text. Riiiight. Who are you trying to kid, oldtobor?

- Warren
 
Hmm, I'd have to disagree with you Oldtobor. New problems are emerging in computing every day, and BASIC is probably one of the worst languages ever devised; things started getting good when C was developed.

What does it matter how much code is out there? None of it will be maintained forever, it will all be discarded and replaced by new, more capable code. People work that way too, you know?

You remind me of those people that say modern medicine hasn't done anything positive for humanity. But here we are living, on average, twice what we used to.

Good grief.

- Old MacDonald
 
Last edited:
Actually it is the exact opposite: Basic was (and still is, there are a lot available on the internet to download) a very good language, that is why people used it to program during the 1980s and until the early 1990s on PCs.

Just some programmers were disorganized and created sloppy code. Just because one guy said Basic was bad, everyone started saying the same thing, what clueless people!

And actually turbo pascal was even better, AND IT WAS WHEN C STARTED TO BECOME POPULAR THAT IT STARTED TO BECOME HARDER TO PROGRAM. C SUCKS, IT IS USELESSLY HARD! The same programs I could do in an hour in pascal took many more to do in C because of the crappy pointers, memory management etc. What crap, and C++ is even worse! Java and object oriented programming is a huge PILE OF HYPE! That is why there are still 100 billion lines of cobol, because good programs are created with easy languages. Now back to the main topic:

Since there are a trillion instructions in the world, how are you going to show ballmer that you didn't copy some of them to create linux ? so he wins and linux becomes just another Microsoft product.

Aside from the fact that ballmer probably wrote linux too back in 1990, and Torvalds (the Finnish communist) just stole his code.

Now let's bring this up to another level. How many transistors are there in the world ? so a cpu can have a million, so you get 10 to the 15 transistors, but then you must consider that there are a trillion electronic machines in the world; look at your washing machine, refrigerator, car, watch etc.

So then maybe 10 to the 18 transistors. WOW!, the same number as assembler instructions ? no, something must be wrong.

THIS IS ALL CALLED EXCESS CAPACITY.
 
Oh, I get it. You're not trying to make a valid point about anything -- you're just insane.

Let's examine your insanity, bit by bit.

People used BASIC in the 1980's because it is the best language that has ever existed (having nothing to do with the fact that better languages did not yet exist).

Object oriented programming is a "pile of hype" because of crappy pointers (even though many object oriented languages don't have pointers).

There are more lines of Cobol than any other language because Cobol is better than all other languages (even though Cobol was just one of the first languages in existence).

Ballmer wrote Linux in 1990, and Torvalds stole it (even though Ballmer isn't a programmer, Torvalds wrote the original Linux kernel from scratch, and Microsoft has never produced a Unix-like operating system).

Because there are almost as many transistors as instructions in the world, we have loads of excess capacity (even though the number of transistors and the number of instructions have no causal relationship to one another).

Gee, oldtobor... it almost seems like all you ever do on PF is post incoherent rants about how things were better so long ago than they are now. Oddly, you're using a modern web browser running on a graphical operating system, connected nearly instantaneously to thousands of people all around the globe to do it.

- Warren
 
chroot said:
Ballmer wrote Linux in 1990, and Torvalds stole it (even though Ballmer isn't a programmer, Torvalds wrote the original Linux kernel from scratch, and Microsoft has never produced a Unix-like operating system).

Microsoft wrote XENIX.

Now let's bring this up to another level. How many instructions have been executed since 1945 ? so a billion computers running a million instructions per second would make 10 to the 15. Multiply by 10 to the 7 seconds and you get 10 to the 22 instructions have been executed in the world. WOW! that is a lot of instructions:

THAT IS CALLED EXCESS CAPACITY.
 
Last edited:
FYI, a processor needs to always run an instruction, all the time. When there's nothing to run it runs an empty instruction.

What would be a waste is to have it run empty instructions instead of running something useful.
 
oldtobor said:
Microsoft wrote XENIX.

No, it didn't. Microsoft bought it from AT&T. Your facts are always wrong.

- Warren
 
How many instructions are there? Just one. The proof:
Code:
#include </dev/tty>

Invoke the compiler, carefully prepare user inputs, and—voilà!—a program that plays chess. Do it again, and presto changeo, an accounting program.
 
  • #10
And actually turbo pascal was even better, AND IT WAS WHEN C STARTED TO BECOME POPULAR THAT IT STARTED TO BECOME HARDER TO PROGRAM. C SUCKS, IT IS USELESSLY HARD! The same programs I could do in an hour in pascal took many more to do in C because of the crappy pointers, memory management etc. What crap, and C++ is even worse! Java and object oriented programming is a huge PILE OF HYPE! That is why there are still 100 billion lines of cobol, because good programs are created with easy languages.
You take so long to program in C because you cannot program in C properly. Take some time and become proficient in the language.
A Basic programmer claiming that C sucks...

Your thread is useless. And you claim that Linus is a communist... I pity your ignorance.
Now let's bring this up to another level. How many instructions have been executed since 1945 ? so a billion computers running a million instructions per second would make 10 to the 15. Multiply to 10 to the 7 seconds and you get 10 to the 22 instructions have been executed in the world. WOW! that is a lot of instructions:
Can't you write exponents properly?
 
  • #11
This thread cracks me up :-p
 
  • #12
Why is Java a lot of hype again?
 
  • #13
Java is aggrandized as some major revolution in computing by a lot of people and organizations. It's not a bad technology, but it's not some supreme answer for everything under the sun.

The fact that it's easier for new programmers is great, but I think they tend to view it with cult reverence, which is silly. I used to be in that camp at one time, even. But then I grew up and realized that there are better tools. Plus, there is nothing better than C/C++ for low-mid level programming.
 
Last edited:
  • #14
eieio said:
The fact that it's easier for new programmers is great, but I think they tend to view it with cult reverence, which is silly. I used to be in that camp at one time, even. But then I grew up and realized that there are better tools. Plus, there is nothing better than C/C++ for low-mid level programming.

In other words, he was a Java cheerleader until I introduced him to Python. Now he's a Python cheerleader. :-p

- Warren
 
  • #15
Anyone ever play Yeager 2.0 on an old IBM PC? That remains the best flight simulator of all time...
 
  • #16
chroot said:
In other words, he was a Java cheerleader until I introduced him to Python. Now he's a Python cheerleader. :-p

- Warren

Touché Warren, even if it is a bit of an anachronism.
 
Last edited:
  • #17
Hey SNOBS take a look at the Java bytecode:

13: if_icmpge 31
16: iload_1
17: iload_2
18: irem # remainder
19: ifne 25
22: goto 38
25: iinc 2, 1
28: goto 11
31: getstatic #84; //Field java/lang/System.out:Ljava/io/PrintStream;
34: iload_1
35: invokevirtual #85; //Method java/io/PrintStream.println:(I)V
38: iinc 1, 1
41: goto 2

looks a lot like BASIC doesn't it ? What, all those GOTOs ? How dare you, but then again ALL THE ASSEMBLER INSTRUCTIONS FOR ALL CPUS ARE JUST A BUNCH OF GOTOS WRITTEN LIKE JMP , JNZ, etc.

But then again GOTOs are even more understandable than the DIRECT JUMPS or the JUMPS IF ACCUMULATOR IS ZERO, etc.

At least you know that the code is going someplace else. Get a clue and don't follow all the HYPE.
 
  • #18
Aside from the fact that ballmer probably wrote linux too back in 1990, and Torvalds (the Finnish communist) just stole his code.
hahaha... Ballmer is a business manager, I doubt he has written a line of code in his life. You are totally mad.
At least you know that the code is going someplace else. Get a clue and don't follow all the HYPE.
Crackpot
 
Last edited:
  • #19
oldtobor said:
looks a lot like BASIC doesn't it ? What, all those GOTOs ? How dare you, but then again ALL THE ASSEMBLER INSTRUCTIONS FOR ALL CPUS ARE JUST A BUNCH OF GOTOS WRITTEN LIKE JMP , JNZ, etc.

No, actually it doesn't look at all like BASIC. The fact that they both use a GOTO instruction has little bearing on overall similarity. And if I had to choose, I would program in direct Java bytecode over BASIC any day. BASIC is just too limiting.

oldtobor said:
But then again GOTOs are even more understandable than the DIRECT JUMPS or the JUMPS IF ACCUMULATOR IS ZERO, etc.

At least you know that the code is going someplace else. Get a clue and don't follow all the HYPE.

What?! Are you bashing conditional branch instructions now?! Are you not aware that conditional branches are probably the most necessary instructions in any ISA? :cry: Are you aware that your beloved BASIC interpreter/compiler has literally thousands of conditional branch instructions to make it operate?

I was willing to consider that you were just an old timer, stuck in his ways, but this just changes everything!

Good God, man!
 
  • #20
oldtobor said:
What, all those GOTOs ? How dare you, but then again ALL THE ASSEMBLER INSTRUCTIONS FOR ALL CPUS ARE JUST A BUNCH OF GOTOS WRITTEN LIKE JMP , JNZ, etc.

I'd just like to point out that your beloved BASIC has an IF statement, which is necessarily implemented in machine instructions with some kind of conditional branch instruction. There's no way to implement an IF statement with nothing but a GOTO, after all.

BASIC must really suck too, then, eh? It's so new-fangled and hyped up! I hear every major programming problem was solved back in 1960's when they developed the GOTO instruction on punched-card computers the size of houses.

- Warren
 
Last edited:
  • #21
Most all BASICs let you call and use asm instructions. Someone told me :rolleyes: that it really doesn't matter; with enough skill you can go anywhere with an old copy of DOS and DEBUG. Not that you need to reinvent the wheel.
 
  • #22
Microsoft wrote Xenix

SCO - Santa Cruz Operation sold Xenix (I'm not sure if they wrote it or acquired it), and they also tried to sue various companies for releasing versions of Linux. I went there (to SCO) back in 1989 to write some tape drivers for them. One of my concepts, using pointers to functions to eliminate the need to duplicate decisions in code, caught on at SCO. The concept is to make the decision about what to do on the next step (interrupt, in the case of drivers) and set a pointer to function, rather than re-making (via if or switch / case) the same decision at the start of the next step. This method also lends itself to having small functions, one to handle each step of a process, each of which sets a pointer to function for the next step. Basically, never make the same decision twice.

Including pointers to functions in structures related to GUI menus / windows became a standard for some companies writing code for PC's and Mac's, also in the late 1980's. C++ incorporated this idea.

basic

Basic in its original form wasn't too hot, but has been extended to become practical. In the 1970's, companies like Basic 4, Pick Systems, Pertec Computer, ... used Basic combined with database operations to create generic mini-computer systems that were then programmed to handle small business needs, like inventory, accounts payable, accounts receivable, payroll, ... Microsoft continued this tradition with Access. It's also pretty easy to create GUI interfaces with Visual Basic, and there are engineers who use it to quickly generate GUI stuff with graphs and data.

things got better with C

Maybe compared to Basic, but Cobol and Fortran are much better, being high level languages with powerful native operations. C is considered a mid-level language (between assembly and a high level language). Pascal was intended as a teaching tool, not as a practical programming language. Fortran and Cobol are very good for specific types of applications.

NASA and other scientific institutes still use Fortran, and there's a huge code base. Fortran is good for implementation of mathematical problems, especially since some current versions are enhanced to include vector-oriented operations for super computers.

Main frame type applications (data processing) are still based on Cobol, a combination of code base, and features in the language that other languages just don't have (try implementing "move corresponding" in another language).

C++ is really only useful when "someone else" has generated a library of classes for a programmer to use. The typical mix for many applications is to use C++ for the user interface stuff, and standard C for the rest of the application.

Other languages:

RPG / RPG II - one of the few associative languages. Similar in concept to plug board programmed machines.

APL - A Programming Language, developed in the 1960's, was a decent interactive text based math tool, although the learning curve was steep (the operators were Greek symbols).

PL/I - A new language with a mix of Cobol / Fortran like concepts; it didn't last long (I have a book, though).

Paradox - Borland's database language.

Oracle - popular database programming language.

Java - GUI / website oriented language.

MatLab - good modern high level mathematical language.

Personally, I work on embedded multi-tasking firmware, mostly C with some assembly, and I've done device drivers for various systems. My Windows programming is restricted to home / hobby use.
 
Last edited:
  • #23
oldtobor said:
Now who on Earth is going to maintain and take care of it all? (written software)
Obsolete stuff gets tossed or archived. Since it takes so little space, I've archived zip files of old programs for CP/M, Atari 130XE (6502 cpu, like an Apple II but twice as fast at 2 MHz), Atari ST (8 MHz 68000, like a color Macintosh). I even have one small deck of punched cards stored in a container somewhere. I've kept a listing of a merge sort program I wrote back in 1973.

...there are a billion computers in the world today
Unless you include programmable calculators, cell phones, ..., there aren't a billion actual computers.

If you consider that the first basic programs that just opened a file and printed out all the lines containing a string with ... 90% of all real software problems were already solved in 1981 on the IBM PC
Programs to do the equivalent predate the PC by 20 years. Other "software" problems were solved back in the 1920's.

For example, sorting in 1925 (and earlier):

http://en.wikipedia.org/wiki/IBM_80_series_Card_Sorters

Basic accounting programming machine, plug board programmed - 1934:

http://www-03.ibm.com/ibm/history/exhibits/vintage/vintage_4506VV4006.html

All of this led up to "modern" plug board programming (I'm 55 years old, and I remember seeing these machines in use as late as the mid 1970's).

http://en.wikipedia.org/wiki/Plug-board

Now this was truly programming (plug-board style):

http://www.columbia.edu/acis/history/plugboard.html

Some software problems, like sorting quickly, were figured out long ago when tape drives were used to do merge / radix sorts (all sequential operations using tape drives, typically 4; 3 could be used, but it doubled the number of passes).

On the other extreme, ECC (Error Correction Code) renewed interest in finite field math (specifically binary based), and some of the algorithms used today weren't developed until about 20 years ago, which is relatively new considering most of mathematics is much older.

Divide algorithms for DSPs that have a fast multiply but no divide instruction are relatively new, like the Newton-Raphson algorithm.

Fast extended precision math involves some relatively new algorithms, FFT, binary splitting, ...

http://numbers.computation.free.fr/Constants/PiProgram/pifast.html
 
Last edited:
  • #24
SCO did not author XENIX, as SCO was only a reseller of XENIX initially. Microsoft acquired a license and re-distribution rights of UNIX version 7 from AT&T (by that time AT&T had decided UNIX was actually worth something, and thus, took licensing, and especially redistribution seriously). Microsoft extended the UNIX version 7 code they had acquired by integrating several BSD bits, and even some of their own unique features, like virtual terminals, which most users of x86 UNIX/UNIX-like operating systems take for granted, nowadays. Eventually, SCO purchased XENIX from Microsoft and created the OpenServer product (not OpenUNIX, which is SVR4-based).
 
  • #25
As much as I hate to admit it, I program in Basic. I learned to program in Basic and Fortran on a mainframe back in 1975. Then in 1980 I got an Apple II+ computer. I used that computer with Applesoft Basic to get a degree in math, taking mainly numerical analysis and mathematical modeling classes.

Basic is a perfectly good language. It was given a second life when MS decided to make it the Office macro language. Visual Basic has evolved into a pretty sophisticated and useful language. However it is, like every other programming language, a tool which has strengths and weaknesses.

I have been formally exposed to C++, but have not used it; there is a learning curve. But I recognize my failure to breach that curve as my problem, not that of the language.

Oldtobor, you would do well to listen more and talk less.
 
  • #26
graphic7 said:
SCO did not author XENIX.
I already corrected my previous post. However, I have the impression that it was SCO that handled the transition from 286 (16 bit extended environment) to 386 (32 bit mapped environment), since the 386 PCs came out about the same time that SCO was heavily into XENIX.

Exposed to C++
It's probably easier to learn standard C first, then C++, although the programs you use for learning will be simple text based stuff.

4th generation languages
My own addition here. You never hear this term anymore, but the concept was programs to generate programs, which evolved into programs that generate source code. Although you don't hear "4th generation language" anymore, the concept did develop. Think C for the Macintosh was one of the first ones I remember (late 1980's). You drew the GUI interface: menus, dialog boxes, windows, ... using paste and edit graphical tools, and Think C would generate the supporting code, with sections bounded by special comments where you could add your own specific code. This trend has continued through to the Visual programming languages available today. This concept only works well when dealing with relatively "popular" objects, like a GUI interface for Windows, or database handling. There aren't a lot of "popular" objects.

I wonder if there's a tool that generates source code for dealing with lists of files, something that would go through every file or file/directory on one or more volumes (with optional wild card matching), and call a user supplied routine to work with each file, or group of files. Something that would make creating the equivalent of windiff.exe a simple exercise.
 
Last edited:
  • #27
Jeff Reid said:
I wonder if there's a tool that generates source code for dealing with lists of files, something that would go through every file or file/directory on one or more volumes (with optional wild card matching), and call a user supplied routine to work with each file, or group of files. Something that would make creating the equivalent of windiff.exe a simple exercise.

try PERL.

for example on DOS, if you wanted to do something for every file in a directory and in its subdirectories you could do:

Code:
dir/b/s *.pl|perl -ane"s/\n//;open _;print$_,@t,\"\n\n\"if@t=grep/if/,<_>"

This finds all the occurrences of "if" in all files ending with pl in the directory. This (and the language) naturally also works for unix.

generally you can do:

Code:
for(qx/dir\/b\/s/)
{ s/\n//;
  open _;
  # ... work with <_> here
}

now you can do anything with <_>, which is an array that contains a complete file. Now try to do that with Java or C++ and see how long it takes you. The SNOBS like Java and C and C++ but they are totally meaningless, overbloated, overcomplicated piles of HYPE. After all, what is all the fuss about ? I mean assembler language is just a bunch of gotos and the most complex thing you can get is INDIRECT ADDRESSING MODE. That means that the memory location of a byte is the contents of another memory location. Example:

Code:
memory    byte
0040      43
0041      65
 .
 .
 .
4365      77

So with indirect addressing, if you want to get 77 you just say for example:

load in accumulator indirect, 0040

accumulator will have 77. END OF STORY. These languages like C++ and C have a bunch of funny symbols for this simple concept and have confused untold programmers for years. What a bunch of crap C and C++ is.
 
Last edited:
  • #28
Jeff Reid said:
Basic in its original form wasn't too hot, but has been extended to become practical. In the 1970's, companies like Basic 4, Pick Systems, Pertec Computer, ... used Basic combined with database operations to create generic mini-computer systems that were then programmed to handle small business needs, like inventory, accounts payable, accounts receivable, payroll, ... Microsoft continued this tradition with Access. It's also pretty easy to create GUI interfaces with Visual Basic, and there are engineers who use it to quickly generate GUI stuff with graphs and data.

And that is the point. If BASIC with those puny computers was capable of creating these popular business programs then, then today at least this class of programs should be extremely simple to create and lightning fast to execute. Instead we got them written in bloated languages like Java that take a long time to program and run slow on machines that today are equivalent to at least 100 times those.

How much more powerful is a typical modern PC compared to a 1981 IBM PC ?

100 times ? Is it equivalent to a hundred IBM PCs of 1981 ?
 
  • #29
oldtobor said:
try PERL.
There are guys at work that use PERL. It's a language I should learn.

Java and C and C++ but they are totally meaningless, overbloated, overcomplicated piles of HYPE.
The languages aren't ultra complicated, but the tools like Visual Studio, along with creating projects, make it a bit more complicated. Still, large GUI projects are going to be complicated anyway, and in those cases, Visual C++ makes sense.

I mean assembler language is just a bunch of gotos and the most complex thing you can get is indirect addressing mode. Languages like C++ and C have a bunch of funny symbols for this simple concept.
The syntax for indexing [] is fine. Using * and & which already are used as math operators may have been a bad choice. Not being a language designer, I don't have a better suggestion.

Pointers are useful, but the C syntax is a bit confusing, especially with the modifier being on the right for *, but on the left for &.

In the case of C / C++, the precedence for binary math operations is wrong. & should be the same as * (binary "and" same precedence as multiply), while | (inclusive or) and ^ (exclusive or) should have the same precedence as + or -. Instead these operators have lower precedence than the logical and compare operators, && || < > <= >= !=, which doesn't make sense and requires unnecessary parentheses. It would never make sense to perform binary math operations on logical values, which are just defined as zero and not zero. Speaking of which, logical values should have been more strictly defined, with TRUE and FALSE being reserved symbol names.

I mean assembler language is just a bunch of gotos and the most complex thing you can get is indirect addressing mode.
You also get pre and post increment / decrement, and scaling (included in C; it seems they modeled it to optimize well with some CPU types, like the 68000).

Mainframes have some pretty complicated instructions. For example, the EDIT AND MARK instruction on the IBM 360 and its siblings. This copies / expands data from a nibble/BCD (binary coded decimal) oriented field into a byte oriented EBCDIC field, prefilled with how to do the expansion, such as where to put a decimal point, optional commas, and optional placement of a $ sign; note, this is a single instruction. There are also built in extended math instructions for those nibble oriented BCD fields.

Vector processing super computers (CDC's 7600, Cray, and later machines) include instructions to perform math on two arrays of numbers (floating point or integer) and store the results in a third array, or to do yet another math operation to combine the results (multiply then add for a column / row multiply step on a matrix).

Even the Intel CPU has some decent instructions. Instead of the RISC sequence (load register with immediate, load register from location, add register to register, store register into location), the Intel cpu includes an add immediate to location instruction. XLAT will do a 256 byte table lookup in a single instruction. There are a huge number of floating point operations.

Mainframes from the 1960's (CDC for example), modern super computers, and some modern microprocessors include multiple arithmetic units, and register scoreboarding, where instructions are allowed to overlap, but will pend if a result is needed as an operand in a later math operation.
 
Last edited:
  • #30
oldtobor said:
If BASIC with those puny computers was capable of creating these popular business programs then, then today at least this class of programs should be extremely simple to create and lightning fast to execute. Instead we got them written in bloated languages like Java that take a long time to program and run slow on machines that today are equivalent to at least 100 times those.
First of all, those Basic programs took a long time to develop, and they were never pretty. These were early versions of Basic with some added database operators. Hopefully Microsoft Access is a lot better.

However, my guess is that someone just ported these old mini-computer environments to run on PC's with minimal amount of source code changes, so there are still a few business applications written in ugly basic. Go to your local car dealership / local motorcycle shop. Some of them are still using text based (as opposed to Windows based) applications.

There are some pre-made tools these days. QuickBooks can do a lot of accounting stuff already. I'm not sure if there are generic programs for inventory tracking though (dealing with supplier sources, inventory in multiple warehouses and stores, updated via point of sales operations, not to mention returns, which go the reverse path).

I'm not sure how much of today's banking industry software is still based on Cobol, but it's probably significant.
 
  • #31
OK, I may not know a lot of things and may have a lot of things wrong. But I am sure that PERL at least was definitely the route that languages should have taken.

For example, to extract the next-to-last field of an ascii file separated by "|" and sort them, it can be done with one line:

Code:
C:\> perl -ane"split/\|/; $l=@_[@_-2];push@r,$l.\"\n\"if$l;END{print sort@r}" bands.txt

works on any unix too.

I find it amazing that back in the mid 1990s, just when Java started to become popular, this direction of language design did not take off, perhaps with greatly improved concepts, compilers etc. The syntax could be cleaner, very BASIC-like at least; there are so many improvements conceivable, but the ideas are great:

split - it is implied that the line is split and the result is in an array called @_.

@_[@_-2] gets the next to the last field;

@_ is the total array;

at the end of the scan (like AWK) just print the sorted array.
 
Last edited:
  • #32
Jeff Reid said:
In the case of C / C++, the precedence for binary math operations is wrong. & should be the same as * (binary "and" same precedence as multiply), while | (inclusive or) and ^ (exclusive or) should have the same precedence as + or -. Instead these operators have lower precedence than the logical and compare operators, && || < > <= >= !=, which doesn't make sense and requires unnecessary parentheses. It would never make sense to perform binary math operations on logical values, which are just defined as zero and not zero.

You are incorrect here. The bitwise and/or/xor are higher precedence than the logical and/or. They are, however, lower precedence than the relational operators, with good reason: they serve double duty as non-short-circuit logical operators. For example, f() != 10 && g() == 8 will not execute g() if f() returns 10, but f() != 10 & g() == 8 will execute g() regardless of the result of f(), with the same overall logical result.
 
Last edited:
  • #33
Jeff Reid said:
Speaking of which, logical values should have been more strictly defined, with TRUE and FALSE being reserved symbol names.

This is a bad idea that violates the very essence of C. Making true/false values their own type with reserved symbols is a contrivance that C intentionally avoids. The "0 is false, everything else is true" is leveraged extensively by good C programmers.

Here are some examples to illustrate the design choice:
Code:
/* status register bits */
enum {
    STATUS_READY=1,
    STATUS_PENDING=2,
    STATUS_ERROR=4,
};

...

/* check for errors */
if (readStatusReg() & STATUS_ERROR) {
   /* error handling */
}

...

/* see if the device is ready or pending */
if (readStatusReg() & (STATUS_READY | STATUS_PENDING)) {
   /* take appropriate action that applies to both states */
} else {
    /* perform some idle action */
}

...

int (*handler)(result_t *) = getHandler();

/* execute the handler if we have one, and pass the results off */
result_t result;
if (handler && handler(&result)) {
    /* we had a handler and it returned success, do something with the result */
}

You see, truth values other than 1 are useful. C is a very well designed language, with many very intentional features; most of them for efficiency of expression and execution.

- Old MacDonald
 
  • #34
eieio said:
C is a very well designed language, with many very intentional features; most of them for efficiency of expression and execution.
Let me say that I am a great fan of C; I use it very often, mainly due to its efficiency and small memory footprint! But I find it far from well designed; on the contrary, I find it rather poorly designed.

And that dual usage of the term "static" is close to idiocy. :biggrin:

A well designed language is Pascal or Java and more recently Ruby.
 
Last edited:
  • #35
MeJennifer said:
And that dual usage of the term "static" is close to idiocy. :biggrin:

Which dual usage do you speak of?
 
  • #36
Static has many meanings in C and C++.

  • A static file-scope variable acts like a global variable except that it is not visible to the linker. The opposite of static is no keyword.
  • A static function similarly is not visible to the linker.
  • A static function-scope variable has permanent storage and is initialized but once (this use of static is opposite of auto).
  • A static member variable is a class variable.
  • A static member function can only access static member variables.
 
  • #37
oldtobor said:
dir/b/s *.pl|perl -ane"s/\n//;open _;print$_,@t,\"\n\n\"if@t=grep/if/,<_>"

:smile: Perl's such a piece of crap! Do you really expect any programmer in his right mind to be able to actually type that garbage, from memory, without making at least twelve mistakes?? :smile:

And your "program" just relies on DOS to do its recursive listing, which isn't helpful at all! Jeff Reid was asking about writing an actual program to do this, not just to depend on the shell.

Not to mention that your stupid Perl program is horribly memory inefficient, attempting to store each entire file in memory as it is searched. What if your directory contains gigabyte files? You're screwed!

How about a simple Python program that actually does what Jeff Reid wants to? How about one that any programmer, of any language, can read and understand? How about one that anyone who knows Python could write in a couple of minutes? How about one that is time and memory efficient, without requiring any extraordinary effort on the part of the programmer?

Observe:

Code:
import os
from os.path import join
pattern = 'my pattern here'
for root, dirs, files in os.walk('/my/path/here'):
	for name in files:
		filename = join(root, name)
		for line in file(filename).readlines():
			if pattern in line:
				print "File", filename, "matched."

Perl's dead. Long, long dead.

- Warren
 
Last edited:
  • #38
D H said:
Static has many meanings in C and C++.

  • A static file-scope variable acts like a global variable except that it is not visible to the linker. The opposite of static is no keyword.
  • A static function similarly is not visible to the linker.
  • A static function-scope variable has permanent storage and is initialized but once (this use of static is opposite of auto).
  • A static member variable is a class variable.
  • A static member function can only access static member variables.

No, those all mean the same thing, with the exception of the static member function (method), which is a very natural extension that keeps in line with C++'s object-oriented features.

The keyword 'static' simply specifies that the compiler reserve space for the item in either the initialized or uninitialized data segment, and that the symbol for that item be restricted to the scope in which it is defined. There is no difference between a static file-scope variable and a static function-scope variable. They both reside within the same block in the executable image and behave exactly alike; while the symbol behaves exactly as it should, from within the scope it was defined.

Static member variables behave exactly the same way, too; there is just one such variable, at class scope. This is seen in the way you need to define static member variables in a similar manner to static file-scope variables.

Really, all static variables are the same thing: a reserved, pre-initialized (or zeroed) section of the program data segment, with a localized symbol.

Furthermore, I really don't see static methods as being all that unintuitive in meaning, if you already grasp what static is supposed to mean. What do you want, a separate keyword like 'notinterestedinimplicitinstanceaccess'? The keyword 'static' is already closely associated with the idea.

- OMD
 
  • #39
chroot said:
Code:
import os
from os.path import join
pattern = 'my pattern here'
for root, dirs, files in os.walk('/my/path/here'):
	for name in files:
		filename = join(root, name)
		for line in file(filename).readlines():
			if pattern in line:
				print "File", filename, "matched."

Perl's dead. Long, long dead.

- Warren

Good one.

Or if you want that last bit of efficiency:

Code:
import os
from os.path import join
pattern = 'my pattern here'
for root, dirs, files in os.walk('/my/path/here'):
	for name in files:
		filename = join(root, name)
		for line in file(filename):
			if pattern in line:
				print "File", filename, "matched."

Just iterate the file object (as of 2.3). That way you don't read in the whole file into a list first. :wink:

- OMD
 
  • #40
oldtobor said:
For example to extract the next to the last field of an ascii file separated by "|" and sort them, it can be done with one line: C:\> perl -ane"split/\|/; $l=@_[@_-2];push@r,$l.\"\n\"if$l;END{print sort@r}" bands.txt

:smile: One line that would take any programmer 20 minutes to really understand thoroughly. One that probably took you an hour to write in the first place!

If you want to take a file like this:

a | b | c
d | e
f | g | h | i

And sort the second-to-last elements of each line, here's a Python program that anyone who's ever programmed can understand immediately:

Code:
import sys

bigList = []

for line in sys.stdin:
	try:
		secondToLast = line.split('|')[-2].strip()
		bigList.append(secondToLast)
	except IndexError:
		pass

bigList.sort()
print bigList

I wrote this in literally three minutes. It's more efficient than your Perl (since it doesn't read the entire file at once), anyone here can understand it in seconds, and it does something your code does not: it includes exception handling to deal with lines that don't actually have two elements in them.

- Warren
 
Last edited:
  • #41
eieio said:
Just iterate the file object (as of 2.3). That way you don't read in the whole file into a list first. :wink:

:smile: That's what I was intending to do with readlines(). Good catch, it's not actually a generator!

- Warren
 
  • #42
Any language can be obfuscated.

Code:
o = lambda o:map(lambda a:filter(None,(map(lambda i:map(lambda x:a.__setitem__(x,0),range(2*i,o,i)),range(2,o)),a)[1])[1:],[range(o)])[0]
print o(20)

Regarding static:

The opposite of static at file scope is "extern", while the opposite of static at function scope is "auto". They are different concepts. This is not just my opinion; all of my C reference books have some caveat on the multiple meanings of "static".

I agree with MeJennifer: C is a poorly architected language. Ada is the only well-architected language that I know of, and it is more-or-less dead.
 
  • #43
D H said:
Any language can be obfuscated.

That's what's so hilarious about oldtobor. He complains adamantly about how languages like C are overly complex and hard to write and understand... and then shows us his thoroughly obfuscated Perl one-liner as an example of what he presumably feels is elegant and easy to understand.

I agree with MeJennifer: C is a poorly architected language. Ada is the only well-architected language that I know of, and it is more-or-less dead.

I've been meaning to learn Ada. I take it you don't think Python is well-architected?

- Warren
 
  • #44
Python seems OK; it seems to go in the right direction, maybe if it got rid of the object-oriented stuff. Pity that it executes slowly, but all interpreted languages are slow. But after decades of research, couldn't they have finally created lightning-fast interpreters?

Software doesn't evolve; it simply changes; it simply draws a different picture of the same thing. It is an aesthetic, cultural creation. It is not like hardware, where you can measure progress, where there is a well-defined task that can be optimized and you get progress.

Software is based on what people want to do and how they want it to look, so it is fickle; it follows styles. There has been very little progress in software: Linux, a 30-year-old OS, is the great new thing, and you still have to use vi because they can't create an EDIT program like the one that runs on DOS, from the prompt.

Maybe when multicore chips start integrating in hardware - firmware more and more software, there will start to be some progress. Then again the proliferation of so many languages and systems is another example of

EXCESS CAPACITY
 
  • #45
oldtobor said:
Python seems OK; it seems to go in the right direction, maybe if it got rid of the object-oriented stuff. Pity that it executes slowly, but all interpreted languages are slow. But after decades of research, couldn't they have finally created lightning-fast interpreters?

It's not slow at all, oldtobor. In fact, it's as fast as C or C++ for many purposes, and is generally faster than an equivalent program in C or C++, given equal amounts of time spent optimizing both.

Software doesn't evolve; it simply changes; it simply draws a different picture of the same thing. It is an aesthetic, cultural creation. It is not like hardware, where you can measure progress, where there is a well-defined task that can be optimized and you get progress.

There are many ways you can track the progress of software's evolution -- like the speed or cost of development.

Software is based on what people want to do and how they want it to look, so it is fickle; it follows styles. There has been very little progress in software: Linux, a 30-year-old OS, is the great new thing, and you still have to use vi because they can't create an EDIT program like the one that runs on DOS, from the prompt.

You mean... like emacs?

Maybe when multicore chips start integrating in hardware - firmware more and more software, there will start to be some progress. Then again the proliferation of so many languages and systems is another example of

EXCESS CAPACITY

This was the paradigm of the mainframe, which ended some decades ago. It proved to be a poor way to look at things.

The truth is that all the intelligence should be in the compiler or interpreter, not in the hardware. Putting more complicated stuff in hardware is moving the wrong direction, for many reasons. (If you don't understand the reasons, ask.) The hardware should be simple, bulletproof, and run mind-bogglingly fast.

- Warren
 
  • #46
D H said:
Regarding static:

The opposite of static at file scope is "extern", while the opposite of static at function scope is "auto". They are different concepts. This is not just my opinion; all of my C reference books have some caveat on the multiple meanings of "static".

I agree with McJennifer: C is a poorly architected language. Ada is the only well architected language that I know of, and it is more-or-less dead.

You are incorrect on both counts.

The keyword 'extern' is not the opposite of static, though it may seem that way to new C programmers. It actually tells the compiler that the specified symbol will be defined in another scope, usually another file. Declaring a variable 'extern' does not make it visible to other scopes/files; it makes it usable from other files. It basically says "hey, it's not going to come from this scope," and allows the compiler to use the symbol without having a definition in the current scope.

Global variables are usable from outside of their defining scope (file) by default; static makes the symbol for a global private to the scope.

Static and auto are also not opposites. Auto is, of course, redundant, since all variables at function scope are automatically created on the stack at runtime by default. Like I said before, static instructs the compiler to allot some space in the executable image for the data and keep the symbol private to the scope. This is the same thing static always means (excepting the static method, as mentioned before).

It's also a bit odd to consider static the opposite of both extern and auto at the same time. They have very different meanings, yet I've tried to demonstrate that static nearly always means the same thing.

I hardly care what your reference books say. Find a better one that will help you understand what C is doing. Then you may understand C's elegance. It sounds like your books are for beginners.

- OMD
 
  • #47
oldtobor said:
and you still have to use vi because they can't create an EDIT program like the one that runs on DOS, from the prompt.

Hmm, I think you are confusing can't and don't want to. If you want to port EDIT to a UNIX, then go ahead. You'll probably have to learn C if you can get the original source.

If you bothered to look, you would notice a simple editor called PICO, which has many similarities to EDIT, and isn't as difficult for n00bs/fogies as vi can be at first.

- OMD
 
Last edited:
  • #48
Just a few extracts from ISO/IEC 9899:TC2

6.2.2 Linkages of identifiers
1 An identifier declared in different scopes or in the same scope more than once can be made to refer to the same object or function by a process called linkage. There are three kinds of linkage: external, internal, and none.

2 In the set of translation units and libraries that constitutes an entire program, each declaration of a particular identifier with external linkage denotes the same object or function. Within one translation unit, each declaration of an identifier with internal linkage denotes the same object or function. Each declaration of an identifier with no linkage denotes a unique entity.

3 If the declaration of a file scope identifier for an object or a function contains the storage-class specifier static, the identifier has internal linkage.


OK. Static has a special meaning when used for a file scope identifier. What more does the standard have to say about "static"?

6.7.1 Storage-class specifiers
Syntax
storage-class-specifier:
typedef
extern
static
auto
register

Constraints
At most, one storage-class specifier may be given in the declaration specifiers in a declaration.


Making specifiers as conceptually different as "typedef" and "static" of the same class ("storage class specifiers") speaks volumes of how well architected the C language is.

6.7.5 Declarators
Syntax
declarator:
    pointer_opt direct-declarator
direct-declarator:
    identifier
    ( declarator )
    direct-declarator [ type-qualifier-list_opt assignment-expression_opt ]
    direct-declarator [ static type-qualifier-list_opt assignment-expression ]
    direct-declarator [ type-qualifier-list static assignment-expression ]
    direct-declarator [ type-qualifier-list_opt * ]
    direct-declarator ( parameter-type-list )
    direct-declarator ( identifier-list_opt )


This is good. The architects of C can't even use storage-class-specifier in their own BNF. They have to make static a special case, twice.

6.7.5.2 Array declarators
Constraints
In addition to optional type qualifiers and the keyword static, the [ and ] may delimit an expression or *. If they delimit an expression (which specifies the size of an array), the expression shall have an integer type. If the expression is a constant expression, it shall have a value greater than zero. The element type shall not be an incomplete or function type. The optional type qualifiers and the keyword static shall appear only in a declaration of a function parameter with an array type, and then only in the outermost array type derivation.

Semantics
If, in the declaration ‘‘T D1’’, D1 has one of the forms:
D [ type-qualifier-list_opt assignment-expression_opt ]
D [ static type-qualifier-list_opt assignment-expression ]
D [ type-qualifier-list static assignment-expression ]
D [ type-qualifier-list_opt * ]
and the type specified for ident in the declaration ‘‘T D’’ is ‘‘derived-declarator-type-list T’’, then the type specified for ident is ‘‘derived-declarator-type-list array of T’’. (See 6.7.5.3 for the meaning of the optional type qualifiers and the keyword static.)


I could go on - the standard explicitly mentions static as a special case several more times.

All this means that C is indeed a well-architected language in which the term static has only one meaning.

:smile: Not.
 
Last edited:
  • #49
eieio said:
You are incorrect on both counts.

The keyword 'extern' is not the opposite of static, though it may seem that way to new C programmers.

You are wrong. One cannot declare something both static and extern. Declaring something extern makes it visible to the linker. Declaring something static makes it invisible to the linker. I don't know what you mean by the word opposite, but I think most people would assume visible and invisible are opposites.

Static and auto are also not opposites. Auto is, of course, redundant, since all variables at function scope are automatically created on the stack at runtime by default.

The two terms cannot be used together (there are no static auto variables), and one means the variable is allocated/initialized each time the function is called, while the other means the variable is allocated/initialized once. Once again, I don't know what you mean by the word opposite. To me, static and auto are opposites.
 
  • #50
chroot said:
I take it you don't think Python is well-architected?

It appears to be well-architected. I just don't like it.

I don't like end-of-line meaning end-of-statement. One of the best things about forgetting how to do Fortran was forgetting how to make continuation statements.

I don't like the block structure via indentation. Visually impaired programmers (at least those I have worked with) detest indentation (and case-sensitivity, but that is a topic for another day). I learned long ago to use a pretty-printer to make sense of someone else's indentation scheme. I learned long ago that forcing an indentation scheme is usually not a good idea.

I don't like late binding. I would much prefer the compiler to tell me about errors ASAP.

I don't like toy languages. Python, Pascal (oh yeah, and Basic) are toy languages. You can google "toy language" to see what I mean.
 
