An Interview with James Gosling: The Father of Java

  • Thread starter: jedishrfu
AI Thread Summary
The discussion highlights the experiences and challenges faced by programmers, particularly in relation to Java's garbage collection and debugging processes. Participants share anecdotes about memory management issues, including debugging memory leaks and dealing with non-descriptive errors like segmentation faults. They reflect on the complexities of garbage collection in Java, which organizes heap data into short-term, interim, and long-term areas for efficient memory management. The conversation also touches on humorous and frustrating experiences with coding errors, particularly in C, and the importance of careful design and testing in programming. Additionally, there are stories about practical jokes in programming classes and the lessons learned from past mistakes, emphasizing the need for thorough documentation and the potential pitfalls of custom coding protocols. Overall, the thread underscores the blend of technical challenges and light-hearted camaraderie in the programming community.
I've used it a lot professionally, though I always feel a little patronized when a language has built-in garbage collection. I can keep track of my memory myself, thank you. I never left a pointer dangling (*cough*).
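For anyone who hasn't been bitten yet, here's a minimal sketch of the kind of dangling pointer being joked about (purely illustrative; reading freed memory is undefined behavior, and the nasty part is that it may well appear to work):

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *name = malloc(16);
    if (name == NULL)
        return 1;
    strcpy(name, "Gosling");

    free(name);              /* the memory goes back to the allocator...     */
    printf("%s\n", name);    /* ...but the pointer still points at it.       */
                             /* Use-after-free: it might print "Gosling",
                                print garbage, or crash -- whenever it likes. */
    return 0;
}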
 
I got to debug a memory leak. It's never pretty.

The Java debugger was pretty cool. Over time it showed a sawtooth on a sawtooth on a sawtooth of memory in use vs. time, as memory was garbage collected from three heap areas.

Java would group heap data into a short-term area that it reviewed on every garbage-collection cycle, an interim area that it reviewed on every nth cycle (I don't recall the n), and a long-term heap that was reviewed when interim items were moved into it.

https://docs.oracle.com/cd/E13150_01/jrockit_jvm/jrockit/geninfo/diagnos/garbage_collect.html
 
jedishrfu said:
I got to debug a memory leak. It's never pretty.

The Java debugger was pretty cool. Over time it showed a sawtooth on a sawtooth on a sawtooth of memory in use vs. time, as memory was garbage collected from three heap areas.

Java would group heap data into a short-term area that it reviewed on every garbage-collection cycle, an interim area that it reviewed on every nth cycle (I don't recall the n), and a long-term heap that was reviewed when interim items were moved into it.

https://docs.oracle.com/cd/E13150_01/jrockit_jvm/jrockit/geninfo/diagnos/garbage_collect.html

Oh I hear you. A non-descriptive error like "segmentation fault" is the worst. :smile: Especially hidden in a heap (pun intended) of code.

I admit I've never looked that much into the theory behind garbage collection, but I can see from your link that it's as easy to get lost in that as it is to leave a C pointer dangling at the end of an 8-hour debug session.

Aargh the horror! :woot:
 
My favorite error in C was passing a buffer, i.e. an array of limited size, to a subroutine that passed it on to yet another routine, maybe through recursion or just as part of the implementation.

The low-level routine writes more data to the buffer than expected, corrupting the stack and causing a failure as the routines unwind the stack during their returns.

You try to debug it with printf statements, which alter the stack just enough that the failure no longer happens.
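A stripped-down sketch of that failure mode, with all the sizes invented for illustration: the bad write happens at the bottom of the call chain, but the damage only surfaces as the callers return.

Code:
#include <stdio.h>
#include <string.h>

/* The low-level routine has no idea how big the buffer really is;
   it just fills in a "record" and trusts its caller. */
static void fill_record(char *buf)
{
    memset(buf, 'A', 64);   /* writes 64 bytes into whatever it was handed */
    buf[63] = '\0';
}

/* The middle layer just passes the buffer along. */
static void middle_layer(char *buf)
{
    fill_record(buf);
}

int main(void)
{
    char buf[16];           /* the top-level caller figured 16 bytes was plenty */

    middle_layer(buf);      /* the overflow happens two calls down */

    /* The excess bytes may have clobbered saved registers or a return
       address, so the crash shows up while the calls unwind -- nowhere
       near the line that actually did the damage. */
    printf("%s\n", buf);
    return 0;
}

And sprinkling printf calls into the callers changes the stack layout just enough that the crash can vanish the moment you try to observe it.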

---

Another related error was writing data to a buffer 12 characters in length and getting a segmentation fault when writing 12 characters.

However, if I changed the buffer to 13 characters and wrote 12, 13, or 14 characters, it didn't fail with a segmentation fault.

I finally figured it out: the system was allocating memory in multiples of 16, with the first 4 bytes containing metadata about the block. My 12-character string was actually 12+1 bytes because of the null terminator. When I increased the size to 13, I actually got a 32-byte block to write into.

I think it happened to me on OS/2 when I was still new to programming on it.
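A hedged reconstruction of what was probably going on, assuming an allocator like the one described (blocks in multiples of 16 bytes with a 4-byte header); the exact numbers depend entirely on the allocator:

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* Hypothetical allocator from the story: every block is a multiple of
       16 bytes, and the first 4 bytes hold the allocator's own metadata. */
    char *buf12 = malloc(12);   /* 12 + 4 = 16 -> exactly one block, zero slack   */
    char *buf13 = malloc(13);   /* 13 + 4 = 17 -> rounded up to 32, lots of slack */

    if (buf12 == NULL || buf13 == NULL)
        return 1;

    /* "Hello world!" is 12 characters, but strcpy also writes the
       terminating '\0' -- 13 bytes in all. */
    strcpy(buf13, "Hello world!");  /* the 13th byte lands in the slack; seems fine */
    strcpy(buf12, "Hello world!");  /* the 13th byte spills past the block:
                                       undefined behavior that can corrupt the
                                       neighboring metadata and blow up later      */

    printf("%s\n%s\n", buf12, buf13);
    free(buf13);
    free(buf12);
    return 0;
}

The 13-character version only "works" because the overflow lands in padding the allocator rounded up to anyway; the bug is still there, just invisible.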
 
Most of my experience with those kinds of errors has been with messes left by CS students, though on one shameful occasion it was a coworker.

We had a joker-hat, complete with bells and all, which was left on the desk of the latest ef-up for everyone to see. In a 12-man office that was pretty embarrassing. *Luckily* it never touched my desk.

One of the messes left by CS students included a home-coded red-black tree (oh heavens, what a mess!), which I replaced with some C++ template library or some such. Obviously the "offender" had fun coding it, but it just wouldn't fly (or "grow", as trees tend to do).

As I mentioned in some other thread that's where "refactoring" (read: "a complete rewrite") comes in. :smile:

EDIT: had fun inserting links to wiki.
 
I taught a C class in the days of PC DOS, and some of my savvy department mates were in the class. They were into hacking, assembler code, and game pirating.

As a joke, they created some undeletable directories with hidden characters in their names, forcing me to reformat the 10 MB hard drives when the classes were over, because we shared the machines with the Lotus class and it used most of the disk space.
 
jedishrfu said:
I taught a C class in the days of PC DOS, and some of my savvy department mates were in the class. They were into hacking, assembler code, and game pirating.

As a joke, they created some undeletable directories with hidden characters in their names, forcing me to reformat the 10 MB hard drives when the classes were over, because we shared the machines with the Lotus class and it used most of the disk space.

You could write a program that reads those "hidden character" directory entries and deletes them programmatically. I've had success with that MO before.
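For what it's worth, a modern POSIX-flavored sketch of that idea (not the DOS-era tool itself, whose API I won't try to reconstruct from memory): enumerate the raw directory entries so that names containing unprintable characters, which the shell chokes on, can be handed straight back to the OS for removal.

Code:
#include <stdio.h>
#include <string.h>
#include <ctype.h>
#include <dirent.h>
#include <unistd.h>

/* Does the name contain characters you couldn't type at a prompt? */
static int has_hidden_chars(const char *name)
{
    for (; *name != '\0'; name++)
        if (!isprint((unsigned char)*name))
            return 1;
    return 0;
}

int main(int argc, char **argv)
{
    const char *parent = (argc > 1) ? argv[1] : ".";
    DIR *dir = opendir(parent);
    if (dir == NULL) {
        perror("opendir");
        return 1;
    }

    struct dirent *entry;
    while ((entry = readdir(dir)) != NULL) {
        if (strcmp(entry->d_name, ".") == 0 || strcmp(entry->d_name, "..") == 0)
            continue;
        if (!has_hidden_chars(entry->d_name))
            continue;                       /* leave ordinary entries alone */

        char path[4096];
        snprintf(path, sizeof path, "%s/%s", parent, entry->d_name);
        if (rmdir(path) == 0)               /* removes empty directories only */
            printf("removed: %s\n", path);
        else
            perror(path);
    }
    closedir(dir);
    return 0;
}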

But yeah, the lengths students will go to in order to sabotage a lesson and pull a practical joke! On Windows you could start CMD, go into fullscreen, run the debug command, issue a series of d and/or u commands, and then flip the screen on its side with the Ctrl+Alt+Left key combo. Finally, call the IT person responsible, tell them you suspect the computer has contracted some sort of virus, and watch them squirm. :smile:

I know I've mentioned this before but it never gets old. :woot:
 
But yeah, maybe I should start a course at my local library... "An Introduction to Programming" for nerdy schoolchildren. Combined with some homework help, especially in mathematics, it might be a winner.
 
  • #10
I used to add humorous quips to my course, suggesting that you never leave your name in the comments or, if you did, erase it before moving on to a new job, lest someone call you in the middle of the night when the program fails.

I illustrated various C errors by describing them and mentioning how doing x, y, and z could really mess up your code. I also cautioned them that as stewards of their code, they should always fix a bug and leave a new one for future programmers.

I learned this approach to teaching from a telecommunications course I took, where the instructor would include stories that still resonate with me today.

---

One story was about a company named Expandor, which developed custom compression algorithms for clients. In one instance, they had statistically mapped the character alphabet used by a company's terminal application and recommended a new character encoding similar to the ETAOIN frequency alphabet, where E was represented by 01 and T by 011 ...

He continued that, true to their name, the data stream was many times longer than expected when they implemented the scheme at the customer's site.

The culprit was the RUBOUT character, which was scattered throughout the data stream and used for cursor movement. The application was a screen-based form where users tabbed to the next field to input data, and the RUBOUTs served to separate one fixed-length field from another.

They hadn't considered the hidden ASCII codes in their analysis.
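It's easy to see how that goes sideways. A toy sketch of the effect, with all the code lengths, characters, and field layout invented for illustration: the most common byte in the real stream is one the frequency analysis never modeled, so it gets the longest code.

Code:
#include <stdio.h>
#include <string.h>

/* Hypothetical code lengths, in bits.  Characters that appeared in the
   frequency analysis get short codes; anything the analysis never saw --
   like RUBOUT (0x7F) -- falls back to a long escape code.
   Every number here is made up. */
static int code_bits(unsigned char c)
{
    if (c == 0x7F) return 14;                     /* RUBOUT: unmodeled        */
    if (c == 'E')  return 2;
    if (c == 'T')  return 3;
    if (c == 'A' || c == 'O' || c == 'N') return 4;
    return 6;                                     /* other modeled characters */
}

int main(void)
{
    /* A toy record: three 4-character field values, each padded out to a
       12-character fixed-width field with RUBOUT, roughly as in the story. */
    unsigned char record[36];
    memset(record, 0x7F, sizeof record);
    memcpy(record +  0, "NAME", 4);
    memcpy(record + 12, "ACCT", 4);
    memcpy(record + 24, "NOTE", 4);

    long plain_bits = 0, coded_bits = 0;
    for (size_t i = 0; i < sizeof record; i++) {
        plain_bits += 8;
        coded_bits += code_bits(record[i]);
    }
    printf("plain: %ld bits, coded: %ld bits (%.0f%% of the original)\n",
           plain_bits, coded_bits, 100.0 * coded_bits / plain_bits);
    return 0;
}

On this made-up record the "compressed" stream comes out about a third longer than the raw 8-bit stream, because the RUBOUT padding dominates the data and carries the longest code.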

---

Another story he mentioned was how someone could overwhelm bank communications by injecting transactions with invalid checksum codes (i.e., whatever scheme the bank used to validate the transaction) into the stream in ever-increasing numbers, until the bank personnel concluded there was an error in the data-stream checking algorithm and simply approved transactions. At that moment, you would send in your real transfer request and run off with the money.

I'm sure these kinds of problems can no longer occur, but both stories served as cautionary tales meant to emphasize the importance of careful design, implementation, and extensive testing.
 
  • #11
jedishrfu said:
I used to add humorous quips to my course, suggesting that you never leave your name in the comments or, if you did, erase it before moving on to a new job, lest someone call you in the middle of the night when the program fails.

I illustrated various C errors by describing them and mentioning how doing x, y, and z could really mess up your code. I also cautioned them that as stewards of their code, they should always fix a bug and leave a new one for future programmers.

I learned this approach to teaching from a telecommunications course I took, where the instructor would include stories that still resonate with me today.

---

One story was about a company named Expandor, which developed custom compression algorithms for clients. In one instance, they had statistically mapped the character alphabet used by a company's terminal application and recommended a new character encoding similar to the ETAOIN frequency alphabet, where E was represented by 01 and T by 011 ...

He continued that, true to their name, the data stream was many times longer than expected when they implemented the scheme at the customer's site.

The culprit was the RUBOUT character, which was scattered throughout the data stream and used for cursor movement. The application was a screen-based form where users tabbed to the next field to input data, and the RUBOUTs served to separate one fixed-length field from another.

They hadn't considered the hidden ASCII codes in their analysis.

---

Another story he mentioned was how someone could overwhelm bank communications by injecting transactions with invalid checksum codes (i.e., whatever scheme the bank used to validate the transaction) into the stream in ever-increasing numbers, until the bank personnel concluded there was an error in the data-stream checking algorithm and simply approved transactions. At that moment, you would send in your real transfer request and run off with the money.

I'm sure these kinds of problems can no longer occur, but both stories served as cautionary tales meant to emphasize the importance of careful design, implementation, and extensive testing.

I like to put my quips in my comments and documentation. Once I accidentally put my documentation in the source file instead of the header, and some arrogant p.... sought me out (because I *do* leave my name, mostly because I'm proud of what I do, but also because I was an internal consultant), accusing me of not documenting my work. I absentmindedly (I was busy, aren't/weren't we all?) told him to look in the source file instead. I never heard from him again.

Leaving a bug for future programmers sounds to me kind of not nice, bordering on downright nasty! :smile:

Expandor, what a name heh. Talk about kicking destiny in the n... :smile:

But yeah, bespoke protocols. Those are seldom pretty. And then there's protocol drift...

Mærsk is such a big company that they talked about their machines "eating copper". Apparently, TDC (Danish Telecom), Danish Bank, and Mærsk had to coordinate their server updates because there weren't enough cables in the ground to do it all at once. I tried to map out their database (Mærsk's) at one point, filling the wall of a 12-person room. When I ran out of A3 paper, I gave up. That thing is a monster!

EDIT: I've probably got more, but I have to be off now...
 