I'm becoming a much better programmer, but maybe not a faster one

.Scott

Homework Helper
Since my last post to this thread, I've realized that there are cases that fall between "maintainability" and "future-proofing".
For example, whenever I create a new file format I include the version number of the file format and the byte size of the header in the header. Is this future-proofing or is this maintainability? Whichever it is, I've learned that it's a tiny effort compared to the headaches it often avoids.
About a year ago, when asked to write a tool for copying FPGA code into an embedded flash device, I didn't use the hex file as my only source; I used a file that also included provenance information (who compiled it, what their version number was, the target FPGA device model and version number, the date they compiled it, etc.) so that I could tuck that information into a sector of the flash memory as well. Only 6 months later, someone walked into my office with a radar sensor that had been programmed with that tool and sorely needed to know that information. So, I probably shouldn't say that future-proofing is always a "losing game". But you certainly need to be careful about which of those games you choose to play.
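The versioned-header idea above can be sketched in a few lines. This is a minimal illustration in Python, not the actual tool: the magic value, field names, and field sizes are all invented for the example.

```python
import struct

# Hypothetical header layout: magic, format version, header size, then
# fixed-width provenance fields (all invented for illustration).
HEADER_FMT = "<4sHH16s8s10s"  # magic, version, header_size, author, fpga_model, date
HEADER_SIZE = struct.calcsize(HEADER_FMT)

def pack_header(author: str, fpga_model: str, date: str) -> bytes:
    """Pack a self-describing header: readers check the magic and version,
    then skip header_size bytes to reach the payload, so the format can
    grow without breaking old readers."""
    return struct.pack(
        HEADER_FMT,
        b"FPGA",            # magic number identifying the format
        1,                  # format version
        HEADER_SIZE,        # header size, so the payload offset is explicit
        author.encode().ljust(16, b"\0"),
        fpga_model.encode().ljust(8, b"\0"),
        date.encode().ljust(10, b"\0"),
    )

def read_header(blob: bytes) -> dict:
    """Parse the header back out, stripping the null padding."""
    magic, version, size, author, model, date = struct.unpack_from(HEADER_FMT, blob)
    assert magic == b"FPGA", "not one of our files"
    return {
        "version": version,
        "header_size": size,
        "author": author.rstrip(b"\0").decode(),
        "fpga_model": model.rstrip(b"\0").decode(),
        "date": date.rstrip(b"\0").decode(),
    }
```

Because the header carries its own size and version, a later version of the format can append fields without breaking tools that only understand version 1.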
 

Svein

Science Advisor
Insights Author
This discussion reminds me of a comment by an IT professional: "Well, if there is no requirement that the code should work, I can write it in less than an hour".
 
rcgldr said:
Typically Z/OS. Think of it as hardware and an OS that can run multiple virtual machines, each with its own virtual hardware and virtual OS (tri-modal addressing), but at full speed and in parallel.
That seems more like z/VM (virtual machine) or EMIF (extended multi-image facility). z/OS is the descendant of MVS (multiprocessing virtual storage).
Ah yes, the platform that IBM spruiked as their "Highlander" - there need only be one. Pretty much jumped the shark when you could install X86 blades and run Windows apps.
This is so incorrect that I barely know where to begin. You appear to be contending that IBM touted its mainframe technology as sufficient for all computational purposes, which it never has done, and you then seem to suggest that not only is this untrue, but that blade server farms have obviated the need for the mainframe architecture. Both claims are manifestly false. If you think either of them to be true, please post some support for them, rather than just couching them in trendy terms.
"Wow, a mainframe that we can put a PC in," said nobody ever!
IBM was the primary corporate sponsor of the PC, and among the first companies to bring about integration between PCs and mainframes. In fact, IBM mainframes have used high-end single board computers, running OS/2, in their HMCs (hardware management consoles) since the '90s.

 
709
316
There's a big difference between writing a mainframe O/S (or any O/S) and a "business application", which is what I took @.Scott's advice to apply to.
Indeed there is, and in general, the backward compatibility paradigm applies to both.
 
Tghu Verd
You appear to be contending that IBM touted its mainframe technology as sufficient for all computational purposes, which it never has done
I feel like I've touched a nerve, but around the turn of the century, IBM was telling the company I worked for - we were partners - that the mainframe could host traditional Z-series banking apps, and with the appropriate blades (or perhaps they were called 'cards'; it was a while ago) could run Linux and Windows apps as well. They had impressive ROI graphs showing how this was considerably more cost-effective - and supposedly more secure - than typical approaches. None of our customers showed any shred of interest; it seemed an unlikely mixing of big iron and less disciplined business-unit computing. So yes, they were telling us it was sufficient for 'all' computational purposes that a regular business might have had at the time. To be fair, we didn't take that to mean SCADA or specialist types of ancillary computing, or even ML/AI, which was not really a thing at the time.

And sorry, I didn't keep any of that collateral, it was entirely secondary to what we were doing.

but that blade server farms have obviated the need for the mainframe architecture.
Nope, not saying that and didn't say that, you're reading something else into my few words. It was exactly the opposite, the mainframe was meant to subsume your PC hardware.
 
Tghu Verd said:
Nope, not saying that and didn't say that, you're reading something else into my few words. It was exactly the opposite, the mainframe was meant to subsume your PC hardware.
Please translate the following 2 sentences of yours into standard English without metaphor:
Tghu Verd said:
Ah yes, the platform that IBM spruiked as their "Highlander" - there need only be one. Pretty much jumped the shark when you could install X86 blades and run Windows apps.
 

PeroK

Science Advisor
Homework Helper
Insights Author
Gold Member
2018 Award
Indeed there is, and in general, the backward compatibility paradigm applies to both.
For a lot of business applications there is no concept of backward compatibility. You have version 1 with a defined set of functionality for a defined set of users and a defined set of interfaces; and, you have version 2 with a revised specification. There's certainly no principle that version 2 must be a superset of version 1 functionality.

If, for example, in version 2 a group of users is no longer going to use the application (they have perhaps moved on to a more specific application for them - or perhaps that part of the business has been sold), then there is no obligation to include a revised specification for them.

Or, for example, much of the system may have moved from batch printing to email to communicate with customers. Do you have to include the old printing functionality in the new version, just in case the decision is reversed?

In truth, it's a moot point since you would have a certain budget and timescale for version 2 development and, in the sort of environment I worked in, there would be no possibility of adding unspecified backward compatibility to the solution.

We may be talking at cross purposes here.
 
PeroK said:
For a lot of business applications there is no concept of backward compatibility. You have version 1 with a defined set of functionality for a defined set of users and a defined set of interfaces; and, you have version 2 with a revised specification. There's certainly no principle that version 2 must be a superset of version 1 functionality.

If, for example, in version 2 a group of users is no longer going to use the application (they have perhaps moved on to a more specific application for them - or perhaps that part of the business has been sold), then there is no obligation to include a revised specification for them.

Or, for example, much of the system may have moved from batch printing to email to communicate with customers. Do you have to include the old printing functionality in the new version, just in case the decision is reversed?

In truth, it's a moot point since you would have a certain budget and timescale for version 2 development and, in the sort of environment I worked in, there would be no possibility of adding unspecified backward compatibility to the solution.

We may be talking at cross purposes here.
A concrete example of backward compatibility is that original MS Word .doc files can be read and edited by MS Word 2016, even though the .docx file format has superseded the .doc format. The earlier versions of the product could not have anticipated the functionality of the later versions nearly as effectively as the later versions were able to accommodate the existing formats of their predecessors. I think that relying on an existing and ongoing commitment to some form of backward compatibility is more reasonable than trying to impose a come-what-may forward-compatibility requirement.
 
Last edited:

PeroK

A concrete example of backward compatibility is that original MS Word .doc files can be read and edited by MS Word 2016, even though the .docx file format has superseded the .doc format. The earlier versions of the product could not have anticipated the functionality of the later versions nearly as effectively as the later versions were able to accommodate the existing formats of their predecessors. I think that relying on an existing and ongoing commitment to some form of backward compatibility is more reasonable than trying to impose a come-what-may forward-compatibility requirement.
MS Word is not a business application. There must be hundreds of millions of users of Word. A typical business application that I'm talking about would have a small number of customers. Often only one.

Although, generally, my experience was in putting together software and hardware components from various sources. MS Word would be a standard off-the-shelf component.

Towards the end of my career, a general inability to distinguish between something like Word and a full-blown business application - perhaps one to manage hospital patient information - was at the root of several IT disasters.

Anyway, I'm out of the industry now, so I ought not to have an opinion anymore.
 
PeroK said:
MS Word is not a business application. There must be hundreds of millions of users of Word. A typical business application that I'm talking about would have a small number of customers. Often only one.
Many typical business application sets (e.g. accounts receivable, accounts payable, customer maintenance, general ledger, inventory control) that could run on a System/370 of 45 years ago could still run unchanged on a z/OS system today.
PeroK said:
Although, generally, my experience was in putting together software and hardware components from various sources. MS Word would be a standard off-the-shelf component.
Many of us tended to call that kind of activity 'cobbling' things together.
PeroK said:
Towards the end of my career, a general inability to distinguish between something like Word and a full-blown business application - perhaps one to manage hospital patient information - was at the root of several IT disasters.
That's just plain terrible, but it's sometimes hard to determine whether a fault is in vendor equipment or code, or in something in-house for which the customer is responsible.
PeroK said:
Anyway, I'm out of the industry now, so I ought not to have an opinion anymore.
That last line is clearly a non sequitur. The opinions of seasoned veterans should always be in the mix. I appreciate the idea of handing over the reins to the new guard; however, they would do well to take up the insights of the old guard.

It's interesting to me that you mention hospital patient information.

The term 'patient information' can refer to medical records regarding individual patients; however, in the normal parlance of hospital administration, 'patient information systems' are what the physician interacts with in order to produce the advisory information sheets given to the patient.

When I was doing Y2K work at a major hospital complex, the IBM mainframe for which I was the systems programmer, which had interfaces to multiple other systems, was running a database product that had to be upgraded to a then-new Y2K-compliant version. The new version had to be able to work with the prior version's set of databases, and to change all the 2-digit-year date fields to allow 4-digit years. The success of that upgrade depended fundamentally on anticipating, observing, and implementing backward compatibility, both before and after the conversion.
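The 2-digit-to-4-digit conversion described above is often done with a windowing rule. Here is a minimal Python sketch of the idea; the pivot value of 50 is my own illustrative choice, not the hospital system's actual rule.

```python
def widen_year(two_digit_year: int, pivot: int = 50) -> int:
    """Expand a 2-digit year to 4 digits with a sliding-window rule:
    values below the pivot become 20xx, the rest 19xx. The pivot of 50
    is an illustrative choice; a real conversion picks it per field,
    based on what range of dates that field can plausibly hold."""
    if two_digit_year < pivot:
        return 2000 + two_digit_year
    return 1900 + two_digit_year

def read_year(field: int) -> int:
    """Backward-compatible read: widen legacy 2-digit years on the way
    in, while already-converted 4-digit years pass through unchanged."""
    return field if field >= 1000 else widen_year(field)
```

The `read_year` wrapper is the backward-compatibility piece: the upgraded system can ingest records written by the old version and records it has already converted, without caring which it is handed.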
 
Tghu Verd
Please translate the following 2 sentences of yours into standard English without metaphor:
Without metaphor, eh? I considered writing this response as pseudocode but decided that would be unnecessarily cheeky, so...

Around the year 2000, IBM's product marketing assumed that their Z Series was a sufficiently compelling platform that it would entice clients to consolidate all their business computing needs onto it, not just the Z/OS ones. The mechanism for this was dedicated x86 hardware that allowed for Unix and Windows to be partitioned into the Z, all managed from a central software control console application. It included virtualization-type capabilities and resource sharing between operating systems.

IBM reps told us this presented an unbeatable offering, but for some reason IBM failed to appreciate that each class of computing community considered its needs separate and had no wish to be involved with the other. One Z Series admin told me there was no way a PC was going to "pollute" his mainframe, and that seemed to be a major stumbling block to the whole concept.

It seemed that a small number of clients adopted it, but it was not what the majority of the market wanted, and soon enough, promotion of the concept ceased.
 

PeroK

Tghu Verd said:
Without metaphor, eh? I considered writing this response as pseudocode but decided that would be unnecessarily cheeky, so...

Around the year 2000, IBM's product marketing assumed that their Z Series was a sufficiently compelling platform that it would entice clients to consolidate all their business computing needs onto it, not just the Z/OS ones. The mechanism for this was dedicated x86 hardware that allowed for Unix and Windows to be partitioned into the Z, all managed from a central software control console application. It included virtualization-type capabilities and resource sharing between operating systems.

IBM reps told us this presented an unbeatable offering, but for some reason IBM failed to appreciate that each class of computing community considered its needs separate and had no wish to be involved with the other. One Z Series admin told me there was no way a PC was going to "pollute" his mainframe, and that seemed to be a major stumbling block to the whole concept.

It seemed that a small number of clients adopted it, but it was not what the majority of the market wanted, and soon enough, promotion of the concept ceased.
Around this time my company was asked to submit a bid for a new reservations system based on an IBM mainframe offering. I volunteered to put the solution together (no one else would touch it, but I thought it might be quite interesting!). One problem was that our Data Centre pricing model was based on MIPS, and we had to quote the costs of the system for all possibilities, including very large transaction volumes. IBM had a good staggered pricing model for their products and licences, but our Data Centre people loaded the bid with astronomical support and operator costs. Their pricing was simply linear per MIPS.

I argued long and hard with our mainframe Data Centre people. I said to them: you keep telling us that the mainframe is competitive and when we try to put together a bid (at the customer's insistence) based on a mainframe solution, you load the bid with unjustifiable support and operations costs.

Anyway, it was ridiculously expensive compared to the Unix/Oracle alternative we were bidding against. It was a shame because I really believed the IBM mainframe hardware and software was a really good option. The mainframe, as platform, had a lot of advantages.

The UNIX/Oracle support teams (ironically, that was my background) had been forced to become more flexible and commercially aware. The mainframe people were "take-it-or-leave-it" dinosaurs. And that, not any failing of the IBM mainframe itself, was why we never submitted another solution for a new-system based on mainframe technology.
 
Tghu Verd
Anyway, it was ridiculously expensive compared to the Unix/Oracle alternative we were bidding against. It was a shame because I really believed the IBM mainframe hardware and software was a really good option. The mainframe, as platform, had a lot of advantages.
Agree with that, @PeroK, shame really, but the best tech doesn't always win. (Though IBM sold about $20B of Z-Series kit last year, so I guess "lose" is a relative term!)
 
Let me give an example of what I mean.

Suppose there's a very simple requirement: Write a program to recursively search a directory and count the number of "*.dll" files.

You could easily whip up a working solution using the C# DirectoryInfo class in about a minute.

However, you can think of this problem as a specific example of a more general problem of "Find some matching items in a possibly infinite tree of nodes containing items" and create abstractions like

// A reference to a piece of data that can be read on demand.
interface IDataReference<TData>
{
TData Read();
}

// A predicate over data references.
// (Note: a generic parameter can't itself be a constructed type like
// IDataReference<TData>, so the type parameter is just TData.)
interface IDataReferenceFilter<TData>
{
bool IsFiltered(IDataReference<TData> dataReference);
}

// A node in a possibly infinite tree of data references.
interface IDataNode<TData>
{
IEnumerable<IDataNode<TData>> Children { get; }

IEnumerable<IDataReference<TData>> Values { get; }
}


and then implementations like

// basically a wrapper over FileInfo (members elided)
class FileReference : IDataReference<Stream> { /* ... */ }

class FilePathFilter : IDataReferenceFilter<Stream> { /* ... */ }

// basically a wrapper over DirectoryInfo (members elided)
class FileDirectory : IDataNode<Stream> { /* ... */ }


but is it worth it?
 
What would Dilbert do?
Pointy-Haired Boss said:
I have a very simple requirement: Write a program to recursively search a directory and count the number of "*.dll" files.
Dilbert said:
Sure, Boss, simple requirement; simple solution -- here:
Code:
dir *.dll /s
No, I want just the count; not all that other stuff.
Dilbert said:
Do you want the program to say what the count is a count of, too, or just say the number?
I want you to stop trying to make me do your job. I want to say what I want, and then you figure out what I meant, and you come back with what I wanted. Is that clear?
Dilbert said:
Clear as fog, Boss; I'll get right on it.
Attaboy.
What, hypothetically, is the origin of the requirement in your example? Why would you need to write a program to do something that can be done with a single command? What's the real requirement?

Whether you provide a more abstract or general-purpose solution, or a more specific one, or simply re-use existing code that already solves the problem, should depend on the real requirements you're trying to address.
 
Last edited:
