I'm becoming a much better programmer, but maybe not a faster one

  • Thread starter: SlurrerOfSpeech
Summary:
After five years of experience, there's a notable improvement in software design skills, particularly in areas like abstraction and dependency management. However, this increased proficiency has not translated into faster delivery of work items, as the time taken remains the same despite the added complexity of code. The discussion highlights the potential pitfalls of excessive abstraction, which can lead to fragile systems that are difficult to maintain and extend. While future-proofing through abstraction can be beneficial, it often does not account for unforeseen changes in project scope, leading to significant costs. Ultimately, simplicity and directness in code are emphasized as essential for effective software development.
  • #31
Tghu Verd said:
Nope, not saying that and didn't say that, you're reading something else into my few words. It was exactly the opposite, the mainframe was meant to subsume your PC hardware.
Please translate the following 2 sentences of yours into standard English without metaphor:
Tghu Verd said:
Ah yes, the platform that IBM spruiked as their "Highlander" - there need only be one. Pretty much jumped the shark when you could install X86 blades and run Windows apps.
 
  • #32
sysprog said:
Indeed there is, and in general, the backward compatibility paradigm applies to both.

For a lot of business applications there is no concept of backward compatibility. You have version 1 with a defined set of functionality for a defined set of users and a defined set of interfaces; and, you have version 2 with a revised specification. There's certainly no principle that version 2 must be a superset of version 1 functionality.

If, for example, in version 2 a group of users is no longer going to use the application (they have perhaps moved on to a more specific application for them - or perhaps that part of the business has been sold), then there is no obligation to include a revised specification for them.

Or, for example, much of the system may have moved from batch printing to email to communicate with customers. Do you have to include the old printing functionality in the new version, just in case the decision is reversed?

In truth, it's a moot point since you would have a certain budget and timescale for version 2 development and, in the sort of environment I worked, there would be no possibility of adding unspecified backward compatibility to the solution.

We may be talking at cross purposes here.
 
  • #33
PeroK said:
For a lot of business applications there is no concept of backward compatibility. You have version 1 with a defined set of functionality for a defined set of users and a defined set of interfaces; and, you have version 2 with a revised specification. There's certainly no principle that version 2 must be a superset of version 1 functionality.

If, for example, in version 2 a group of users is no longer going to use the application (they have perhaps moved on to a more specific application for them - or perhaps that part of the business has been sold), then there is no obligation to include a revised specification for them.

Or, for example, much of the system may have moved from batch printing to email to communicate with customers. Do you have to include the old printing functionality in the new version, just in case the decision is reversed?

In truth, it's a moot point since you would have a certain budget and timescale for version 2 development and, in the sort of environment I worked, there would be no possibility of adding unspecified backward compatibility to the solution.

We may be talking at cross purposes here.
A concrete example of backward compatibility is that original MS Word .doc files can be read and edited by MS Word 2016, even though the .docx file format has superseded the .doc format. The earlier versions of the product could not have been built with anticipation of the newer functionalities of the later versions as effectively as the later versions were able to accommodate the existing formats of their predecessors. I think that reliance on an existing and ongoing commitment to some form of backward compatibility is more reasonable than trying to impose a come-what-may forward compatibility requirement.
 
  • #34
sysprog said:
A concrete example of backward compatibility is that original MS Word .doc files can be read and edited by MS Word 2016, even though the .docx file format has superseded the .doc format. The earlier versions of the product could not have been built with anticipation of the newer functionalities of the later versions as effectively as the later versions were able to accommodate the existing formats of their predecessors. I think that reliance on an existing and ongoing commitment to some form of backward compatibility is more reasonable than trying to impose a come-what-may forward compatibility requirement.
MS Word is not a business application. There must be hundreds of millions of users of Word. A typical business application that I'm talking about would have a small number of customers. Often only one.

Although, generally, my experience was in putting together software and hardware components from various sources. MS Word would be a standard off-the-shelf component.

Towards the end of my career, a general inability to distinguish between something like Word and a full-blown business application - perhaps to manage hospital patient information - was at the root of several IT disasters.

Anyway, I'm out of the industry now, so I ought not to have an opinion anymore.
 
  • #35
PeroK said:
MS Word is not a business application. There must be hundreds of millions of users of Word. A typical business application that I'm talking about would have a small number of customers. Often only one.
Many typical business application sets (e.g. accounts receivable, accounts payable, customer maintenance, general ledger, inventory control) that could run on a System/370 of 45 years ago, could still run unchanged on a z/OS system of today.
Although, generally, my experience was in putting together software and hardware components from various sources. MS Word would be a standard off-the-shelf component.
Many of us tended to call that kind of activity 'cobbling things together'.
Towards the end of my career, a general inability to distinguish between something like Word and a full-blown business application - perhaps to manage hospital patient information - was at the root of several IT disasters.
That's just plain terrible, but it's sometimes hard to determine whether a fault is in vendor equipment or code, or in something in-house for which the customer is responsible.
Anyway, I'm out of the industry now, so I ought not to have an opinion anymore.
That last line is clearly a non sequitur. The opinions of seasoned veterans should always be in the mix. I appreciate the idea of handing over the reins to the new guard; however, they would do well to take up the insights of the old guard.

It's interesting to me that you mention hospital patient information.

The term 'patient information' can refer to medical records regarding individual patients; however, in the normal parlance of hospital administration, 'patient information systems' are what the physician interacts with in order to produce the advisory information sheets given to patients.

When I was doing Y2K work at a major hospital complex, the IBM mainframe for which I was the systems programmer, and which had interfaces to multiple other systems, was running a database product that had to be upgraded to a then-new Y2K-compliant version. The new version had to work with the prior version's set of databases and change all the 2-digit-year date fields to allow 4-digit years. The success of that upgrade depended fundamentally on anticipating, observing, and implementing backward compatibility before and after the change.
 
  • #36
sysprog said:
Please translate the following 2 sentences of yours into standard English without metaphor:

Without metaphor, eh? I considered writing this response as pseudocode but decided that would be unnecessarily cheeky, so...

Around the year 2000, IBM's product marketing assumed that their Z Series was a sufficiently compelling platform that it would entice clients to consolidate all their business computing needs onto it, not just the Z/OS ones. The mechanism for this was dedicated x86 hardware that allowed for Unix and Windows to be partitioned into the Z, all managed from a central software control console application. It included virtualization-type capabilities and resource sharing between operating systems.

IBM reps told us this presented an unbeatable offering, but for some reason, IBM failed to appreciate that each class of computing community considered their needs separate and had no wish to be involved in the other. One Z Series Admin told me there was no way a PC was going to "pollute" his mainframe, and that seemed to be a major stumbling block to the whole concept.

It seemed that a small number of clients adopted this, but it was not what the majority of the market wanted, and soon enough, promotion of this concept ceased.
 
  • #37
Tghu Verd said:
Without metaphor, eh? I considered writing this response as pseudocode but decided that would be unnecessarily cheeky, so...

Around the year 2000, IBM's product marketing assumed that their Z Series was a sufficiently compelling platform that it would entice clients to consolidate all their business computing needs onto it, not just the Z/OS ones. The mechanism for this was dedicated x86 hardware that allowed for Unix and Windows to be partitioned into the Z, all managed from a central software control console application. It included virtualization-type capabilities and resource sharing between operating systems.

IBM reps told us this presented an unbeatable offering, but for some reason, IBM failed to appreciate that each class of computing community considered their needs separate and had no wish to be involved in the other. One Z Series Admin told me there was no way a PC was going to "pollute" his mainframe, and that seemed to be a major stumbling block to the whole concept.

It seemed that a small number of clients adopted this, but it was not what the majority of the market wanted, and soon enough, promotion of this concept ceased.

Around this time my company was asked to submit a bid for a new reservations system based on an IBM mainframe offering. I volunteered to put the solution together (no one else would touch it, but I thought it might be quite interesting!). One problem was that our Data Centre pricing model was based on MIPS, and we had to quote the costs of the system for all possibilities, including very large transaction volumes. The quoted costs were astronomical. IBM had a good staggered pricing model for their products and licences, but our Data Centre people loaded the bid with exorbitant support and operator costs - simply linear per MIPS.

I argued long and hard with our mainframe Data Centre people. I said to them: you keep telling us that the mainframe is competitive and when we try to put together a bid (at the customer's insistence) based on a mainframe solution, you load the bid with unjustifiable support and operations costs.

Anyway, it was ridiculously expensive compared to the Unix/Oracle alternative we were bidding against. It was a shame, because I genuinely believed the IBM mainframe hardware and software was a really good option. The mainframe, as a platform, had a lot of advantages.

The UNIX/Oracle support teams (ironically, that was my background) had been forced to become more flexible and commercially aware. The mainframe people were "take-it-or-leave-it" dinosaurs. And that, not any failing of the IBM mainframe itself, was why we never submitted another solution for a new system based on mainframe technology.
 
  • #38
PeroK said:
Anyway, it was ridiculously expensive compared to the Unix/Oracle alternative we were bidding against. It was a shame, because I genuinely believed the IBM mainframe hardware and software was a really good option. The mainframe, as a platform, had a lot of advantages.

Agree with that, @PeroK, shame really, but the best tech doesn't always win. (Though IBM sold about $20B of Z-Series kit last year, so I guess "lose" is a relative term!)
 
  • #39
Let me give an example of what I mean.

Suppose there's a very simple requirement: Write a program to recursively search a directory and count the number of "*.dll" files.

You could easily whip up a working solution using the C# DirectoryInfo class in about a minute.
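For illustration, a minimal sketch of that direct approach (the path below is a placeholder, not from the original post) might look like this:
Code:
using System;
using System.IO;

class DllCounter
{
    static void Main()
    {
        // Recursively count the *.dll files under a directory.
        var root = new DirectoryInfo(@"C:\projects");
        int count = root.GetFiles("*.dll", SearchOption.AllDirectories).Length;
        Console.WriteLine($"{count} DLL files found");
    }
}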

However, you can think of this problem as a specific example of a more general problem of "Find some matching items in a possibly infinite tree of nodes containing items" and create abstractions like

interface IDataReference<TData>
{
    TData Read();
}

interface IDataReferenceFilter<TData>
{
    bool IsFiltered(IDataReference<TData> dataReference);
}

interface IDataNode<TData>
{
    IEnumerable<IDataNode<TData>> Children { get; }

    IEnumerable<IDataReference<TData>> Values { get; }
}


and then implementations like

// basically a wrapper over FileInfo
class FileReference : IDataReference<Stream> { }

class FilePathFilter : IDataReferenceFilter<Stream> { }

// basically a wrapper over DirectoryInfo
class FileDirectory : IDataNode<Stream> { }


but is it worth it?
 
  • #40
What would Dilbert do?
Pointy-Haired Boss said:
I have a very simple requirement: Write a program to recursively search a directory and count the number of "*.dll" files.
Dilbert said:
Sure, Boss, simple requirement; simple solution -- here:
Code:
dir *.dll /s
Pointy-Haired Boss said:
No, I want just the count; not all that other stuff.
Dilbert said:
Do you want the program to say what the count is a count of, too, or just say the number?
Pointy-Haired Boss said:
I want you to stop trying to make me do your job. I want to say what I want, and then you figure out what I meant, and you come back with what I wanted. Is that clear?
Dilbert said:
Clear as fog, Boss; I'll get right on it.
Pointy-Haired Boss said:
Attaboy.
What, hypothetically, is the origin of the requirement in your example? Why would you need to write a program to do something that can be done with a single command? What's the real requirement?

Whether you provide a more abstract or general-purpose solution, or a more specific one, or simply re-use existing code that already solves the problem, should depend on the real requirements you're trying to address.
 
  • #41
There is a difference between what I called "future proofing" and backward compatibility.

Future proofing is when you try to include features or elements to support unknown future requirements. For example, including the header size and the version number of the file format in a data file header will have no use in version 1.00 of the code - but it will allow backward compatibility in later versions.
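As a sketch of that idea (the field layout and names here are hypothetical, not from any particular product), a version 1.00 writer might emit something like this:
Code:
using System.IO;

// Hypothetical data file header that records its own size and the
// file-format version, even though version 1.00 has no use for them yet.
static class FileHeader
{
    public const int HeaderSize = 8;      // bytes: size (4) + major (2) + minor (2)
    public const short MajorVersion = 1;
    public const short MinorVersion = 0;

    public static void Write(BinaryWriter writer)
    {
        writer.Write(HeaderSize);
        writer.Write(MajorVersion);
        writer.Write(MinorVersion);
        // Payload follows; a later reader can branch on the version fields
        // to stay backward compatible with files written by version 1.00.
    }
}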

Backward compatibility means that newer revisions of the application(s) will support older user data sets (data files, scripts, programming, etc). This can be done either natively or with conversion tools. For example, the latest versions of Word can still read the earliest Word files - but to edit them, they need to convert them to the newer format.
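Continuing the hypothetical header sketch above, a later reader could support older files either natively or by converting them:
Code:
using System.IO;

// Hypothetical reader: newer code accepts files written by older versions,
// reading them natively or converting them where the formats differ.
static class FileLoader
{
    public static Document Load(BinaryReader reader)
    {
        int headerSize = reader.ReadInt32();   // lets the reader skip header fields it doesn't know about
        short major = reader.ReadInt16();
        short minor = reader.ReadInt16();

        return major >= 2
            ? ReadCurrentFormat(reader)        // native support for the current format
            : ConvertFromVersion1(reader);     // one-off conversion of the old format
    }

    static Document ReadCurrentFormat(BinaryReader reader) => new Document();
    static Document ConvertFromVersion1(BinaryReader reader) => new Document();
}

class Document { }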

It was also mentioned earlier in this thread that operating systems are different from most business applications. The key difference as it relates to backward compatibility is the degree to which application developers have control over the existing data sets that are supported by the application. When developing something like Word, there is never any possibility of going out and converting all Word files to the latest format. But in many business situations, there is only a single database and it is completely practical to include all the current applications that support it with each backup of that data set. In such a case, a one-off database conversion program is all that is needed to assure system continuity whenever those applications are updated.
 
  • #42
sysprog said:
What would Dilbert do?

What, hypothetically, is the origin of the requirement in your example? Why would you need to write a program to do something that can be done with a single command? What's the real requirement?

Whether you provide a more abstract or general-purpose solution, or a more specific one, or simply re-use existing code that already solves the problem, should depend on the real requirements you're trying to address.
One of my rules of survival on the job: If your boss asks you to do something, and it is easy to do, then do it -- quickly -- without questions or arguments. ;>)
 
  • #43
sysprog said:
What would Dilbert do?

That's a great sequence, and it pretty much illustrates what Agile software development tries to solve from a requirements perspective. Whether Agile works depends on a lot of local factors, but the concept of getting the people who want something closer to the team doing the work - and delivering incremental improvements faster - is a good one.

In terms of @.Scott's future proofing, I've found that hard to design for. Perhaps I'm poor at predicting the future, but apart from simple aspects such as global variables and self-contained components where possible, any 'feature' that I thought would be worth lobbing in on a "just in case" basis was wasted time. I figured that was just me, but the theme of future proofing being a waste of time seems common in dev forums, and Steve Konves' blog on the topic seems a good summary.
 
  • #44
One way to think about it is in terms of objects, instances, classes and interfaces as a means to future proof your code. Design to the interface, and then classes that implement the interface can be swapped out for better ones without changing your overall logic. Also consider designing with the model-view-controller pattern, where the model holds the data that your program needs, the view asks the model for whatever data it needs to display, and the controller handles all the event activity going on.
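A minimal sketch of the design-to-interface idea (the names here are made up for illustration):
Code:
using System;
using System.IO;

// Callers depend only on this interface...
interface IFileCounter
{
    int Count(string root, string pattern);
}

// ...so the implementation can be swapped for a better one without
// changing the calling logic.
class RecursiveFileCounter : IFileCounter
{
    public int Count(string root, string pattern) =>
        Directory.GetFiles(root, pattern, SearchOption.AllDirectories).Length;
}

class Demo
{
    static void Main()
    {
        IFileCounter counter = new RecursiveFileCounter(); // swap implementations here
        Console.WriteLine(counter.Count(@"C:\projects", "*.dll"));
    }
}
Here the concrete counter could later be replaced by, say, a cached or parallel implementation without touching the code that calls it.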
 
