Back to the analogy. (I won't change the naming, but "hardware-aware guy" vs. "algorithm-only software guy" would be the more appropriate naming.)
The software guy is wrong… for reasons that run much deeper than FAPP (about which everyone has different opinions and judgments on what counts as practical and purposeful).
Even if computers were pure abstractions made only of bits, even if the software guy admits there is a finite number of those (it is probably possible to retrofit this into his mathematical abstractions/language), and even if his work is definitely helpful (e.g. evaluating complexities, building certainty for signal processing and cryptography, and more)… he will still miss something.
Computers are NOT "made of bits, and nothing else matters".
A helpful example within the analogy: the paradigm behind those "bits" is UMA (uniform memory access). It corresponds to a non-local paradigm in which all kinds of fantasies can be built, namely primitive objects like pointers and integers. Even worse, those are often abstracted away by even more "high-level" semantics (arrays and numbers). The result: one can build a program that halts after 2 weeks, and that will crash if the 'initial boundaries' don't fit within certain ranges.
The hardware guy will produce another program that halts in 42 seconds and accepts inputs an order of magnitude bigger, by carefully avoiding some singularities.
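To make the gap concrete, here is a minimal sketch (my own hypothetical illustration, not the actual programs from the analogy): two loops that compute the same sum over the same matrix. In the bits-only abstraction they are interchangeable; on real hardware the strided walk touches a new cache line on nearly every access and can be an order of magnitude slower.

```c
/* Two algorithmically identical sums over an N x N row-major
   matrix; only the traversal order differs. */
#include <stdio.h>
#include <stdlib.h>

#define N 4096

int main(void) {
    double *m = malloc((size_t)N * N * sizeof *m); /* 128 MiB */
    if (!m) return 1;
    for (size_t i = 0; i < (size_t)N * N; i++) m[i] = 1.0;

    double sum = 0.0;

    /* software-guy order: column-major walk over a row-major
       array, stride of N doubles, poor cache-line reuse */
    for (size_t col = 0; col < N; col++)
        for (size_t row = 0; row < N; row++)
            sum += m[row * N + col];

    /* hardware-guy order: row-major walk, stride of 1, every
       fetched cache line fully used before eviction */
    for (size_t row = 0; row < N; row++)
        for (size_t col = 0; col < N; col++)
            sum += m[row * N + col];

    printf("sum = %f\n", sum);
    free(m);
    return 0;
}
```

Time each loop in isolation (comment the other out) and the difference shows up immediately, even though nothing in the purely algorithmic description distinguishes them.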
This is still only a FAPP failure, so it is always easy for the software guy to claim equivalence of results. He probably even has a no-go theorem involving the cost of hardware vs. the cost of thinking more broadly.
Here is the trick. The hardware guy knows that UMA is a fiction and that information is not "just bits". He has precise knowledge of the cache latencies and sizes, of watts per cycle, even of silicon scarcity and the cost of the electricity bill if he is a really good one.
The result is that the deep understanding of those "ontologies" led him to develop a theoretical (logarithmic) framework/understanding, a specific domain that allows him to produce results that are simply impossible to obtain otherwise.
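"Knowing the machine" is not hand-waving, either; on Linux, glibc even exposes the cache geometry directly. A sketch assuming Linux/glibc: the `_SC_LEVEL*` names below are a glibc extension, not POSIX, and some systems report 0 for values they don't expose.

```c
/* Query the cache hierarchy that the UMA fiction pretends away.
   glibc extension: _SC_LEVEL*_CACHE_* are not part of POSIX. */
#include <stdio.h>
#include <unistd.h>

int main(void) {
    printf("L1d size: %ld bytes\n", sysconf(_SC_LEVEL1_DCACHE_SIZE));
    printf("L1d line: %ld bytes\n", sysconf(_SC_LEVEL1_DCACHE_LINESIZE));
    printf("L2 size:  %ld bytes\n", sysconf(_SC_LEVEL2_CACHE_SIZE));
    printf("L3 size:  %ld bytes\n", sysconf(_SC_LEVEL3_CACHE_SIZE));
    return 0;
}
```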
Caches weren't invented to solve a computing problem, but to solve a computer problem, one related to physics, not bits. Nowadays entire stacks of cache-aware software tools also exist, solely to manage that "ontological issue".
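A minimal sketch of what such tooling does under the hood: the classic blocked (tiled) matrix multiply. The arithmetic is identical to the naive triple loop; only the traversal order changes, so each tile stays resident in cache while it is reused. The tile size of 64 is an assumption for illustration; real tools derive it from the actual cache sizes.

```c
#include <stddef.h>

#define BLOCK 64 /* assumed tile size; tuned to L1/L2 in practice */

/* C += A * B, all n x n, row-major. Same operations as the naive
   triple loop, reordered tile by tile for cache residency. */
void matmul_blocked(size_t n, const double *A, const double *B,
                    double *C) {
    for (size_t ii = 0; ii < n; ii += BLOCK)
        for (size_t kk = 0; kk < n; kk += BLOCK)
            for (size_t jj = 0; jj < n; jj += BLOCK)
                for (size_t i = ii; i < ii + BLOCK && i < n; i++)
                    for (size_t k = kk; k < kk + BLOCK && k < n; k++) {
                        double a = A[i * n + k];
                        for (size_t j = jj; j < jj + BLOCK && j < n; j++)
                            C[i * n + j] += a * B[k * n + j];
                    }
}
```

Nothing about "bits" demands this shape; only the physical memory hierarchy does.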
The software guy has no incentive, nor even a way in principle, to discover those "useful" domains within the more or less infinite space of algorithms. He could even stumble upon one by luck and still not understand what it would be useful for.
There is also the caveat of being too focused on ontology. In computer science the saying is "Premature optimization is the root of all evil". And this is true.
The other, less vocalized truth is "Not caring about optimization is the other root of all evil". I've seen more projects fail because of the second than because of the first.