There is no agreed answer to your question. But at least the question can be clearly stated.
As you say, the guts of it is whether consciousness is "computational" or whether it "involves something more".
If it is computational - a software pattern - then the presumption that follows is that it could run, in principle, on any kind of hardware. That hardware could be biological, or it could be anything else doing the same job. You could build it from silicon chips, and if you reproduced the same patterns, your machine would be aware.
Now we can already say that the brain, as a biological organ, certainly looks computational in many respects. Neurons fire in ways that look like digital signals. Axons conduct those signals point-to-point. It looks like information processing.
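To see why spiking looks "digital", here is a minimal sketch of a leaky integrate-and-fire neuron - the textbook toy model of this behaviour. The function name and all parameter values are my own illustrative choices, not drawn from any particular model in the neuroscience literature; the point is only that analogue input integrates into an all-or-nothing output.

```python
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Return a 0/1 spike train for a stream of input currents (toy model)."""
    v = 0.0           # membrane potential
    spikes = []
    for i in inputs:
        v = leak * v + i      # leaky integration of incoming signal
        if v >= threshold:    # all-or-nothing firing: the "digital" part
            spikes.append(1)
            v = 0.0           # reset after a spike
        else:
            spikes.append(0)
    return spikes

# Graded analogue input, discrete spike output:
print(simulate_lif([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
# → [0, 0, 0, 1, 0, 0, 1]
```

Of course, this simplicity is exactly what is at issue: the sketch captures the signalling abstraction, while real neurons are soaked in chemistry the abstraction throws away.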
So the question is whether it is "pure computation" (rather than pure biology). Or whether it is something more entangled, more subtle, more difficult (even impossible) to replicate at the hardware level.
We can imagine the brain is crisply divided by the software/hardware distinction - as it is in the Turing machine, which is the basis of our concept of computation - but it is then a big if whether the brain actually is divided in this "pure" fashion.
I believe, having studied this issue in depth, that the mind is NOT pure computation. It is not a Turing machine. There is no clean software/hardware divide that would allow you to identify some set of consciousness algorithms you could pick up (or download) and implement on some other general-purpose hardware.
This is simply a wrong mental image, although a very common one.
Instead, we should focus on generalising our notion of consciousness as a living process. So forget the machine model that comes from technology and apply some biological thinking instead.
Theoretical biology explains living systems with a whole cluster of concepts - anticipation, adaptation, semiosis, dissipation, hierarchy, modelling relations - that make consciousness seem much less of a mystery.
So computer analogies are fine as far as they go. Which isn't very far when it comes to living systems. Life needs to be described in its own terms at the end of the day.
For this reason, I think you make a big mistake when you say consciousness seems too vague and hard to define, so the conversation should switch to qualia instead.
The idea of qualia is that there can be such a thing as "atoms of experience". Now you have not just a software/hardware divide, but a dualistic mind/matter divide. Effectively you have painted yourself into an intellectual corner with no way out (except by all sorts of crackpot magic like quantum coherence, panpsychism, etc).
So it is right to think that consciousness is a rather squishy and ill-defined concept once you examine it. You have started trying to generalise something specific - your current state of experience - to something more general, your brain biology. So you just need to keep going down that path, using actual concepts from biology.
BTW, Dennett ain't much of a biologist, despite writing a lot about Darwin machines. It gives the game away that he treats biology as "just machinery".