A back-of-a-fag-packet calculation tells you that, if there is a 0.04 dB error (that is about 1%), the resulting error in the distance measurement at 100 km is about 2 km.
That's fascinating, but I meant this statement:
which is round about the limit of modern RF Power measurement accuracy.
How did you work out that 0.04dB is the limit of modern RF Power measurement accuracy?
I think you will find this is orders of magnitude off, even by the tech of the late 80s.
Analytic calculations tend to be more straightforward than numerical ones; it is very easy to check, and it seems right to me.
You don't spell it out, but your figures seem to imply that a power measurement error of 0.013 dB would give an error of 0.1 km in 65 km (yes?). Is that so very different from my simple result?
It's best to be accurate.
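To put actual numbers on both figures, here is a minimal sketch of the conversion, assuming a simple power-law path loss P ∝ d⁻ⁿ (the exponent n is an assumption on my part; neither figure above states a propagation model):

```python
def db_error_to_power_fraction(err_db):
    """Convert a dB-scale measurement error into a fractional power error."""
    return 10 ** (err_db / 10.0) - 1.0

def range_error_km(err_db, distance_km, path_loss_exponent=2.0):
    """Rough ranging error for a received-power fix, assuming P ~ d**-n,
    so that dP/P = -n * dd/d and |dd| ~ (|dP/P| / n) * d.
    n = 2 (free space) is an assumption; real ground-wave paths differ."""
    return db_error_to_power_fraction(err_db) / path_loss_exponent * distance_km

print(db_error_to_power_fraction(0.04))   # ~0.0093, i.e. roughly 1%
print(range_error_km(0.013, 65.0))        # ~0.1 km under the n = 2 assumption
print(range_error_km(0.04, 100.0))        # ~0.46 km; other exponents give other figures
```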
Why does 10 W of uncertainty in 1 kW of RF power upset you? Power, like many quantities, has a measurement error proportional to its value.
We're talking about high-precision engineering; this level of power instability would be associated with catastrophic failure rather than normal operation. It's just a misleading figure.
Anyway, we are unconcerned with the transmitter at this stage.
Your description of errors as being "transient" assumes that you are measuring over at least a year, if you want to attempt to cancel seasonal fluctuations. Is that really the serious intention?
Seasonal fluctuations can be mapped with a sweep from a network of known locations running around the clock. This is really a resources issue.
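As a toy illustration of that idea (the function names and the propagation model here are placeholders of my own, not details of the actual system): a monitor receiver at a surveyed range compares what it measures against what the model predicts, and that common-mode drift is subtracted from the unknown receiver's measurement.

```python
def predicted_loss_db(distance_km, model_loss_db_per_km):
    # Deliberately crude placeholder propagation model (loss in dB, positive).
    return model_loss_db_per_km * distance_km

def differentially_corrected_loss_db(measured_loss_unknown_db,
                                     measured_loss_reference_db,
                                     reference_distance_km,
                                     model_loss_db_per_km):
    # Whatever excess loss the reference station sees relative to the model
    # (seasonal / diurnal drift) is assumed common to both paths and removed.
    excess = measured_loss_reference_db - predicted_loss_db(
        reference_distance_km, model_loss_db_per_km)
    return measured_loss_unknown_db - excess
```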
A GPS satellite receiver will give you 10m accuracy about a minute after switch on. Who can supply all this data for analysis at the "processing stage"? You can't monitor all these variables - and the actual accuracy of monitoring is again limited.
Of course you can; it would take a high-performance computing lab and networks of dedicated hardware. It's not impossible, just expensive.
If you have no evidence of actual figures and cannot justify your claims of possible improvement in all the areas so far mentioned, then the system cannot work.
Again with the feasibility study... no one has asked you to answer this, nor would you be capable of answering it, given that you have never seen the hardware.
To improve the SNR by 3 dB, you need to analyse, in broad terms, for twice as long. The powers of two rapidly build up and give you a ridiculous required analysis time.
The wonders of supercomputing... it's faster than you think.
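For reference, the scaling itself is easy to write down; a quick sketch assuming the usual 3 dB of SNR improvement per doubling of observation time:

```python
def integration_time_multiplier(snr_gain_db):
    """Factor by which the observation time must grow to buy a given SNR
    improvement, assuming 3 dB per doubling of integration time."""
    return 2 ** (snr_gain_db / 3.0)

for gain_db in (3, 10, 20, 30, 40):
    print(f"+{gain_db:2d} dB -> {integration_time_multiplier(gain_db):>10,.0f}x the baseline time")
# +30 dB already needs ~1,000x the baseline observation time; +40 dB ~10,000x.
```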
A transmitter needs to be matched to its load. This is very difficult to achieve over a wide bandwidth. The result is always a 'frequency response' with undulations in the order of 1 dB. This is adequate for most applications (digital and analogue). I'm not sure where a breed of transmitter / matching network / feeder / matching network / antenna will come from for which the frequency response is much better than that. Don't tell me - it's been accounted for or it's a trivial problem. It is a relevant factor.
These are engineering issues; it comes down to the bandwidth of the antenna and accounting for the signal loss in the circuitry. This would probably be one of the areas with the most accurate experimental data, and with in-built sensors, real-time information could augment that.
Again, it's a matter of information, accounting for it, and processing overhead. It's not unsolvable.
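In the simplest case, "accounting for it" would look something like the sketch below (purely illustrative, not a description of the actual hardware): once the chain's ripple has been characterised, it is subtracted, bin by bin, from the received spectrum before any power comparison is made.

```python
import numpy as np

def equalise_received_spectrum_db(measured_db, chain_response_db):
    """Remove a characterised transmitter/matching/feeder/antenna frequency
    response (dB per frequency bin) from a received spectrum on the same bins.
    Only as good as the characterisation; residual ripple remains an error."""
    return np.asarray(measured_db, dtype=float) - np.asarray(chain_response_db, dtype=float)
```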
A transmitter at an altitude of 20km would be on a plane or balloon. They both move about a lot. How would the variations be measured in order to eliminate them? GPS, perhaps. Why not cut out the middle man and just use GPS?
The whole system is moving anyway: the satellites are in orbit and the Earth is rotating. Being in a balloon or an airplane won't make much difference.
200 m - 300 m underground?! What's that all about? If you mean under water then your available frequency bands are a bit limited. Submarines can use just a few tens of kHz whilst submerged.
It's just a requirement. This is why the sub-1000 Hz range was chosen: very deep penetration, little attenuation and little signal loss.
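For context on the "deep penetration" point, the standard skin-depth formula gives an idea of why sub-kHz is the band of interest; the conductivity values below are textbook orders of magnitude, not figures from this system:

```python
import math

MU0 = 4e-7 * math.pi  # permeability of free space, H/m

def skin_depth_m(frequency_hz, conductivity_s_per_m):
    """1/e attenuation depth in a conducting medium: sqrt(2 / (omega*mu*sigma))."""
    omega = 2.0 * math.pi * frequency_hz
    return math.sqrt(2.0 / (omega * MU0 * conductivity_s_per_m))

for label, sigma in (("typical soil/rock", 0.01), ("sea water", 4.0)):
    for f in (100.0, 1000.0):
        print(f"{label:17s} @ {f:6.0f} Hz: skin depth ~ {skin_depth_m(f, sigma):6.0f} m")
# Hundreds of metres in typical ground at these frequencies, but only metres
# to tens of metres in sea water.
```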
Also, you don't say where the receiver will be. There would be even more problems in characterising the conditions at the receiver - which will be changing, presumably, as it moves about over ground of varying conductivity, through air of varying temperature, and past obstacles that will cause multipath propagation / reflections / diffraction. Are these all going to be "accounted for"?
See post #15. Given the patterns, echoes can be disregarded for trilateration purposes, but could be used to fill in gaps due to noise. So they may prove more of a help than a hindrance at the processing stage.
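Post #15 isn't quoted here, but for anyone following along, the trilateration step itself is routine; a minimal least-squares sketch (with made-up coordinates, and saying nothing about how the echoes would actually be screened out):

```python
import numpy as np

def trilaterate_2d(anchors, ranges):
    """Least-squares 2-D position fix from >= 3 known transmitter positions and
    range estimates, linearised by differencing against the first anchor."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    x0, y0 = anchors[0]
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (np.sum(anchors[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2)
         - (ranges[1:] ** 2 - ranges[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example with made-up transmitter sites (km) and error-free ranges (km):
sites = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
truth = np.array([30.0, 60.0])
print(trilaterate_2d(sites, [np.hypot(*(truth - s)) for s in sites]))  # ~[30. 60.]
```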
Yes - DATA is what you need, and getting enough data to average out all the effects mentioned (and others) requires a long time and a lot of monitoring points. Even if you could process it all instantly, you still have to wait for it to build up. As has been said many times, it boils down to bandwidth / time. What time do you think you would need in order to reduce inherent variations, some of them of 'several' dB, to a total that would have to be in the order of 0.001 dB?
It doesn't matter at this point. It's not as long as you think, though.
One grouse you have had is the lack of numbers in the objections. Well, now you have some but you still say that any problem can be overcome.
Any problem can be quantified, broken down into manageable units and solved. It's the basis of science. So far, whilst you have listed some problems, they all have clear resolutions and associated costs.
You still haven't told the forum whether you have any practical Engineering (or Physics) experience which qualifies you to decide on the relevance of the many practical implications. Are you, in fact, anything more than a software developer? Some of my best friends (and family) are software developers, but they would not make wild assertions on engineering matters.
Ending on an ad hominem is a sign of a weak argument. This line of questioning will not change the facts.