How Do You Connect an Antenna to an ADC Board for RF Undersampling?

In summary, the conversation revolves around starting a hobby project involving undersampling RF into an ADC. The individual is looking at a specific development board and is trying to understand how to connect the receiver antenna to it. There is a discussion about the necessary components, such as a SAW/BAW filter and a low-noise amplifier (LNA). The input of the ADC is ±2.5 V, and there is a question about knowing the voltage at the antenna and how much the LNA needs to amplify. The individual also shares that they are working with a 1090 MHz frequency and a desired distance of 1 to 30 miles. There is a suggestion to mix down the signal and then amplify it, but the individual mentions that the point of the project is to learn to undersample as close to the antenna as possible, as a kind of software-defined radio.
  • #1
sharted
Greetings all,

I am trying to start some undersampling RF to ADC hobby stuff. I am looking at this development board:
http://www.digikey.com/scripts/dksearch/dksus.dll?KeywordSearch?Keywords=ADC11C170HFEB&vendor=14

I am having a little trouble understanding how to connect the receiver antenna to the board. The board already has DC-blocking caps, 1:1 balun transformers, and a termination circuit feeding the differential inputs of the ADC (see the image below).

Obviously I need the antenna, a SAW/BAW filter, and probably a low-noise amplifier (LNA). However, the input of the ADC is ±2.5 V (approximately, depending on the reference). How do I know what voltage I will have at the antenna, and how much the LNA needs to amplify? Obviously I want the signal close to the peak for the best dynamic range, but I don't want to go over the max. Do I need an adjustable-gain amp with clipping detection or something? If so, can anyone give me some keywords for searching, or point me in the right direction?

Thanks
(here is the image of the input circuit on the development board)
http://img138.imageshack.us/img138/6084/adccircuit.jpg
 
  • #2
Your question is a bit like "how long is a piece of string?"
You would need to know the frequency, the field strength of the received signal, and the kind of antenna. However, you would be dealing with less than 1 mV for a usable 150 MHz off-air signal (say 1 mV/m), received with a half-wave dipole.
There's a calculator here: http://www.giangrandi.ch/electronics/anttool/field.html which will help you, once you know the frequency and field strength of your signal. (The receiver input voltage can be calculated assuming a 50 Ohm input and using P = V²/R.)

Your front-end amplification will need to be something like 60 dB (a voltage gain of about 1000), I think, so you will need some effective filtering to reject interfering signals in adjacent bands.
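
As a sanity check on that estimate, here is a minimal sketch of the same calculation (assuming a lossless, matched half-wave dipole and a 50 Ohm input, per the figures above):

```python
import math

def dipole_input_voltage(e_field_v_per_m, freq_hz, r_in_ohms=50.0):
    """Receiver input voltage for a half-wave dipole in a given field.

    Power density S = E^2 / 377, captured power P = S * Ae with
    Ae = 1.64 * wavelength^2 / (4*pi) for a lossless half-wave dipole,
    then V = sqrt(P * R) across a matched input (the P = V^2/R step above).
    """
    wavelength = 3.0e8 / freq_hz
    a_e = 1.64 * wavelength**2 / (4.0 * math.pi)   # effective aperture, m^2
    s = e_field_v_per_m**2 / 377.0                 # power density, W/m^2
    p = s * a_e                                    # received power, W
    return math.sqrt(p * r_in_ohms)                # volts (rms) at the input

# 1 mV/m at 150 MHz into 50 ohms -> about 0.26 mV, i.e. "less than 1 mV"
print(dipole_input_voltage(1e-3, 150e6))
```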
 
  • #3
Sorry if I didn't give all the specifications. I was looking for a sort of general answer, for the sake of those who would reply; I don't want to make people crunch a bunch of numbers just to answer the question.

However, if it helps, we are talking about 1090 MHz, with a desired distance of 1 to 30 miles, so about 95 to 125 dB of free-space loss (unless I calculated wrong). Transmitted power is in the neighborhood of 51 dBm, depending on transmitter cable loss and antenna gain. I don't know what receiver antenna I will use yet, but I guess it would range from 2 dBi to 6 dBi.

The above gives me a receive signal on the order of 1 mV, although I am still not sure how to properly amplify the signal. If the transmitter is close, I could get as much as 10 mV, but since I am not sure what remaining degradations are in my components (BPF/LNA/ADC/etc.), I am not really sure how weak a signal I can sense at the ADC (through all the introduced component/board noise). Once I have the final configuration of the system figured out, I could get a much better idea of my losses, but I have a lot of unknowns at this stage.
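
For reference, here is a rough link-budget sketch using the numbers above (51 dBm transmit, 2 to 6 dBi receive antenna, cable and filter losses ignored):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55 (d in m, f in Hz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def rx_voltage_rms(tx_dbm, distance_m, freq_hz, rx_gain_dbi=2.0, r_ohms=50.0):
    """Received RMS voltage across a 50-ohm input, ignoring cable/filter losses."""
    p_rx_dbm = tx_dbm - fspl_db(distance_m, freq_hz) + rx_gain_dbi
    p_rx_w = 10 ** (p_rx_dbm / 10.0) * 1e-3
    return math.sqrt(p_rx_w * r_ohms)

MILE = 1609.34
for miles, gain_dbi in [(1, 2.0), (30, 6.0)]:
    v = rx_voltage_rms(51.0, miles * MILE, 1.09e9, rx_gain_dbi=gain_dbi)
    print(f"{miles:>2} mi: FSPL = {fspl_db(miles * MILE, 1.09e9):.1f} dB, "
          f"Vrms = {v * 1e3:.3f} mV")
```

This comes out around 1.4 mV rms at 1 mile and roughly 70 µV rms at 30 miles, so about a millivolt is the right order for the near end of the range.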

So, since I don't expect much over 10 mV, should I shoot for a constant gain of 250 and just accept that farther transmitters will give me poorer dynamic range? Or should I try to design with a variable-gain amp?

Thanks
 
  • #4
Did you consider mixing down the signal and then amplifying it?
 
  • #5
Waht said: "Did you consider mixing down the signal and then amplifying it?"

Yeah, mixing it down would probably be the easiest way to do it. However, the point of the project is to teach myself how to undersample high frequencies as close to the antenna as I can get, so it is more of a software-defined radio: in theory, I would just need to change the sampling rate (maybe) and the filter/amp circuit at the antenna, and I could demod anything up to the low-GHz range.
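
For illustration, this is the arithmetic that decides where the carrier lands after undersampling (the ADC11C170 is rated around 170 Msps; the exact clock here is just an assumed example):

```python
def alias_freq(f_carrier_hz, f_sample_hz):
    """Where a carrier lands after undersampling: fold (f mod fs) about fs/2."""
    f = f_carrier_hz % f_sample_hz
    return f if f <= f_sample_hz / 2 else f_sample_hz - f

def nyquist_zone(f_carrier_hz, f_sample_hz):
    """Zone 1 = baseband (0..fs/2), zone 2 = fs/2..fs, and so on."""
    return int(f_carrier_hz // (f_sample_hz / 2)) + 1

fc, fs = 1090e6, 170e6           # 1090 MHz carrier, 170 Msps clock (assumed)
print(alias_freq(fc, fs) / 1e6)  # -> 70.0 MHz image in the first Nyquist zone
print(nyquist_zone(fc, fs))      # -> carrier sits in the 13th Nyquist zone
```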

So, if you have any help on my questions, I would appreciate it.

thanks much
 
  • #7
What is your reason for a chain of amps? I can find plenty of single RF amps with 25, 30, 35, and even 40 dB of gain that extend into the GHz range. The advantages I see are (1) that I could possibly combine two amps with very low noise figures to get the same total gain but a lower noise figure than a single amp, and (2) that I could mix and match amps to get exactly the gain I want.
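
On point (1), the improvement comes from the Friis cascade formula: with enough gain in the first stage, its noise figure dominates the total. A quick sketch (the gains and noise figures below are made-up example values, not real parts):

```python
import math

def cascade_nf_db(stages):
    """Friis formula: F = F1 + (F2-1)/G1 + (F3-1)/(G1*G2) + ...
    stages is a list of (noise_figure_dB, gain_dB) in signal-chain order."""
    f_total = 0.0
    gain_before = 1.0
    for i, (nf_db, gain_db) in enumerate(stages):
        f = 10 ** (nf_db / 10.0)
        f_total += f if i == 0 else (f - 1.0) / gain_before
        gain_before *= 10 ** (gain_db / 10.0)
    return 10 * math.log10(f_total)

# One 48 dB amp with a 3 dB NF vs. a 0.8 dB NF / 20 dB LNA followed by a
# noisier 28 dB second stage (illustrative figures only).
print(cascade_nf_db([(3.0, 48.0)]))               # 3.0 dB overall
print(cascade_nf_db([(0.8, 20.0), (3.0, 28.0)]))  # ~0.85 dB overall
```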

My biggest problem is knowing what gain I should choose. Obviously, with the ADC you want the highest dynamic range possible, so in theory the peak of my signal would be at 2.5 V all the time (as long as it's not AM modulated :P ). My problem is picking a gain so that I don't rail my amps at high power, but that is also high enough that I can still demod lower power. I suppose I could always conservatively design for a dipole antenna, and if my gain comes up short, switch to a Yagi or something :P
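
Putting rough numbers on that trade-off (assuming a 2.5 V peak full-scale input and the 1 mV / 10 mV extremes from earlier in the thread):

```python
import math

def gain_db_for_full_scale(v_in_peak, v_full_scale_peak=2.5):
    """dB of voltage gain needed to bring a peak input up to the ADC full scale
    (2.5 V peak assumed here, per the board's reference)."""
    return 20 * math.log10(v_full_scale_peak / v_in_peak)

g_strong = gain_db_for_full_scale(10e-3)  # size the fixed gain for ~10 mV peaks
g_weak = gain_db_for_full_scale(1e-3)     # gain the ~1 mV far-range signal would want
print(g_strong)                           # ~48 dB, i.e. a voltage gain of about 250
print(g_weak - g_strong)                  # ~20 dB: the weak signal sits 20 dB below full scale
```

At roughly 6 dB per bit, that 20 dB of back-off costs the far-range signal a bit over 3 of the ADC's 11 bits.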

So, is it worth going to a variable-gain amplifier (VGA), which usually comes with a higher noise figure? Or should I just make my best guess on a low-noise static gain?

[begin rant] If I am adding a VGA, I should probably re-spin the development board based on the application note, rather than adding such a complicated board to sit between the ADC board and the antenna. That leads to another headache: I can design boards just fine, but why open another can of worms in an already complicated project? [end rant]

Thanks!

P.S. Does anyone know anything about crystal oscillators for undersampling? Most oscillators are rated for frequency stability, but I think that just refers to how accurate the frequency is. For my purposes, I care less about how accurate it is (over the long term) and more about jitter (short-term variations). Do you think it is worth shelling out $180 for a TCXO with a frequency stability as low as 10 or 20 parts per billion (ppb)?
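
(For reference, the standard back-of-the-envelope check here is the aperture-jitter SNR limit, SNR = -20*log10(2*pi*f_in*t_j); a quick sketch with the 1090 MHz input and the 11-bit ADC from this thread:)

```python
import math

def jitter_limited_snr_db(f_in_hz, jitter_s_rms):
    """SNR ceiling set by clock aperture jitter: -20*log10(2*pi*f_in*t_j)."""
    return -20 * math.log10(2 * math.pi * f_in_hz * jitter_s_rms)

def max_jitter_for_snr(f_in_hz, snr_db):
    """RMS jitter allowed if you want a given SNR at input frequency f_in."""
    return 1.0 / (2 * math.pi * f_in_hz * 10 ** (snr_db / 20.0))

f_in = 1.09e9
print(jitter_limited_snr_db(f_in, 1e-12))         # 1 ps rms clock -> only ~43 dB
print(max_jitter_for_snr(f_in, 6.02 * 11 + 1.76)) # ~58 fs rms to avoid limiting 11 bits
```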
 

Related to How Do You Connect an Antenna to an ADC Board for RF Undersampling?

1. How does an antenna interface with an ADC?

An antenna interfaces with an ADC through an RF front-end circuit that includes amplifiers and filters to condition the analog signal picked up by the antenna before it is sent to the ADC, which then converts it into a digital signal for processing.

2. What is the purpose of interfacing an antenna to an ADC?

The purpose of interfacing an antenna to an ADC is to convert the analog radio frequency (RF) signal received by the antenna into a digital signal that can be processed by a computer or microcontroller. This allows for the signal to be analyzed, stored, or transmitted digitally.

3. What factors should be considered when interfacing an antenna to an ADC?

When interfacing an antenna to an ADC, factors such as the frequency range of the antenna, the impedance matching between the antenna and the ADC, and the gain and sensitivity of the antenna should be considered. Additionally, the type of ADC being used and its sampling rate should also be taken into account.
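
As a concrete example of the sampling-rate consideration: for undersampling, the sample rate must place the entire signal band inside a single Nyquist zone. A minimal check (the 10 MHz bandwidth and 170 Msps clock are illustrative values only):

```python
def band_fits_one_zone(f_center_hz, bandwidth_hz, f_sample_hz):
    """True if the whole band stays inside a single Nyquist zone (no fold-over)."""
    half_zone = f_sample_hz / 2
    lo, hi = f_center_hz - bandwidth_hz / 2, f_center_hz + bandwidth_hz / 2
    return int(lo // half_zone) == int(hi // half_zone)

# A 10 MHz-wide band centered at 1090 MHz with a 170 Msps clock
print(band_fits_one_zone(1090e6, 10e6, 170e6))  # True -> safe to undersample
```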

4. What are the common challenges in interfacing an antenna to an ADC?

Some common challenges in interfacing an antenna to an ADC include ensuring proper impedance matching between the antenna and the ADC, dealing with noise and interference, and selecting the appropriate sampling rate for the ADC. Other challenges may include designing a suitable RF front-end circuit and ensuring the accuracy and reliability of the digital signal conversion.

5. Are there any best practices for interfacing an antenna to an ADC?

Yes, there are several best practices for interfacing an antenna to an ADC. These include using a low-noise amplifier to improve the signal-to-noise ratio, implementing proper shielding and grounding techniques to reduce interference, and carefully selecting components with suitable frequency ranges and impedance characteristics. Additionally, following the manufacturer's guidelines and considering the specific requirements of the application can also help ensure a successful antenna-to-ADC interface.
