# Sampling frequency

1. Jan 30, 2010

### nightworrier

hello everyone,

I have an exam question about sampling frequency. Honestly, I really don't get the concept, and I have limited resources.
Question:

The bandwidth of x(t) is 10 kHz.
y(t) is obtained by modulating x(t) with a sine wave of frequency 30 kHz.
y(t) is then sampled.
a) What is the minimum sampling frequency for y(t)?
b) z(t) = y(t) + s(t), where the bandwidth of s(t) is 40 kHz. What is the minimum sampling frequency for z(t)?

Could you help me find a solution?

2. Jan 31, 2010

### emanuel_hr

It depends on the modulation technique. If it is amplitude modulation, the spectrum of y(t) will stretch from 30 − 10 = 20 kHz up to 30 + 10 = 40 kHz, so for correct sampling (Nyquist–Shannon criterion, treating y(t) as a lowpass signal) you need to sample y(t) at a frequency of at least 2 × 40 = 80 kHz.
As for z(t), the answer is the same as above, since the spectrum of y(t) + s(t) doesn't extend beyond 40 kHz.
Hope this helps.
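The reasoning above can be checked numerically. This is just a sketch: the message tones at 2, 5 and 10 kHz are made-up stand-ins for a generic signal of 10 kHz bandwidth, and the fine simulation rate stands in for continuous time.

```python
import numpy as np

# Sketch: amplitude-modulating a band-limited signal x(t) (bandwidth 10 kHz)
# with a 30 kHz carrier shifts its spectrum into the 20-40 kHz band,
# so the lowpass Nyquist rate for y(t) is 2 * 40 = 80 kHz.

fs_sim = 1_000_000                      # fine simulation rate, Hz
t = np.arange(0, 0.01, 1 / fs_sim)

# Hypothetical band-limited "message": tones at 2, 5 and 10 kHz
x = (np.cos(2 * np.pi * 2e3 * t)
     + np.cos(2 * np.pi * 5e3 * t)
     + np.cos(2 * np.pi * 10e3 * t))

# DSB amplitude modulation with a 30 kHz carrier
y = x * np.cos(2 * np.pi * 30e3 * t)

# Find the occupied band: frequencies where the spectrum is non-negligible
Y = np.abs(np.fft.rfft(y))
f = np.fft.rfftfreq(len(y), 1 / fs_sim)
occupied = f[Y > 0.01 * Y.max()]

print(f"spectrum occupies {occupied.min()/1e3:.0f}-{occupied.max()/1e3:.0f} kHz")
print(f"lowpass Nyquist rate: {2 * occupied.max()/1e3:.0f} kHz")
```

The spectral peaks land at 30 ± 2, 30 ± 5 and 30 ± 10 kHz, i.e. between 20 and 40 kHz, confirming the 80 kHz figure. (Bandpass sampling could in principle go lower, but the question appears to expect the lowpass Nyquist rate.)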

3. Jan 31, 2010

### nightworrier

Thank you for your answer, but I have another question. If you can help me, I would really appreciate it.

Consider x1(t) and x2(t) with bandwidths B1 = 10 kHz and B2 = 15 kHz.
The signal y(t) = x1(t) - x2(t) is sampled.

Find the minimum sampling frequency for which it is possible to recover y(t) from its samples, (a) if the filter used for reconstruction is ideal, or (b) if it has a transfer function that is constant on [-20 kHz, 20 kHz] and a bandwidth of 35 kHz.
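One way to reason about this second question, sketched as code. Note the assumption (not stated in the thread) that "bandwidth 35 kHz" means the reconstruction filter's response is fully zero beyond 35 kHz, with a transition band between 20 and 35 kHz:

```python
# y(t) = x1(t) - x2(t): subtraction cannot widen the spectrum beyond
# the wider of the two signals, so the bandwidth of y(t) is
B = max(10e3, 15e3)          # 15 kHz

# (a) Ideal reconstruction filter: the plain Nyquist criterion applies.
fs_ideal = 2 * B             # 30 kHz

# (b) Non-ideal filter, flat on [-20, 20] kHz and (assumed) zero beyond
# 35 kHz. The passband already covers the 15 kHz signal; the binding
# constraint is that the first spectral image, which starts at fs - B,
# must fall entirely in the filter's stopband:
#     fs - B >= 35 kHz  =>  fs >= 50 kHz
f_stop = 35e3                # assumed stopband edge, Hz
fs_nonideal = f_stop + B     # 50 kHz

print(fs_ideal / 1e3, fs_nonideal / 1e3)
```

Under those assumptions the minimum rates come out to 30 kHz for the ideal filter and 50 kHz for the non-ideal one; if the 35 kHz figure is meant differently in your course notes, the guard-band condition `fs - B >= f_stop` changes accordingly.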