1. The problem statement, all variables and given/known data

A radar transmitter used to measure the speed of pitched baseballs emits pulses of 2.0 cm wavelength that are 0.25 μs in duration.
(a) What is the length of the wave packet produced?
(b) To what frequency should the receiver be tuned?
(c) What must be the minimum bandwidth of the receiver?

3. The attempt at a solution

I think I have (a) right: the length of the wave packet is the pulse duration times the speed of the wave, L = c·t = 75 m.

(b) This is the part I'm not sure about. Must the receiver be tuned to the group frequency or the phase frequency? If I'm thinking about this right, these pulses should make beats. The group frequency would be 1/(0.25 μs), right? That's the frequency of the modulation envelope. The phase frequency would be c/0.02 m, right? That's the frequency of the wave 'inside' the modulation envelope. Or am I completely misunderstanding the problem?

(c) The bandwidth, if I'm not mistaken, is Δω (the spread in angular frequency), with Δω·Δt ≈ 1. Would Δt here just be the 0.25 μs?

If anyone could tell me whether I'm doing this right and help me out, I would appreciate it.
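Not an answer, just a quick numeric sanity check of the numbers above. This sketch assumes c ≈ 3×10⁸ m/s and uses the Δf·Δt ≈ 1 convention for the bandwidth (rather than Δω·Δt ≈ 1, which would divide the result by 2π):

```python
# Assumed givens from the problem statement:
c = 3.0e8      # speed of light, m/s (rounded)
lam = 0.02     # carrier wavelength, m (2.0 cm)
dt = 0.25e-6   # pulse duration, s (0.25 microseconds)

L = c * dt            # wave-packet length: speed times pulse duration
f_carrier = c / lam   # carrier ("phase") frequency, c / wavelength
df = 1.0 / dt         # minimum bandwidth from df * dt ~ 1

print(f"packet length L       = {L:.0f} m")              # 75 m
print(f"carrier frequency     = {f_carrier/1e9:.1f} GHz")  # 15.0 GHz
print(f"minimum bandwidth ~   = {df/1e6:.1f} MHz")       # 4.0 MHz
```

The 75 m for (a) checks out, and the carrier frequency (15 GHz) is what the receiver would be tuned to for (b); 1/(0.25 μs) = 4 MHz is the modulation/bandwidth scale for (c).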