Understanding Normalized Filters and Their Impact on Waveform Distortion

AI Thread Summary
Normalized filters are designed with component values that allow for a corner frequency of 1 rad/sec, enabling easy scaling for different frequencies. Understanding the significance of phase shifts is crucial, as they can cause distortion in the output waveform when filtering signals with varying frequencies. To maintain the original waveform shape, filters must ensure that phase shifts increase linearly with frequency, providing equal time delays for all components. This approach prevents distortion by treating all frequencies uniformly, akin to a delay line. Learning to calculate filter values independently is beneficial for verifying software outputs and understanding filter behavior.
jendrix
Hello, I am learning about filters but I'm having trouble understanding what 'normalised' means and its significance. I am using a low-pass filter (LPF) in a project, and whilst there is software that will design it for you, I would also like to learn to calculate the values myself and understand the significance of adjusting component values.

On a second note, this got me wondering: if a filter incorporates a phase shift that changes as a function of frequency, wouldn't the output waveform differ significantly from the input?
 
jendrix said:
if a filter incorporates a phase shift that changes as a function of frequency, wouldn't the output waveform differ significantly from the input?
Yes, if you filter a waveform having components of different frequencies, it can undergo distortion, so the output wave shape may look very different from the input wave shape. Preserving the wave shape requires a filter whose phase shift increases linearly with frequency across the passband, giving an equal TIME SHIFT to the various components. Such a filter acts as a delay line.
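A quick numerical sketch (an illustration, not from the thread) of why a common time delay, rather than a common phase shift, preserves the wave shape:

```python
import numpy as np

# A two-component test signal: 10 Hz plus a weaker 15 Hz.
f1, f2 = 10.0, 15.0
tau = 0.025  # common time delay, 25 ms

t = np.linspace(0.0, 1.0, 10001)
x = np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

# Give each component the phase shift corresponding to the SAME time
# delay tau: phi = 2*pi*f*tau (a linear-phase "filter").
y_linear = (np.sin(2 * np.pi * f1 * t - 2 * np.pi * f1 * tau)
            + 0.5 * np.sin(2 * np.pi * f2 * t - 2 * np.pi * f2 * tau))

# The output is just the input shifted in time: y(t) = x(t - tau).
x_delayed = (np.sin(2 * np.pi * f1 * (t - tau))
             + 0.5 * np.sin(2 * np.pi * f2 * (t - tau)))
assert np.allclose(y_linear, x_delayed)

# By contrast, giving BOTH components the same 90-degree phase shift
# (instead of the same time delay) changes the wave shape.
y_const_phase = (np.sin(2 * np.pi * f1 * t - np.pi / 2)
                 + 0.5 * np.sin(2 * np.pi * f2 * t - np.pi / 2))
assert not np.allclose(y_const_phase, x_delayed)
```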
 
jendrix said:
Hello, I am learning about filters but I'm having trouble understanding what 'normalised' means and its significance. I am using a low-pass filter (LPF) in a project, and whilst there is software that will design it for you, I would also like to learn to calculate the values myself and understand the significance of adjusting component values.

On a second note, this got me wondering: if a filter incorporates a phase shift that changes as a function of frequency, wouldn't the output waveform differ significantly from the input?

There are low-pass filter tables which give you "normalized" component values, which means the given values produce a corner frequency of 1 rad/sec.
Using a simple scaling process, you can use these data to find the component values for any desired corner frequency (the end of the passband).

As to the second question: of course, frequency-dependent amplitude changes are accompanied by a corresponding phase shift.
However, speaking of input and output waveforms, it is primarily the frequency dependence of the various signal amplitudes within the spectrum of the applied wave that is responsible for the shape of the output waveform.
Example: a bandpass-filtered square wave gives an output signal which looks, more or less, like a sinusoidal wave.
 
jendrix said:
I would also like to learn to calculate the values for myself and learn the significance of adjusting component values.
Bravo for you! Learn to calculate them yourself so you know when a computer code is putting out gibberish.

"Normalize" usually means dividing by a base value, perhaps the center frequency for a bandpass, so that the example they're using works in percent or multiples of the base instead of the actual numbers.
Analogous to the "Per Unit" method used in power systems analysis.

As LvW said - it's just scaling.
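A sketch of that scaling for a concrete case: the doubly-terminated second-order Butterworth low-pass has the standard table values g1 = g2 = √2 (farads/henries, normalized to 1 Ω and 1 rad/s), and the usual frequency/impedance scaling rules then give real part values. The 50 Ω impedance level below is an assumed example, not from the thread:

```python
import math

# Normalized 2nd-order Butterworth low-pass prototype (1 ohm, 1 rad/s):
# standard table values g1 = g2 = sqrt(2).
g1, g2 = math.sqrt(2), math.sqrt(2)  # normalized C (F) and L (H)

# Standard frequency/impedance denormalization rules.
def scale_C(C_norm, omega_c, R0):
    return C_norm / (omega_c * R0)

def scale_L(L_norm, omega_c, R0):
    return L_norm * R0 / omega_c

omega_c = 100.0   # rad/s, the corner frequency discussed in the thread
R0 = 50.0         # ohms, an assumed impedance level

C = scale_C(g1, omega_c, R0)  # about 283 microfarads
L = scale_L(g2, omega_c, R0)  # about 0.707 henries
```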
 
LvW said:
There are low-pass filter tables which give you "normalized" component values, which means the given values produce a corner frequency of 1 rad/sec.
Using a simple scaling process, you can use these data to find the component values for any desired corner frequency (the end of the passband).

As to the second question: of course, frequency-dependent amplitude changes are accompanied by a corresponding phase shift.
However, speaking of input and output waveforms, it is primarily the frequency dependence of the various signal amplitudes within the spectrum of the applied wave that is responsible for the shape of the output waveform.
Example: a bandpass-filtered square wave gives an output signal which looks, more or less, like a sinusoidal wave.
Hello, I don't suppose you have any resources for this, do you? It seems finding the component values yourself isn't typically done anymore, but I would still like to understand the process.

Thanks
 
jim hardy said:
Bravo for you! Learn to calculate them yourself so you know when a computer code is putting out gibberish.

"Normalize" usually means dividing by a base value, perhaps the center frequency for a bandpass, so that the example they're using works in percent or multiples of the base instead of the actual numbers.
Analogous to the "Per Unit" method used in power systems analysis.

As LvW said - it's just scaling.

I'm starting to understand. I have been researching Butterworth filters for a project. It will be a low pass with a corner frequency of 100 rad/sec. I assume scaling up isn't as simple as taking the normalised values and multiplying by 100?

Thanks
 
You are close...

Consider a simple RC low-pass filter. If you multiply the component values by 100, the corner frequency goes down, not up.
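A minimal sketch of that point, using the single-pole RC corner frequency ω_c = 1/(RC) in rad/s:

```python
# Corner frequency of a simple RC low-pass, in rad/s.
def corner(R, C):
    return 1.0 / (R * C)

# Normalized prototype: 1 ohm, 1 farad -> 1 rad/s.
R, C = 1.0, 1.0
assert corner(R, C) == 1.0

# Multiplying the component values by 100 LOWERS the corner frequency.
assert corner(R * 100, C * 100) < 1.0

# To RAISE the corner to 100 rad/s, divide C (or R) by 100 instead.
assert abs(corner(R, C / 100) - 100.0) < 1e-9
```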
 
NascentOxygen said:
Yes, if you filter a waveform having components of different frequencies, it can undergo distortion, so the output wave shape may look very different from the input wave shape. Preserving the wave shape requires a filter whose phase shift increases linearly with frequency across the passband, giving an equal TIME SHIFT to the various components. Such a filter acts as a delay line.
I am still trying to get my head around this. My immediate instinct was that if two waves of differing frequencies experienced a different phase shift, then they would be out of phase with each other at the output?

Is it because the time period of, for example, a 10 Hz waveform means a phase shift of 90° gives a time delay of 25 ms, therefore if you had a second waveform at 15 Hz you would need a larger phase shift to achieve the same 25 ms delay?

Thanks
 
jendrix said:
Is it because the time period of, for example, a 10 Hz waveform means a phase shift of 90° gives a time delay of 25 ms, therefore if you had a second waveform at 15 Hz you would need a larger phase shift to achieve the same 25 ms delay?
That's the idea. If everything experiences the exact same time delay, the filter acts like a delay line, and produces no wave-shape distortion. Of course, if you build a filter where all components in the input are passed across to the output with no alteration to their relative amplitudes and all experience a common time delay, it won't be doing much of what we commonly think of as "filtering".
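The 25 ms arithmetic above checks out: for a constant delay τ the required phase grows linearly with frequency, φ = 360°·f·τ, which is exactly the linear-phase condition:

```python
# Phase shift (degrees) needed at frequency f_hz to realise a constant
# time delay tau_s: phi = 360 * f * tau.
def phase_deg(f_hz, tau_s):
    return 360.0 * f_hz * tau_s

tau = 0.025  # 25 ms, as in the example above

# 10 Hz needs 90 degrees; 15 Hz needs 135 degrees for the same delay.
assert abs(phase_deg(10.0, tau) - 90.0) < 1e-9
assert abs(phase_deg(15.0, tau) - 135.0) < 1e-9
```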
 