I don't understand what my professor told me. He said that if the Fourier series of a function f is not continuous, then it doesn't converge uniformly to f.

But the theorem I was given only states: "if a sequence of continuous functions converges uniformly to f, then f is continuous."

I'm given that f is not continuous, and that the Fourier series of f is not continuous. How does this imply that the Fourier series doesn't converge uniformly to f?

This is just an application of logic here:

A & B => C

not C => (not A) or (not B)

Knowing (not A) doesn't really give me (not B).
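As a numerical sketch of the situation being discussed (my own illustration, not from the thread), take the standard square-wave example f(x) = sign(sin x), which is not continuous. Each partial sum S_N of its Fourier series is a trigonometric polynomial, hence continuous, and the sup-norm error sup|S_N - f| stays bounded away from 0, which is exactly what "does not converge uniformly" means:

```python
import numpy as np

# Square wave f(x) = sign(sin x); on (0, pi) it equals +1, but f is not
# continuous on the whole line. Its Fourier series is
#   (4/pi) * sum over odd k of sin(k x)/k.

def partial_sum(x, N):
    """N-th partial sum of the Fourier series of the square wave."""
    s = np.zeros_like(x)
    for k in range(1, N + 1, 2):  # odd harmonics only
        s += np.sin(k * x) / k
    return 4.0 / np.pi * s

# Sample (0, pi), where f = +1, getting close to the jump at x = 0.
x = np.linspace(1e-4, np.pi - 1e-4, 20000)
f = np.ones_like(x)

# The sup-norm error does not shrink toward 0 as N grows: each S_N is
# continuous with S_N(0) = 0, so near the jump S_N is still far from f.
for N in (10, 100, 1000):
    err = np.max(np.abs(partial_sum(x, N) - f))
    print(f"N = {N:4d}   sup-norm error ~ {err:.3f}")
```

Each S_N converges to f pointwise on (0, pi), but the maximum error over the interval stays large no matter how big N gets, so the convergence is not uniform.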

**Physics Forums - The Fusion of Science and Community**


# Convergence of Fourier series


