Book that goes deep into the basics of statistics


Discussion Overview

The discussion centers around finding a book that provides a deep understanding of statistical concepts relevant to random processes, particularly in the context of Kalman filtering. Participants express a desire for rigorous treatment of topics such as autocorrelation, ergodicity, and the properties of stationary processes, rather than superficial descriptions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant seeks a book that rigorously defines and motivates concepts like autocorrelation and ergodicity, expressing dissatisfaction with existing materials that lack depth.
  • Another participant suggests engineering-oriented resources that may not be mathematically rigorous but cover applied random processes more thoroughly.
  • A participant mentions their mathematical background, including measure theory, and emphasizes the need for a deeper understanding of foundational concepts rather than just practical applications.
  • Recommendations include Grimmett and Stirzaker's book for its careful discussions and sections on ergodic theorems, and Papoulis' work for its applied focus, despite its organizational issues.
  • One participant notes that books on time series analysis, such as Box and Jenkins, could provide extensive discussion on autocorrelation, while also suggesting a look into control theory for Kalman filtering.
  • A participant clarifies that their interest lies in understanding the foundational concepts necessary for Kalman filtering rather than the filtering techniques themselves.

Areas of Agreement / Disagreement

Participants express varying preferences for the depth and mathematical rigor of the resources they seek. While some agree on the need for foundational understanding, there is no consensus on specific book recommendations or the best approach to learning these concepts.

Contextual Notes

Participants acknowledge the limitations of existing resources, including the lack of rigorous definitions and the need for motivation behind statistical concepts. There is also mention of varying levels of mathematical treatment across recommended texts.

Avatrin
Summary: Random processes, autocovariance, ergodicity, Gauss-Markov, etc.

Hi

I am someone who resolutely prefers depth over breadth, and I am currently trying to learn more about random signals and Kalman filtering. However, the books I have found so far mention and superficially describe concepts like autocorrelation and ergodicity without going particularly deep into them (what exactly are the properties of stationary processes, or of autocorrelation functions?).

I would prefer a book that is both rigorous and motivates these definitions well. The book I am currently using does not even define the difference between ensemble averaging and time averaging...
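To make the ensemble-average/time-average distinction concrete (my own sketch, not from any of the books discussed): for an ergodic wide-sense stationary process, the time average along one long realization and the ensemble average across many independent realizations estimate the same quantity. The AR(1) process below is an assumption chosen for illustration; its stationary variance is ##1/(1-a^2)##.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stationary AR(1) process: x[n] = a*x[n-1] + w[n], with w ~ N(0, 1).
a = 0.5  # stationary variance is 1 / (1 - a^2) = 4/3

def realization(n, rng):
    x = np.zeros(n)
    w = rng.standard_normal(n)
    for k in range(1, n):
        x[k] = a * x[k - 1] + w[k]
    return x

# Ensemble average of x^2 at one fixed (late) time, over many realizations.
n_time, n_ens = 2000, 500
ensemble = np.array([realization(n_time, rng)[-1] for _ in range(n_ens)])
ensemble_avg = np.mean(ensemble ** 2)

# Time average of x^2 along a single long realization.
time_avg = np.mean(realization(200_000, rng) ** 2)

# Ergodicity: both converge to the stationary variance 4/3.
print(ensemble_avg, time_avg)
```

Both numbers land near 4/3; for a nonergodic process they would not have to agree.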
 
What books are you using, and what is your math background?

For some engineering-oriented accounts, look at
https://www.physicsforums.com/threads/probability-and-random-processes-engineering-approach.341282/
They are not rigorous from a mathematician's point of view, though (no measure theory). I'm not sure the books cover Kalman filtering, but they do cover applied random processes more carefully than many engineering treatments.

Is that the kind of thing you are looking for, or are you looking for something more mathematical?
 
Currently, I am using Brown and Hwang's "Random Signals and Applied Kalman Filtering".

My background is mathematical; I have encountered measure theory in the context of integration theory, and I have taken courses in analysis and topology. I also took one course in statistics and probability theory; however, autocorrelation and the other concepts I mentioned above are new to me.

I am not looking for books to learn Kalman filtering; Brown and Hwang do a good enough job at that. What I am looking for is a book that goes deeper into the kinds of concepts I mentioned in my earlier post.

"Autocorrelation is defined as ##R_X(k, i) = E[X_k X_i]##" is not enough. I want to know why I should care about that definition. Which kinds of processes does it eliminate? Unmotivated definitions are painful, and that's why I am looking for a book to supplement Brown and Hwang.

So, I guess I am looking for something more mathematical. It doesn't have to be so general as to require measure theory, since the purpose right now is to learn Kalman filtering. However, I need to understand the foundations upon which Kalman filtering is built.
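One way to see what the autocorrelation definition "eliminates" (a sketch of my own, not from Brown and Hwang): for a wide-sense stationary process, ##R_X(k, i)## depends only on the lag ##|k - i|##, while for a nonstationary process like a random walk it depends on the absolute times. The Monte Carlo estimator below makes that visible; the AR(1) and random-walk models are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_R(make_batch, k, i, n_ens=20_000):
    """Monte Carlo estimate of R(k, i) = E[X_k X_i] over n_ens realizations."""
    X = make_batch(n_ens, max(k, i) + 1)
    return np.mean(X[:, k] * X[:, i])

def ar1_batch(m, n, a=0.5):
    # Wide-sense stationary AR(1), started from its stationary distribution.
    X = np.empty((m, n))
    X[:, 0] = rng.normal(scale=np.sqrt(1 / (1 - a ** 2)), size=m)
    W = rng.standard_normal((m, n))
    for t in range(1, n):
        X[:, t] = a * X[:, t - 1] + W[:, t]
    return X

def walk_batch(m, n):
    # Nonstationary random walk: variance grows linearly with time.
    return np.cumsum(rng.standard_normal((m, n)), axis=1)

# AR(1): R(10, 12) and R(50, 52) agree (both ~ a^2 / (1 - a^2) = 1/3),
# because only the lag of 2 matters.
r1, r2 = sample_R(ar1_batch, 10, 12), sample_R(ar1_batch, 50, 52)
# Random walk: same lag, but R grows with absolute time (~11 vs ~51).
w1, w2 = sample_R(walk_batch, 10, 12), sample_R(walk_batch, 50, 52)
print(r1, r2, w1, w2)
```

So the definition is not vacuous: requiring ##R_X## to depend only on lag rules out processes like the random walk, and that lag-only structure is exactly what stationary spectral methods rely on.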
 
One pretty good general book is by Grimmett and Stirzaker. I have the second edition
https://www.amazon.com/dp/0198536658/?tag=pfamazon01-20
but a newer third edition is also available. It is not measure-theoretic but they seem to be pretty careful with their discussions and they do have an eye towards applications. Most theorems have proof, but some useful stuff that is too advanced are carefully stated without proof. It has a nice section on Ergodic theorems, which connect time averages and ensemble averages. With your background I suspect this would be a reasonable choice.

I am an engineer and learned this material from that perspective, so I am most familiar with those references. Papoulis' "Probability, Random Variables and Stochastic Processes" was the primary book I used to learn about stochastic processes (https://www.amazon.com/dp/0070484775/?tag=pfamazon01-20). It is pretty good but not the best organized. Here is what he says about second-order properties such as covariance functions:
" For the determination of the statistical properties of a stochastic process, knowledge of the [CDF] function ##F(x_1,x_2,\ldots,x_n; t_1, t_2, \ldots, t_n)## is required for every ##x_i##, ##t_i## and ##n##. However, for many applications, only certain averages are used, in particular, the expected value of ##x(t)## and of ##x^2(t)##. These quantities can be expressed in terms of the second-order properties of ##x(t)## defined as follows..."
Papoulis is much more applied than Grimmett and Stirzaker, with less theory and more practical examples, applications, and motivation. He still has sections on ergodicity and other important theoretical aspects, though, and is careful with the way he treats the topic. He does have a section on Kalman filters, which begins with the statement "In this section we extend the preceding results to nonstationary processes with causal data and we show that the results can be simplified if the noise is white and the signal is an ARMA process." He then derives the general form, states the practical difficulties, then proves how the two stated assumptions imply additional properties that allow the estimator to be simplified.
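For a feel of what that simplified estimator looks like in the simplest case (a sketch under my own assumptions, not Papoulis' derivation): take a scalar random-walk state observed in white noise, and run the standard predict/update recursion.

```python
import numpy as np

rng = np.random.default_rng(2)

# Scalar model: state x[n] = x[n-1] + process noise (variance q),
# observation z[n] = x[n] + white measurement noise (variance r).
q, r = 0.01, 1.0
n = 500

# Simulate one trajectory and its noisy observations.
x_true = np.cumsum(np.sqrt(q) * rng.standard_normal(n))
z = x_true + np.sqrt(r) * rng.standard_normal(n)

x_hat, p = 0.0, 1.0          # initial estimate and its error variance
estimates = np.empty(n)
for k in range(n):
    # Predict: the random-walk model leaves x_hat unchanged, inflates p.
    p = p + q
    # Update: blend prediction and measurement via the Kalman gain.
    gain = p / (p + r)
    x_hat = x_hat + gain * (z[k] - x_hat)
    p = (1 - gain) * p
    estimates[k] = x_hat

# The filtered estimate tracks the state better than raw measurements.
mse_raw = np.mean((z - x_true) ** 2)
mse_kf = np.mean((estimates - x_true) ** 2)
print(mse_raw, mse_kf)
```

With these (illustrative) noise variances, the filter's steady-state error variance is roughly a tenth of the raw measurement variance, which is the whole point of the recursion.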

There are many other engineering books similar to Papoulis, some better than others. Some like the book by Stark and Woods better (https://www.amazon.com/dp/0137287917/?tag=pfamazon01-20).

I think the book by Hajek on the other page I linked in my previous post is also pretty good, again from an engineering perspective. It is basically at a level between Papoulis and Grimmett and Stirzaker (closer to the latter). The link has a legal download of an earlier version of what is now a published book (https://www.amazon.com/dp/1107100127/?tag=pfamazon01-20)

Ideally you would have a library to look at these before purchasing - not sure how practical that is right now, though...

Jason
 
If you want to see the other extreme, with a great deal of discussion of autocorrelation, look at books on time series analysis and ARIMA (Box and Jenkins would certainly have enough). If your interest is more specifically in Kalman filtering, then look into control theory.
 
Okay, thanks to both of you! I'll read through those books to see if they help.

But no, currently my interest is not specifically Kalman filtering; it's more the foundations I need to understand Kalman filtering. In other words, I want to understand the first few chapters of any book on the subject. Generally, once I understand the fundamentals of a subject as deeply as I need to, the remaining chapters are a breeze. That has always been the case for me: from statistical mechanics to abstract algebra, the first one or two chapters are generally the hardest. Once those "click", the remainder is much easier to get through.
 
