Book that goes deep into the basics of statistics

  • Thread starter Avatrin
  • #1

Main Question or Discussion Point

Summary: Random processes, autocovariance, ergodicity, Gauss-Markov, etc.

Hi

I am a person who resolutely prefers depth over breadth, and currently I am trying to learn more about random signals and Kalman filtering. However, the books I have found so far mention and superficially describe concepts like autocorrelation and ergodicity, but they do not go particularly deep in describing them (what exactly are the properties of stationary processes? Or of autocorrelation functions?).

I would prefer a book that is both rigorous and motivates these definitions well. The book I am currently using does not even define the difference between ensemble averaging and time averaging....
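To be concrete about the distinction I mean (my own rough notation, not the book's): the ensemble average fixes a time and averages over realizations, while the time average fixes one realization and averages over time,
$$\mu_X(t) = E[X(t)], \qquad \langle X \rangle_T = \frac{1}{2T}\int_{-T}^{T} X(t)\,dt,$$
and, as I understand it, ergodicity in the mean is the statement that the second converges to the first as ##T \to \infty## (for a stationary process, where ##\mu_X(t)## is constant).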
 

Answers and Replies

  • #2
jasonRF
Science Advisor
Gold Member
What books are you using, and what is your math background?

For some engineering-oriented accounts, look at
https://www.physicsforums.com/threads/probability-and-random-processes-engineering-approach.341282/
They are not rigorous from a mathematician's point of view, though (no measure theory). I'm not sure the books cover Kalman filtering, but they do cover applied random processes more carefully than many engineering treatments.

Is that the kind of thing you are looking for, or are you looking for something more mathematical?
 
  • #3
Avatrin
Currently, I am using Brown and Hwang's Random Signals and Applied Kalman Filtering.

My background is mathematical; I have encountered measure theory in the context of integration theory, and I have taken courses in analysis and topology. I have also taken one course in statistics and probability theory; however, autocorrelation and the other concepts I mentioned above are new to me.

I am not looking for books to learn Kalman filtering; Brown and Hwang do a good enough job at that. What I am looking for is a book that goes deeper into the kinds of concepts I mentioned in my earlier post.

"Autocorrelation is defined as ## R_x = E(X_k,X_i)## " is not enough. I want to know why I should care about that definition. Which kinds of processes do that eliminate? Unmotivated definitions are painful, and that's why I am looking for a book to supplement Brown and Hwang.

So, I guess I am looking for something more mathematical. It doesn't have to be so general as to require measure theory, since the purpose right now is to learn Kalman filtering. However, I need to understand the foundations upon which Kalman filtering is built.
 
  • #4
jasonRF
Science Advisor
Gold Member
One pretty good general book is Probability and Random Processes by Grimmett and Stirzaker. I have the second edition
https://www.amazon.com/dp/0198536658/?tag=pfamazon01-20&tag=pfamazon01-20
but a newer third edition is also available. It is not measure-theoretic, but they are pretty careful with their discussions and they do have an eye towards applications. Most theorems come with proofs, but some useful results that are too advanced are carefully stated without proof. It has a nice section on ergodic theorems, which connect time averages and ensemble averages. With your background I suspect this would be a reasonable choice.
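To give the flavor (this is the generic mean-square statement, not a quote from the book): for a wide-sense stationary process with mean ##\mu## and autocovariance ##c(\tau)##, the time average converges to the ensemble mean,
$$\frac{1}{T}\int_0^T X(t)\,dt \;\longrightarrow\; \mu \quad \text{in mean square as } T \to \infty,$$
precisely when ##\frac{1}{T}\int_0^T c(\tau)\,d\tau \to 0##; their ergodic-theorem section makes statements of this kind precise.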

I am an engineer, so I learned this material from that perspective and am most familiar with those references. Papoulis' Probability, Random Variables and Stochastic Processes was the primary book that I used to learn about stochastic processes (https://www.amazon.com/dp/0070484775/?tag=pfamazon01-20&tag=pfamazon01-20). It is pretty good but not the best organized. Here is what he says about second-order properties such as covariance functions:
" For the determination of the statistical properties of a stochastic process, knowledge of the [CDF] function ##F(x_1,x_2,\ldots,x_n; t_1, t_2, \ldots, t_n)## is required for every ##x_i##, ##t_i## and ##n##. However, for many applications, only certain averages are used, in particular, the expected value of ##x(t)## and of ##x^2(t)##. These quantities can be expressed in terms of the second-order properties of ##x(t)## defined as follows..."
Papoulis is much more applied than Grimmett and Stirzaker, with less theory and more practical examples, applications and motivation. He still has sections on ergodicity and other important theoretical aspects, though, and is careful with the way he treats the topic. He does have a section on Kalman filters which begins with the statement "In this section we extend the preceding results to nonstationary processes with causal data and we show that the results can be simplified if the noise is white and the signal is an ARMA process." He then derives the general form, states the practical difficulties, then proves how the two stated assumptions imply some additional properties that allow the estimator to be simplified.
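For orientation (this is the standard discrete-time recursion in generic notation, not Papoulis' own), for the model ##x_k = F_k x_{k-1} + w_k##, ##z_k = H_k x_k + v_k## with white noises of covariance ##Q_k## and ##R_k##, the estimator takes the familiar predict/update form
$$\hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1}, \qquad P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k,$$
$$K_k = P_{k|k-1} H_k^T \left( H_k P_{k|k-1} H_k^T + R_k \right)^{-1},$$
$$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \left( z_k - H_k \hat{x}_{k|k-1} \right), \qquad P_{k|k} = (I - K_k H_k) P_{k|k-1},$$
and the whiteness and ARMA assumptions he mentions are the sort of thing that makes a finite-dimensional recursion like this possible.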

There are many other engineering books similar to Papoulis, some better than others. Some like the book by Stark and Woods better (https://www.amazon.com/dp/0137287917/?tag=pfamazon01-20&tag=pfamazon01-20).

I think the book by Hajek on the other page I linked in my previous post is also pretty good, again from an engineering perspective. It is basically at a level between Papoulis and Grimmett and Stirzaker (closer to the latter). The link has a legal download of an earlier version of what is now a published book (https://www.amazon.com/dp/1107100127/?tag=pfamazon01-20&tag=pfamazon01-20)

Ideally you would have a library to look at these before purchasing - not sure how practical that is right now, though...

Jason
 
  • #5
FactChecker
Science Advisor
Gold Member
If you want to see the other extreme of a great deal of discussion of autocorrelation, then you should look at books on time series analysis and ARIMA (Box and Jenkins would certainly have enough). If your interest is more specifically in Kalman filtering, then look into control theory.
 
  • #6
Avatrin
Okay, thanks to both of you! I'll read through those books to see if they help.

But no, currently my interest is not specifically Kalman filtering; it's more the foundations I need in order to understand Kalman filtering. In other words, I want to understand the first few chapters of any book on the subject. Generally, for any subject, once I understand the fundamentals as deeply as I need to, the remaining chapters are a breeze. That has always been the case for me; from statistical mechanics to abstract algebra, the first one or two chapters are generally the hardest, and once those "click", the remainder is much easier to get through.
 
