Ok, this is definitely a terminology issue. Sorry if I made anything unclear. To avoid further confusion about terminology, I'll stick to empirical examples.
The expectation value of an operator is the average value you would measure. For example, the expectation value of the angular momentum operator in the ground state of the hydrogen atom (##\langle L_{z} \rangle## for the state ##|\psi_{100}\rangle##) is just the angular momentum you would measure on average if you had a large sample of (non-interacting) ground state hydrogen atoms. To clarify: if you picked atoms one by one out of this sample and measured the angular momentum of each, the average of your data would approach ##\langle L_{z} \rangle##, aka the expectation value of the angular momentum operator in the ground state.
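To make the "average over repeated measurements" picture concrete, here's a small numerical sketch (my own toy example, not anything from a textbook): measuring ##S_{z}## on a made-up spin-1/2 superposition, where the sample mean of many simulated measurements converges to the expectation value.

```python
import random

# Toy observable: measuring S_z (in units of hbar) on a spin-1/2 state
# a|up> + b|down> gives +1/2 with probability |a|^2 and -1/2 with
# probability |b|^2. The amplitude below is chosen arbitrarily.
p_up = 0.8                                     # |a|^2, made up for illustration
exact = 0.5 * p_up + (-0.5) * (1 - p_up)       # <S_z> = sum(outcome * probability) = 0.3

random.seed(0)
n = 200_000
# Simulate picking n systems out of the ensemble and measuring each once
measurements = [0.5 if random.random() < p_up else -0.5 for _ in range(n)]
sample_mean = sum(measurements) / n

print(exact)        # 0.3
print(sample_mean)  # close to 0.3 for large n
```

The point is just that the expectation value is a property of the ensemble: no single measurement ever returns 0.3, but the running average of many measurements does.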
My intention was to compare the version that Griffiths uses for the standard deviation of an observable to the standard definition of the standard deviation for a continuous variable, which you mentioned you understand. If this is different from what you're used to, the definitions I've used can be found and are motivated in Sections 5 and 6 of Chapter 15 of the 3rd Edition of Mary Boas's book Mathematical Methods in the Physical Sciences.
To paraphrase briefly, the standard deviation is the average "spread" of a function, in the sense of a least-squares fit. Notice that ##Q - \langle q \rangle## is the distance of the function Q from its average (aka expectation value), ##\langle q \rangle##. At each point in space (i.e. each value of x), the distance ##Q - \langle q \rangle## can vary. The quantity ##(Q - \langle q \rangle)^{2}##, the square of that distance (as a function of x), is a good measure of how far the function Q has strayed from its average at the point x. To extend this from a single point x to the whole distribution, we take the average of ##(Q - \langle q \rangle)^{2}##, which is called the variance: ##\mathrm{Var}_{Q} = \int (Q - \langle q \rangle)^{2}\, \rho(x)\, dx##. In words, the variance is how much the function Q differs from its average value ##\langle q \rangle##, squared, on average. The standard deviation is defined as the square root of the variance.
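If it helps, here is that definition evaluated numerically for a familiar continuous distribution: the radial probability density of the hydrogen ground state, ##\rho(r) = 4r^{2}e^{-2r}## (with r in units of the Bohr radius). This is a standard density; the grid choices in the sketch below are my own arbitrary assumptions. For this ##\rho##, the exact answers are ##\langle r \rangle = 3/2## and ##\sigma = \sqrt{3}/2 \approx 0.866##, and a simple Riemann sum reproduces them:

```python
import math

# Radial probability density of the hydrogen ground state,
# rho(r) = 4 r^2 exp(-2r), with r in units of the Bohr radius.
dr = 0.001
rs = [i * dr for i in range(1, 20_000)]            # r from ~0 to 20 Bohr radii
rho = [4 * r**2 * math.exp(-2 * r) for r in rs]

norm = sum(p * dr for p in rho)                    # should be ~1 (normalized)
mean = sum(r * p * dr for r, p in zip(rs, rho))    # <r>, exactly 3/2
# Variance: the average of (r - <r>)^2 weighted by rho(r)
var = sum((r - mean) ** 2 * p * dr for r, p in zip(rs, rho))
sigma = math.sqrt(var)                             # standard deviation, sqrt(3)/2

print(mean, sigma)  # approximately 1.5 and 0.866
```

So ##\sigma## here measures how much an individual measurement of r is expected to scatter around ##\langle r \rangle##, which is exactly the role it plays in Griffiths's version for an observable.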
Hope this helps!