The higher the magnitude of the divergence, the more strongly the arrows diverge...
Let's say those arrows are created by the following vector function (since I don't know what function you actually used, and you'll have to forgive my lack of TeX!):
f = (ax, ay)
The divergence of f is
div f = ∂(ax)/∂x + ∂(ay)/∂y = a + a = 2a
A divergence of magnitude 1 would mean that a = 1/2, so the arrows are fairly short at any given point. A divergence of magnitude 10 means that a = 5, so the arrows grow in length much faster as you move outward.
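If you want to double-check that algebra, here's a tiny SymPy sketch (assuming the same made-up field f = (ax, ay)):

    # Symbolic check of the divergence computed above
    import sympy as sp

    x, y, a = sp.symbols('x y a')
    f = sp.Matrix([a * x, a * y])        # the assumed field f = (ax, ay)

    div_f = sp.diff(f[0], x) + sp.diff(f[1], y)
    print(div_f)                         # prints 2*a, i.e. div f = 2a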
This makes sense with the graphical interpretation: if the divergence is 0, then there is no source, and for this particular function that forces a = 0, so the field is 0 everywhere. (There are many examples of non-zero functions with zero divergence, like the velocity field of an incompressible fluid in a tank holding a constant volume of fluid, or any magnetic field, whose divergence is always zero.)
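For a concrete zero-divergence example (my choice, not something from your picture), the purely rotational field g = (-y, x) is non-zero almost everywhere but has no sources or sinks:

    # Divergence of a rotational field g = (-y, x)
    import sympy as sp

    x, y = sp.symbols('x y')
    g = sp.Matrix([-y, x])               # circulating flow, like fluid swirling in place

    div_g = sp.diff(g[0], x) + sp.diff(g[1], y)
    print(div_g)                         # prints 0, even though g itself is not zero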
If you took a tiny box around a point and measured the net flux out of the box, dividing by the box's volume and shrinking the box down to the point would give you the divergence there. This is actually the definition of divergence: http://en.wikipedia.org/wiki/Divergence#Definition_of_divergence
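Here is a rough numerical version of that box idea in Python (the field, the sample point, and the box size are just my assumptions, chosen to match the a = 5 case above):

    # Estimate the divergence at a point from the net outward flux through a tiny square
    import numpy as np

    a = 5.0                                   # same constant as above, so we expect div f = 2a = 10
    def f(x, y):
        return np.array([a * x, a * y])       # the assumed field f = (ax, ay)

    def divergence_estimate(x0, y0, h=1e-5):
        side = 2 * h                          # side length of the square around (x0, y0)
        # outward flux = outward normal component of f times side length, summed over four sides
        flux = side * (  f(x0 + h, y0)[0] - f(x0 - h, y0)[0]    # right side minus left side
                       + f(x0, y0 + h)[1] - f(x0, y0 - h)[1])   # top side minus bottom side
        return flux / side**2                 # flux per unit area -> divergence

    print(divergence_estimate(1.0, 2.0))      # prints roughly 10.0, i.e. 2a

The flux-per-area estimate converges to the true divergence as the box shrinks, which is exactly what the definition on that Wikipedia page says (stated there with volume instead of area, since it's written in 3D).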
To summarize and get back to your question, a higher divergence in that picture means that the arrows would have a higher magnitude.