Well, let's estimate it.
The cosmic-ray number density is believed to be roughly one particle per 1000 cubic meters, almost all protons (I read this here:
http://hyperphysics.phy-astr.gsu.edu/hbase/Astro/cosmic.html).
So the charge density is about 10^-3 elementary charges per cubic meter.
The Laniakea supercluster that we live in is roughly 160 Mpc across, which is about 5x10^24 m; call it 10^24 m for this order-of-magnitude estimate.
Its volume is therefore about 10^72 m^3.
Multiplying this by the charge density, we get a total charge of
Q = 10^69 in units of the elementary charge e.
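Just to keep myself honest, here is the arithmetic so far as a quick sketch (same inputs as above: density 10^-3 protons per m^3, diameter rounded to 10^24 m):

```python
import math

n_cr = 1e-3        # cosmic-ray number density, protons per m^3
diameter = 1e24    # Laniakea diameter in m (160 Mpc, rounded down)

# Treat the supercluster as a sphere of that diameter.
volume = (math.pi / 6) * diameter**3   # ~ 10^72 m^3
q_total = n_cr * volume                # total charge, in units of e

print(f"volume ~ 10^{math.log10(volume):.0f} m^3")
print(f"Q ~ 10^{math.log10(q_total):.0f} e")
```

which indeed lands on 10^72 m^3 and Q ~ 10^69 e.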
Now, the cluster contains around 10^5 galaxies,
so that works out to a charge of about 10^64 e per galaxy.
Let's assume each galaxy hosts a billion-solar-mass black hole at its core, and that these holes
ultimately gave rise to the cosmic rays. The compensating negative charges should then accumulate on the holes.
Is that charge enough to bring the holes to extremality?
For a Reissner-Nordstrom black hole, extremality means Q = M in Planck units, so we need the black hole mass in Planck units.
The Planck mass is 2x10^-5 g.
The solar mass is 2x10^33 g or 10^38 in Planck units.
A billion solar mass black hole therefore has mass 10^47 in Planck units.
Now 10^64 is definitely larger than 10^47.
In fact, the cosmic-ray charge per galaxy comes out seventeen orders of magnitude larger
than the maximal charge theoretically possible for a billion-solar-mass black hole.
This is a lot more than a loose electron here and there.
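The whole comparison can be sketched the same way (extremality is Q = M in Planck units; I'm treating one elementary charge as roughly one Planck charge, which is only off by a factor sqrt(alpha) ~ 0.1 and doesn't matter at this level):

```python
import math

q_per_galaxy = 1e64   # cosmic-ray charge per galaxy, in units of e (from above)
m_planck_g = 2e-5     # Planck mass in grams
m_sun_g = 2e33        # solar mass in grams

# A billion-solar-mass black hole, expressed in Planck masses;
# for an extremal hole this is also its maximal charge in Planck units.
m_bh_planck = 1e9 * m_sun_g / m_planck_g
excess = q_per_galaxy / m_bh_planck

print(f"M_BH ~ 10^{math.log10(m_bh_planck):.0f} Planck masses")
print(f"charge excess ~ 10^{math.log10(excess):.0f}")
```

which reproduces the 10^47 and the seventeen orders of magnitude.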