Recently I became rather bored and decided to play around with random numbers in a spreadsheet. I made 4 columns of 53 rows, with each cell limited to the range 1 to 300. I then added a 5th column in which each row held the sum of the four numbers in that row. I also imposed the constraint that the four column totals had to be close to one another (within +/- 5). I then computed the correlation of each column with the 5th column (the one containing the row sums).

I repeated this on a 2nd spreadsheet, and again on a 3rd, except that in the 3rd sheet I copied one of the columns from a previous sheet and pasted it over the original. So, for example, after building spreadsheet 3 I would go to spreadsheet 2, copy column B, and paste it over column B of spreadsheet 3. Next, I required the random numbers to occur in a sequence: they had to come 5 at a time, and had to be entered into the copied column (column B in this case).

What I noticed when I graphed all the correlations against each other is that they showed a trend: instead of the 1-in-(30 choose 5) odds that pure chance would produce, I was able to significantly limit the number of combinations that fit the pattern. Another intriguing idea would be to use past lottery numbers as the inputs and extrapolate a much smaller range of possible numbers to come up.

My main question is: is there any known relation that describes what is happening here? If I was a bit too vague, I can clarify with a concrete example.
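To make the setup concrete, here is a minimal Python sketch of the basic experiment (random columns, a row-sum column, and each column's correlation with the sums). The +/- 5 column-total constraint is omitted for simplicity, and the function names are my own:

```python
import random

ROWS, COLS, LO, HI = 53, 4, 1, 300

def pearson(x, y):
    # Plain Pearson correlation coefficient of two equal-length lists.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def sheet_correlations(seed):
    # One "spreadsheet": 4 columns of 53 random integers in [1, 300].
    rng = random.Random(seed)
    cols = [[rng.randint(LO, HI) for _ in range(ROWS)] for _ in range(COLS)]
    # 5th column: the sum of the four numbers in each row.
    row_sums = [sum(vals) for vals in zip(*cols)]
    # Correlation of each original column with the row-sum column.
    return [pearson(col, row_sums) for col in cols]

for s in range(3):  # three independent "spreadsheets"
    print(sheet_correlations(s))
```

For four independent columns, each correlation with the row-sum column tends to hover around 1/sqrt(4) = 0.5, since each column contributes roughly a quarter of the sum's variance; any systematic departure from that is worth looking at.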