Hi, uperkurk,
Division, as you used it in \frac{\pi}{\pi}, is an operation between two numbers, but \lbrace 3,4,5,6,7...\rbrace is a set. If you want to "divide two sets", you would first need to define what you mean by that.
Not a big crime, actually, since in analysis courses the real numbers are defined as sequences of fractions, for example \lbrace \frac 3 1, \frac {31}{10}, \frac {314}{100}, \frac {3141}{1000}, \frac {31415}{10000}, \frac {314159}{100000}, ... \rbrace, which may converge to a "hole" where no actual fraction sits (even if some fractions come very close, none lands on the exact spot); operations are then defined on these sequences. But your example sequence \lbrace 3,4,5,6,7... \rbrace does not get closer to anything: name any number you like, a million, a quadrillion, and the sequence will eventually surpass it. It is unbounded.
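To make "operations are defined on these sequences" a little more concrete, here is a rough sketch, skipping the technical conditions a real analysis course would add: if x = \lbrace x_1, x_2, x_3, ... \rbrace and y = \lbrace y_1, y_2, y_3, ... \rbrace are two such sequences, one defines division term by term,

\frac x y := \lbrace \frac{x_1}{y_1}, \frac{x_2}{y_2}, \frac{x_3}{y_3}, ... \rbrace \quad \text{(as long as no } y_k \text{ is } 0\text{)},

so dividing the \pi-sequence above by itself gives \lbrace 1, 1, 1, ... \rbrace, which converges to 1, exactly what \frac{\pi}{\pi} = 1 should give.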
Perhaps what you had in mind is this: if \frac 3 3 = 1, and \frac 4 4 = 1, and \frac 5 5 = 1, ... what happens as you keep going? The best you can say is that

\lim_{n \to \infty} \frac n n = 1

that is, the fraction \frac n n tends to 1 as n grows arbitrarily large (not surprisingly, since it was 1 all along). But even that depends on how the numerator and denominator grow: for example, the fractions \frac 6 3, \frac 8 4, \frac {10} 5, ..., that is, \frac {2n} n, tend to a different value (2) as n grows large.
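If you want something to toy with, here are a couple of fractions of my own (not from your question) worked out the same way; the pattern is always to simplify first and then ask where the simplified expression goes:

\frac {2n} n = 2 for every n, so \lim_{n \to \infty} \frac {2n} n = 2

\frac {n^2} n = n, which grows without bound, so there is no limit at all

\frac n {n^2} = \frac 1 n, which gets as small as you like, so \lim_{n \to \infty} \frac n {n^2} = 0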
You will gradually meet these issues as/if you approach college. Hope this helps with some ideas to toy with in the meantime.