Hey, I am not ignoring the fact that these procedures produce primes.
Your view that they are large and that the process is easy is opinion, nothing more. If some of these methods are capable of finding all the primes up to some "large prime" so easily, why do they have to include tests of primality? An algorithm either produces primes or it produces possible primes, and if it can't exclusively produce the complete list of primes without a primality test as part of the computation, it isn't that effective; otherwise the largest known prime wouldn't have taken 50 days to compute, it would have just been generated easily.
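To make the "possible primes" point concrete, here is a classic example of my own choosing (nobody in this thread brought it up): Euler's polynomial n^2 + n + 41 looks like a prime generator, but it only hands you candidates, and a test is still what settles each one.

```python
def euler_poly(n):
    """Euler's classic 'prime producing' polynomial: n*n + n + 41."""
    return n * n + n + 41

def smallest_factor(n):
    """Trial division; returns the smallest divisor > 1 (n itself if n is prime)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

# the formula alone only yields *possible* primes; a test is what settles it
print(all(smallest_factor(euler_poly(n)) == euler_poly(n) for n in range(40)))  # True: prime for n = 0..39
print(euler_poly(40), "=", smallest_factor(euler_poly(40)), "*", 41)            # 1681 = 41 * 41
```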
I'm not sure how you can deem these algorithmic solutions that test for primality "easy". I am quite sure they are easy to understand, but easy to compute? Hardly.
I am interested in the process; it's the part that makes the procedure "easy" or "hard".
How can you claim the result is something the process is not? If the procedure is tedious, time consuming, and requires massive computing power, how can the result be "rather easy"? Primality testing is pretty refined, computing-wise, but it still takes years to test 10-million-digit numbers. They aren't just generated; they are produced and then tested by these algorithmic solutions. The test isn't really a novel part of the algorithm, since almost all of the primality tests in use are the standard ones; Lucas set them out in basic form over a hundred years ago.
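For anyone following along, this is the sort of standard test I mean. A minimal sketch of the Lucas-Lehmer test for Mersenne numbers, written by me purely for illustration; the real searches run the same recurrence, just on exponents whose results have millions of digits.

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test: for an odd prime exponent p, 2**p - 1 is prime
    exactly when s_(p-2) == 0 (mod 2**p - 1), with s_0 = 4 and s_(k+1) = s_k**2 - 2."""
    if p == 2:
        return True               # 2**2 - 1 = 3 is prime; the recurrence needs p > 2
    m = (1 << p) - 1              # the Mersenne number 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m       # each step squares a number roughly p bits long
    return s == 0

# 2**127 - 1 is the prime Lucas himself verified in 1876
print(lucas_lehmer(127))  # True
print(lucas_lehmer(11))   # False: 2**11 - 1 = 2047 = 23 * 89
```

The point stands either way: the conclusion "prime" only comes out of running the test, not out of writing the number down.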
"Large" is such a relative term; there will always be primes much larger than the ones we are currently testing, so to say that these numbers are large assumes some initial frame. For me, that frame is the point where it becomes quicker to look the number up in a table than to test it. If the number has been tested once, it's done. How do previous tests, or any previously known piece of information such as a table of known primes, make these primality tests faster? As far as I know, the largest known prime doesn't help in finding the next prime after it, so the calculations always become increasingly tedious and time consuming. And thus, they cannot be easy when applied to the thing they actually need to do.
As long as the algorithm relies on a primality test or produces only already-known primes, it's useless as far as technology or finding the next prime number is concerned.
I do, in fact, see a problem with finding lots of 10-million-digit primes: it's going to take forever just to test them. Producing them is easy; producing exactly the primes, all of them and nothing else, is not. Producing all of them without testing all of them is impossible. To me, that means the opposite of everything you just said.
*EDIT* Let me add that I think the current methods for solving primality problems are quite elegant and efficient; my point is that there are always larger primes to be plugged and chugged, which makes the calculations necessarily difficult and tedious. I am a fan of the current methods. I never really mentioned any specific problem with the algorithms, aside from the fact that they offer no advantage over the known basic forms of primes plus a primality test. Producing a number and testing a number through brute force are not the same thing, in my mind. Producing a number and knowing it's prime without testing it was my idea of a "prime producing algorithm": one that produces all the primes as it goes up, so that no prime is left out. Obviously there is the simple algorithm that just tests each number as you go up by 1 (something like the sketch below); it's not fast or easy. None of the mentioned algorithms are doing anything but eliminating numbers for being composite and then doing the exact same thing as testing every single number, so they are not fast or easy either. Just my opinion. Please, watch your tone. Is there an algorithm that doesn't include testing for primality and gives a prime number of increasing size for each increased input?
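Here is the "go up by 1 and test" approach I'm talking about, as a minimal sketch of my own; plain trial division stands in for whatever primality test you prefer, which only makes the cost more obvious for big numbers.

```python
def is_prime(n):
    """Trial division: check divisors up to sqrt(n). Fine for small n,
    hopeless for numbers with millions of digits."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def nth_prime(k):
    """Walk the integers one by one, testing each; the k-th success is returned.
    Every prime is produced and none is skipped, but every candidate gets tested."""
    n, count = 1, 0
    while count < k:
        n += 1
        if is_prime(n):
            count += 1
    return n

print([nth_prime(k) for k in range(1, 11)])  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

It does give a prime of increasing size for each increased input, but only because it tests everything along the way; that cost is exactly what I'm calling "not easy".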
I don't claim to be all knowing, or even to have an extensive knowledge of the subject. I do claim to know that definitions of "large" are relative. Even determining the primality of a Mersenne number, which as you said is one of the more easily determined forms, takes days and days when you look at unknown primes. None of these algorithms is faster than a table lookup until you get so large that neither a table lookup nor an algorithm is all that quick, and even then a table is much, much faster. It cannot be said that these algorithmic methods produce large primes quickly; let's use Mersenne primes as an example. It took two years with thousands of computers testing to find one, and that was in 1998. Computers are a lot faster today, but so is the number of digits in the numbers we are testing. So it can still be said that the same methods that took tens of thousands of computers years of testing to find one prime are in use today. Another simple fact: you can't generate a Mersenne prime of any given size, say with an exponent higher than the highest known prime, without first testing its exponent. This alone should stick out as a problem (see the sketch below). Without a table of known primes, or without first searching all these large numbers for primes, the next larger ones cannot even be generated. How that is fast and easy is beyond me. *EDIT*
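To spell out that last point, here is a minimal sketch of my own (the function names are mine, purely illustrative): before a Mersenne candidate 2^p - 1 even exists, the exponent p has to pass a primality test, and the candidate itself still needs a full Lucas-Lehmer run afterwards.

```python
def is_prime(n):
    """Trial division, used here only on the *exponent*, which is tiny compared to 2**p - 1."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def next_mersenne_candidates(start, how_many):
    """Return the next few Mersenne candidates 2**p - 1 with exponent p > start.
    The primality test on p happens before a candidate even exists, because a
    composite exponent can never work: 2**a - 1 divides 2**(a*b) - 1."""
    found = []
    p = start
    while len(found) < how_many:
        p += 1
        if is_prime(p):                       # step 1: test the exponent
            found.append((p, (1 << p) - 1))   # step 2: only now is there a candidate
    return found                              # step 3 (not shown): Lucas-Lehmer test each one

for p, m in next_mersenne_candidates(89, 3):
    print(p, m)   # exponents 97, 101, 103; none of these three candidates is actually prime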