# Predict Pi by neural net?

Hi all,

Would it be possible to predict Pi with a neural net? So basically, when you input 1 into the neural net, you train it to output 3. Next, input 2 and train it to output 1, and so on. After training the net so that it predicts the first 1,000 positions correctly, would it then predict the 1,001st, 1,002nd, etc. positions correctly?

If so, Pi is not random. Otherwise, Pi is random.

Would this be possible?

## Answers and Replies

Borek
Mentor
No.

Why not?

> If so, Pi is not random. Otherwise, Pi is random.

Pi is not random.

You don't understand what a neural net does. Neural nets are complex models with many parameters that can approximate arbitrary mathematical functions. You need to train them on large amounts of data to tune those parameters and fit the model to your training data.
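To illustrate what "fitting a model to training data" looks like in practice, here is a toy sketch of my own (not anyone's actual proposal): a one-hidden-layer network trained by plain gradient descent to approximate sin(x) on its training range. It interpolates the data it saw; it says nothing about values outside that range.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: a smooth function sampled on a fixed interval
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Y = np.sin(X)

# One hidden layer of 16 tanh units, randomly initialized
H = 16
W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1)); b2 = np.zeros(1)

lr = 0.1
for step in range(5000):
    # Forward pass
    A = np.tanh(X @ W1 + b1)
    P = A @ W2 + b2
    err = P - Y
    loss = (err ** 2).mean()
    # Backward pass (gradients of mean squared error)
    dP = 2 * err / len(X)
    dW2 = A.T @ dP; db2 = dP.sum(0)
    dA = dP @ W2.T
    dZ = dA * (1 - A ** 2)          # tanh' = 1 - tanh^2
    dW1 = X.T @ dZ; db1 = dZ.sum(0)
    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(loss)  # mean squared error on the training interval
```

The point is that all the "knowledge" in the trained parameters comes from the 200 training points; nothing about the function beyond the sampled interval is learned.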

There are simple series expansions which can already calculate Pi accurately.
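For instance, Machin's formula, pi/4 = 4 arctan(1/5) − arctan(1/239), reaches roughly double precision after only about ten terms of each arctan series. A quick Python sketch (function names are my own):

```python
import math

def arctan_inv(x, terms):
    # Taylor series: arctan(1/x) = sum_{k>=0} (-1)^k / ((2k+1) * x^(2k+1))
    return sum((-1) ** k / ((2 * k + 1) * x ** (2 * k + 1)) for k in range(terms))

def machin_pi(terms=10):
    # Machin's formula: pi/4 = 4*arctan(1/5) - arctan(1/239)
    return 4 * (4 * arctan_inv(5, terms) - arctan_inv(239, terms))

print(machin_pi(10))  # agrees with math.pi to about 14 digits
```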

Hmm okay, so it probably would work but it would only prove what we already know.

> Hmm okay, so it probably would work

I'm not sure how you got that out of the replies so far. It would not work.

Okay thanks for the explanation.

> Hmm okay, so it probably would work but it would only prove what we already know.
No, it wouldn't. Neural networks detect patterns, and the digits of pi have no known pattern to detect.

Hepth
Gold Member
I disagree with everyone saying it wouldn't work outright. I think it might be possible, given a large enough network that has some recursion.

Basically, with a large enough number of nodes, a NN can approximate any continuous function. The Bailey–Borwein–Plouffe formula is a summation that converges rapidly to pi. It can also be used to give you the Nth hexadecimal digit of pi. So if your network can accurately represent the formula, it should be able to give you the Nth hexadecimal digit of pi. Bellard's formula does the same for binary digits.

Now, ASSUMING there exists a formula similar to BBP that will give you the Nth DECIMAL digit (is there one?), and the neural network can simulate the other two accurately, I see no reason why it could not do the same for decimal digits.

I suspect you couldn't just use some basic feed-forward NN, but would need something much deeper that, at minimum, loops backward.

This is an interesting project. If you want to work on it, it could make for an interesting paper.

EDIT: I should add that I'm currently using neural networks in my research.

Okay, thanks for the positive feedback; that's what this world needs more of.

At the moment, however, I have neither the interest nor the opportunity to work on this. I've put the idea out in the open, so I guess it's open source now.

Hepth
Gold Member
No problem, it's actually quite an interesting problem and perhaps I'll look into it further, though I need to finish up three other projects first.