Entropy of the distribution as a function of time

SUMMARY

The discussion centers on calculating the entropy of a distribution as a function of time in Python. The user shared Monte Carlo simulation code that estimates the volume of an N-dimensional unit sphere, but the implementation has no clear connection to entropy yet. Key issues raised include the subexpression (1 - (-1)) in the integral scaling, which always evaluates to 2, and the precedence of the exponentiation operator, which may not match the intended computation. The conversation highlights the need for clarity when translating mathematical expressions into code.

PREREQUISITES
  • Understanding of entropy in statistical mechanics
  • Familiarity with Python programming
  • Knowledge of Monte Carlo simulation techniques
  • Basic concepts of N-dimensional geometry
NEXT STEPS
  • Research "Python Monte Carlo methods for entropy calculation"
  • Learn about "N-dimensional integrals in Python"
  • Explore "Matplotlib for visualizing mathematical functions"
  • Study "Entropy in information theory and its applications"
USEFUL FOR

Data scientists, programmers working on statistical simulations, and anyone interested in understanding entropy calculations in programming contexts.

alex steve
I am having an issue with finding the entropy in my program. I was asked to find the entropy of the distribution as a function of time, but I do not know where to start with entropy.

I understand entropy conceptually, but putting it into my program is where I am stuck.

Here is my code:

Python:
# -*- coding: utf-8 -*-
"""
Created on Thu Nov 12 11:15:44 2015
"""
import matplotlib.pyplot as plt

import random
def Function(D):    #D = dimensions
    sumOfSquare = 0.0
    for i in range(0, len(D)):
        sumOfSquare += D[i]**2
    if sumOfSquare <=1:
        return 1
    else:
        return 0
       
def MonteCarlo(f_n,dim):
    intervalsForSphere = 1000000
    integral = 0.0
    for i in range(0, intervalsForSphere):
        for j in range(0,len(dim)):
            dim[j] = random.random()
        integral += f_n(dim)
    integral = (1-(-1))**len(dim)/intervalsForSphere * integral
    return integral
   
print("10 dimensional unit circle ")
Ten_Dim= list(range(1,10+1))
ten_D_circle = MonteCarlo(Function,Ten_Dim)
print("area:",ten_D_circle)

AreaofCircle = []
x = []

for i in range(1,13):
    D = list(range(1,i+1))
    AreaofCircle.append(MonteCarlo(Function, D))
    x.append(i)

plt.plot(x,AreaofCircle)
plt.xlim([0,13])
plt.xlabel("Dimensions")
plt.ylabel("area")
plt.title("Area of N-dimensional Unit Circle")
plt.show()
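On the entropy question itself: one common approach is to histogram the distribution's samples at each time step and apply the Shannon formula. A minimal sketch follows; the function name, the bin count, and the spreading-Gaussian example standing in for "the distribution at time t" are illustrative assumptions, not part of the original assignment:

```python
import math
import random

def shannon_entropy(samples, bins=20):
    """Estimate the differential entropy (in nats) of 1-D samples via a histogram."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0   # guard against all-equal samples
    counts = [0] * bins
    for s in samples:
        idx = min(int((s - lo) / width), bins - 1)
        counts[idx] += 1
    n = len(samples)
    # Discrete entropy of the bin probabilities plus log(bin width)
    # approximates the differential entropy of the distribution.
    return -sum(c / n * math.log(c / n) for c in counts if c) + math.log(width)

# Example: a Gaussian that spreads over "time" t has increasing entropy.
random.seed(0)
for t in (0.1, 1.0, 10.0):
    samples = [random.gauss(0.0, t) for _ in range(10000)]
    print(t, shannon_entropy(samples))
```

Whatever produces the distribution in the actual assignment, the pattern is the same: at each time step, collect the samples, estimate the probabilities, and evaluate -sum(p * log(p)).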
 
alex steve said:
integral = (1-(-1))**len(dim)/intervalsForSphere * integral

Not really about entropy, but are you certain about this expression? The part between the parentheses always evaluates to 2. Shouldn't it be (1 - (-1)**len(dim)…) or something similar? I'll look further into the code and try to help, but the quoted line above made me scratch my head.
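For comparison, a common way to write this estimator samples each coordinate uniformly in [-1, 1], so that the (b - a)**d = 2**d scaling factor visibly matches the sampling domain. A minimal sketch, with an illustrative function name:

```python
import random

def ball_volume_mc(dim, n=100000):
    """Monte Carlo estimate of the volume of the d-dimensional unit ball.

    Sample points uniformly in the cube [-1, 1]**d; the fraction of hits
    times the cube volume 2**d estimates the ball volume.
    """
    hits = 0
    for _ in range(n):
        point = [random.uniform(-1.0, 1.0) for _ in range(dim)]
        if sum(x * x for x in point) <= 1.0:
            hits += 1
    return (2.0 ** dim) * hits / n

random.seed(1)
print(ball_volume_mc(2))  # should be close to pi
```

Written this way, the 2**d factor is no longer a head-scratcher: it is simply the volume of the sampling cube.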
 
In addition to what DevacDave said about (1 - (-1)) always evaluating to 2, the expression after '**' probably isn't what you want.
Python:
integral = (1-(-1))**len(dim)/intervalsForSphere * integral

The ** operator has higher precedence than the other arithmetic operators, so the expression on the right above raises 2 to the power len(dim), then divides that result by intervalsForSphere, and finally multiplies by integral.
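The precedence point can be checked directly in the interpreter:

```python
# ** binds tighter than / and *, so the original line parses as:
#   ((1 - (-1)) ** len(dim)) / intervalsForSphere * integral
dim_len = 3
assert (1 - (-1)) == 2               # always 2, whatever dim_len is
assert (1 - (-1)) ** dim_len == 8    # i.e. 2 ** 3
assert 2 ** dim_len / 4 * 5 == 10.0  # (2**3)/4*5, not 2**(3/4*5)
```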
 
