Statistical Entropy


Statistical entropy is the application of probability theory to the concept of entropy, showing that entropy is a measure of the amount of disorder in a system. It is based on the probability of molecular positions.

The number of possible ways for a given condition to occur, called the number of equivalent microstates, is denoted W.

Entropy is denoted as S.

k is the Boltzmann constant = 1.38 × 10⁻²³ J K⁻¹

S = k ln W

This shows that entropy is a measure of the disorder of a system.
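As a minimal sketch (a hypothetical Python example, not part of the original page), the relation S = k ln W can be evaluated directly to see how entropy grows, slowly and logarithmically, as the number of equivalent microstates W increases. A single microstate (W = 1) gives zero entropy, corresponding to perfect order.

```python
import math

# Boltzmann constant in joules per kelvin
k = 1.38e-23  # J K^-1

def entropy(W: int) -> float:
    """Statistical entropy S = k ln W for W equivalent microstates."""
    return k * math.log(W)

# Entropy rises logarithmically as the number of microstates grows.
for W in (1, 10, 10**6, 10**23):
    print(f"W = {W:>30,}  ->  S = {entropy(W):.3e} J/K")
```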


 

This program illustrates the basic principle of statistical entropy. It provides a detailed illustration of the split-bottle experiment.
 

Click on the image below to run the program.

Click on the blue valve to open it.

Then see how applying energy to the system affects its entropy and disorder.
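As a hedged illustration of the split-bottle idea (a Python sketch under stated assumptions, not the applet itself): once the valve is open, each molecule can occupy either half of the bottle, so the number of accessible microstates is multiplied by 2 per molecule. Opening the valve therefore multiplies W by 2^N and raises the entropy by N·k·ln 2.

```python
import math

k = 1.38e-23  # Boltzmann constant, J K^-1

def entropy_increase_on_opening(n_molecules: float) -> float:
    """Entropy change when the valve opens and each of the n molecules
    gains access to twice the volume: W_after / W_before = 2**n,
    so delta_S = k * ln(2**n) = n * k * ln(2)."""
    return n_molecules * k * math.log(2)

# A mole-sized sample gives a macroscopic entropy increase (about 5.76 J/K).
N_A = 6.022e23  # Avogadro's number, molecules per mole
print(f"delta S for 1 mole: {entropy_increase_on_opening(N_A):.2f} J/K")
```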

 
