# Is there a Lagrangian formulation of statistical mechanics?

+ 10 like - 0 dislike

In statistical mechanics, we usually think in terms of the Hamiltonian formalism. At a particular time $t$, the system is in a particular state, where "state" means the generalised coordinates and momenta for a potentially very large number of particles. (I'm interested primarily in classical systems for the sake of this question.) Since this state cannot be known precisely, we consider an ensemble of systems. By integrating each point in this ensemble forward in time (or, more often, by considering what would happen if we were able to perform such an integral), we deduce results about the ensemble's macroscopic behaviour. Using the Hamiltonian formalism is useful in particular because it gives us the concept of phase space volume, which is conserved under time evolution for an isolated system.
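That conservation of phase-space volume (Liouville's theorem) can be illustrated numerically. The sketch below is mine, not part of the original question; it evolves the boundary of a small phase-space region for a unit harmonic oscillator with a symplectic-Euler step, whose one-step Jacobian determinant is exactly one, and checks that the enclosed area does not change.

```python
import numpy as np

def symplectic_euler_step(q, p, h):
    """One symplectic-Euler step for H = p^2/2 + q^2/2 (unit mass and frequency)."""
    p_new = p - h * q       # kick: update momentum using the old position
    q_new = q + h * p_new   # drift: update position using the new momentum
    return q_new, p_new

def polygon_area(q, p):
    """Shoelace area of the closed polygon traced by the ensemble boundary."""
    return 0.5 * abs(np.dot(q, np.roll(p, -1)) - np.dot(p, np.roll(q, -1)))

# Boundary of a small disc-shaped "ensemble" centred at (q, p) = (1, 0).
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
q = 1.0 + 0.1 * np.cos(theta)
p = 0.1 * np.sin(theta)

area0 = polygon_area(q, p)
for _ in range(10_000):                 # integrate every boundary point forward
    q, p = symplectic_euler_step(q, p, h=0.01)

print(area0, polygon_area(q, p))        # the enclosed area is preserved
```

The same check run with plain (non-symplectic) forward Euler would show the area growing steadily, which is one reason the Hamiltonian structure matters for ensembles.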

It seems to me that we could also consider ensembles within the Lagrangian formalism. In this case we would have a probability distribution over initial values of the coordinates (but not their velocities), and another distribution over the final values of the coordinates (but not their velocities). (Actually I guess these would need to be two jointly distributed random variables, since there could easily be correlations between the two.) This would then lead to a probability distribution over the paths the system takes to get from one to the other. I have never seen this Lagrangian approach mentioned in statistical mechanics. I'm curious about whether the idea has been pursued, and whether it leads to any useful results. In particular, I'm interested in whether the idea of phase space volume has any direct meaning in terms of such a Lagrangian ensemble.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
There is the response-function / Martin-Siggia-Rose formalism, which casts a Langevin description into a path-integral picture. See here for a simpler one-particle description. Not sure if this is what you are looking for.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Vijay Murthy
@VijayMurthy that looks interesting, and I'll look into it further. From those handwritten notes it looks like they're starting with some stochastic dynamics and then deriving something that looks like a path integral; whereas I'm hoping for something that starts with a classical Lagrangian and then derives a statistical ensemble based on it. But thanks, and I look forward to taking a closer look.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
I don't know the answer to your question, but must say I am intrigued to see where it leads. Feynman's advice was to try to understand things in as many ways as possible!

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Michael Brown
I concur with Michael Brown, this is a really interesting question! And I had never given the slightest thought to a non-Hamiltonian description of statistical mechanics!

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user user17581
One obvious problem with Lagrangians, though, is that one cannot introduce a Lagrangian for massless particles, whereas a Hamiltonian would still exist. The same problem appears, perhaps, when counting internal degrees of freedom and doing quantum statistics.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Alexey Bobrick
@VijayMurthy For Wilson RG, all spin models get cast into continuous form, which means you get a Landau-Ginzburg-type Lagrangian.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Slaviks
@Slaviks, thanks for the comment. One can write an action functional and need not do an RG. The OP asked for a Lagrangian description, not an RG. The MSR action functional can be written for particle systems too. So I don't get your comment. Perhaps I am missing something.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Vijay Murthy

+ 5 like - 0 dislike

The transition between the Hamiltonian and Lagrangian formalisms in mechanics can be accomplished by means of the Hamilton-Jacobi theory. Consider for example a classical statistical ensemble on a phase space $(x,p)$ defined by:

A. The (initial) state of this ensemble is defined by a distribution function $f(x_0,p_0)$ satisfying the normalization condition:

$\displaystyle{\int f(x_0,p_0) dx_0dp_0 = 1}$

($(x_0,p_0)$ are the initial conditions)

B. The time evolution is governed by the Hamiltonian function $H(x,p, t)$.

According to the Hamilton-Jacobi theory, there exists a Hamilton-Jacobi phase function $S(x_0, x_1, t_0, t_1)$ satisfying the Hamilton-Jacobi equation:

$\displaystyle{\frac{\partial S}{\partial t_1}+H\left(x_1,\frac{\partial S}{\partial x_1}, t_1\right) = 0}$

(where $(x_1,p_1)$ are the coordinates and momenta at the final time $t_1$)

The initial and final momenta can be derived from the Hamilton-Jacobi phase function:

$\displaystyle{p_0 = -\frac{\partial S}{\partial x_0}, \qquad p_1 = \frac{\partial S}{\partial x_1}}$

The problem of expressing the state of the system in terms of the initial and final coordinates thus reduces to a transformation of probability distributions. We can define the state of the system in the initial and final coordinates as:

$\displaystyle{F_t(x_0, x_1) = f\left(x_0,-\frac{\partial S}{\partial x_0}(x_0, x_1, t) \right)}$

The transformation Jacobian is given by:

$\displaystyle{dx_0\, dp_0 = \left|\frac{\partial^2 S}{\partial x_0\,\partial x_1}\right| dx_0\, dx_1}$

And the normalization condition:

$\displaystyle{\int F_t(x_0, x_1) \left|\frac{\partial^2 S}{\partial x_0\,\partial x_1}(x_0, x_1, t)\right| dx_0\,dx_1 = 1}$

In the general case, the joint distribution $F_t(x_0, x_1)$ will not be separable.
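A minimal numeric check of this construction (my own sketch, not part of the original answer) for the free particle $H = p^2/2m$, where the phase function is known in closed form: $S(x_0, x_1, t) = m(x_1-x_0)^2/2t$, so $p_0 = -\partial S/\partial x_0 = m(x_1-x_0)/t$ and $|\partial^2 S/\partial x_0 \partial x_1| = m/t$.

```python
import numpy as np

m, t = 1.0, 1.0   # free particle: S(x0, x1, t) = m * (x1 - x0)**2 / (2 * t)

def p0_of(x0, x1):
    """Initial momentum from the phase function: p0 = -dS/dx0 = m (x1 - x0) / t."""
    return m * (x1 - x0) / t

def f(x0, p0):
    """Initial phase-space density: independent unit Gaussians in x0 and p0."""
    return np.exp(-0.5 * (x0**2 + p0**2)) / (2.0 * np.pi)

x = np.linspace(-8.0, 8.0, 801)
dx = x[1] - x[0]
X0, X1 = np.meshgrid(x, x, indexing="ij")

F = f(X0, p0_of(X0, X1))   # F_t(x0, x1): the state in initial/final coordinates
jac = m / t                # |d^2 S / dx0 dx1| for the free particle

norm = (F * jac).sum() * dx * dx
print(norm)                # ~1.0: the normalization condition holds in (x0, x1)
```

For this Gaussian initial state the joint distribution $F_t(x_0, x_1)$ is visibly non-separable: the two coordinates are correlated through $p_0 = m(x_1 - x_0)/t$.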

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user David Bar Moshe
answered Feb 28, 2013 by (4,195 points)
Hi David, can you tell how this relates to, say, the Ising model away from its critical point?

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Vibert
To elaborate on that question: for a weakly interacting system, the correspondence between Lagrangian and Hamiltonian systems seems unsurprising. But stat. mech. can also easily describe strongly coupled systems (Ising, Potts etc.), and I wouldn't know how they translate to Lagrangians.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Vibert
@Vibert I think this is a very general result, so it will hold regardless of what the interaction terms are. (But note the last sentence: "In the general case, the joint distribution ... will not be separable".) But models like Ising, Potts etc. live in discrete phase spaces, so Lagrangian mechanics doesn't apply to them. (At least, as far as I know, there is no discrete analog of the Lagrangian description.) But that's ok - I was asking about continuous systems, where the Lagrangian and Hamiltonian descriptions can both be used.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
@David also, is there anywhere I can go to read more about this? I mean about the distribution $F_t$ and its properties, rather than about Hamilton-Jacobi theory. Or is this original work on your part? I'm curious about whether this line of thinking leads to any important results.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
@Vibert Spin systems can also be formulated on phase space; for example, a single-spin phase space can be chosen to be a 2-sphere (the collection of all spin directions). This description can also be generalized to the case of many spins, which can be formulated on "non-flat" phase spaces. The phase space is just the collection of initial data of an evolving mechanical system; the Hamiltonian contains the information about the interaction. Thus, the same description applies to both the weakly and the strongly interacting cases.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user David Bar Moshe
@Nathaniel. You are correct, the differentials in the Hamilton-Jacobi equation should be with respect to the end point. Also, the time dependence that I wrote is not the most general. In the case of an explicitly time-varying Hamiltonian, the phase function depends on $t_0$ and $t_1$ and not only on their difference. In this case the time differentiation is also with respect to the end point. In fact you may choose either one of the boundary points.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user David Bar Moshe
@David thanks, that's exactly what I thought. I decided to award you a bounty instead of accepting the answer, because it's still possible that someone will know of a work in which this is applied to statistical mechanics, and I don't want them to be put off by the green tick. Your answer puts me well on the way to working out applications in stat. mech. for myself, so I'm very grateful. Annoyingly, I have to wait 24 hours before I can actually award the bounty to you.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
+ 3 like - 0 dislike

There is a field-theory version of statistical physics, in which the inverse temperature plays the role of the imaginary-time extent. In this way we can formulate the theory as a path integral whose action is determined by a Lagrangian.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Xiao-Qi Sun
answered Mar 5, 2013 by (30 points)
I know about this. I wish I understood it better, but I don't think it's what I'm looking for. At least, the connection between the two ideas is not obvious.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
+ 2 like - 0 dislike

I am not sure if this is what you are up to (it is related to what Xiao-Qi Sun said), but I'll give it a try too ...

At the beginning of Chapter V.2 of his QFT Nutshell, Anthony Zee explains how classical statistical mechanics (characterized by the corresponding partition function involving the Hamilton function) in $d$-dimensional space is related to Euclidean field theory (characterized by the corresponding generating functional or path integral involving the Lagrangian).

To see this relationship, consider for example the Minkowskian path integral of a scalar field

$$(1) \,\, \cal{Z} = \int\cal{D}\phi e^{(i/\hbar)\int d^dx[\frac{1}{2}(\partial\phi)^2-V(\phi)]} = \int\cal{D}\phi e^{(i/\hbar)\int d^dx\cal{L}(\phi)} = \int\cal{D}\phi e^{(i/\hbar)S(\phi)}$$

Upon Wick rotation, the Lagrange density $\cal{L}(\phi)$ turns into the energy density and the action $S(\phi)$ gets replaced by the energy functional $\cal E(\phi)$ of the field $\phi$

$$(2) \,\, \cal{Z} = \int\cal{D}\phi e^{(-1/\hbar)\int d^dx_E[\frac{1}{2}(\partial\phi)^2+V(\phi)]} = \int\cal{D}\phi e^{(-1/\hbar)\cal{E}(\phi)}$$

with

$$\cal E(\phi) = \int d^dx_E\left[\frac{1}{2}(\partial\phi)^2+V(\phi)\right]$$

This can now be compared to the classical statistical mechanics of an $N$-particle system with the energy

$$E(p,q) = \sum_i \frac{1}{2m}p_i^2+V(q_1,q_2,\cdots,q_N)$$

and the corresponding partition function

$$Z = \prod_i\int dp_i dq_i e^{-\beta E(p,q)}$$

Integrating over the momenta $p_i$ one obtains the reduced partition function

$$Z = \prod_i\int dq_i e^{-\beta V(q_1,q_2,\cdots,q_N)}$$

Following the usual procedure to obtain the field theory corresponding to this reduced partition function (letting $i\rightarrow x$, $q_i \rightarrow \phi(x)$, and identifying $\hbar = 1/\beta = k_B T$), one sees that it takes exactly the same form as the Euclidean path integral (2).

So it can finally be seen that, in this example, the (reduced) partition function of an $N$-particle system in $d$-dimensional space corresponds to the path integral of a scalar field in $d$-dimensional spacetime.
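The momentum integration step above is easy to verify numerically. The following sketch is mine (a single particle, with an illustrative mass, temperature, and potential); it checks that integrating out $p$ merely multiplies the configurational integral by the Gaussian factor $\sqrt{2\pi m/\beta}$.

```python
import numpy as np

beta, mass = 2.0, 1.5
V = lambda q: 0.5 * q**2              # illustrative single-particle potential

p = np.linspace(-10.0, 10.0, 1001)
q = np.linspace(-10.0, 10.0, 1001)
dp, dq = p[1] - p[0], q[1] - q[0]

# Full partition function: sum over both momentum and coordinate.
P, Q = np.meshgrid(p, q, indexing="ij")
Z_full = np.exp(-beta * (P**2 / (2.0 * mass) + V(Q))).sum() * dp * dq

# The momentum integral is Gaussian, leaving the reduced configurational sum.
Z_reduced = np.sqrt(2.0 * np.pi * mass / beta) * np.exp(-beta * V(q)).sum() * dq

print(Z_full, Z_reduced)              # the two agree
```

Since the momentum factor is independent of the $q_i$, it drops out of all configurational averages, which is why only $e^{-\beta V}$ survives into the field-theory correspondence.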

These arguments can be further generalized to obtain a path-integral representation of the quantum partition function, finite-temperature Feynman diagrams, and so on ...

If I understand this correctly, this line of thought relating statistical mechanics to field theory is applied, for example, in topics like the nonequilibrium functional renormalization group, or in AdS/CFT to relate the correlation functions on the CFT side to the string amplitudes on the AdS side.

answered May 9, 2013 by (5,440 points)
Thanks. I was asking about formulating classical statistical mechanics in terms of the classical Lagrangian, whereas this links classical statistical mechanics to the quantum Lagrangian approach (i.e. path integrals). But I (somewhat idly) wonder whether there's some kind of deep connection between these two ideas.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
(For some reason my inbox wasn't pinged when you answered. I wonder why.)

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
@Nathaniel I wrote the first version of this post way after midnight, so I had to hide it away first until I had proofread and completed it in the morning ...

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Dilaton
I have recently learned that even for classical systems, such as the Navier-Stokes equations, one can derive some kind of "path" or functional integral. The integration is then done, among other things, over all solutions of the Navier-Stokes equations instead of over all paths; the scalar fields correspond to the components of the velocity, and so on. A corresponding (quite long and ugly) action can be derived in this case too.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Dilaton
Since, for example, turbulence theory can be described by nonequilibrium statistical mechanics (in the MaxEnt formalism, the energy or enstrophy fluxes, which are constant on a scale-invariant subrange, appear as additional relevant variables in the nonequilibrium distribution function), I think there must also be a connection between the functional integrals of the Navier-Stokes equations and certain statistical-mechanics partition functions. But I have not yet seen this worked out in much detail.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Dilaton
@Nathaniel what exactly is the work of Roddy Dewar about? I don't know it ... I just know about MaxEnt for both classical and quantum-mechanical systems from a nonequilibrium statistics course I have taken, and I found this paper quite instructive; how to find a path-integral formulation of the Navier-Stokes equations I learned from this for the first time. The action of the Navier-Stokes equation involves some crazy Grassmann and ghost fields too, ha ha ...

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Dilaton
let us continue this discussion in chat

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
+ 0 like - 0 dislike

The Hamiltonian formulation of classical dynamics gives rise to a very strong and important theorem in statistical mechanics, namely Liouville's theorem. As you probably already know, it states that the probability density $\rho(\mathbf{r}, \mathbf{p})$ of being near a given point $(\mathbf{r}, \mathbf{p})$ in phase space obeys the evolution equation:

$\frac{\partial \rho}{\partial t} = \{H, \rho \}$, where $\{ \cdot,\cdot\}$ denotes the Poisson bracket.

This equation is equivalent to Hamilton's equations of motion for $(\mathbf{r}, \mathbf{p})$.

Now, when you look at macrovariables, it can be worked out (this was first done by Zwanzig, I think) that the Liouville equation (for the microvariables) gives rise to a Fokker-Planck equation for these macrovariables. It is in spirit very similar to the Liouville equation, except that there is a stochastic component whose simplest signature is an additional second spatial derivative on the right-hand side of the evolution equation.

Now, if you know your maths, you also know that any Fokker-Planck equation can be associated with a set of stochastic differential equations for the macrovariables under study (one very famous example being the Langevin equation) ... and we are back to something very close to Hamilton's equations, but for macrovariables.

In case you were wondering whether there is a least-action principle for these stochastic equations, I am not aware of one; I think they are very similar to the Schrödinger equation in this respect. What it does mean, however, is that the macrovariable propagators can be expressed as path integrals; the Wiener measure is one typical case.
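As a concrete illustration of the Fokker-Planck/Langevin link (my own sketch; the potential, temperature, and step sizes are illustrative), an Euler-Maruyama simulation of the overdamped Langevin equation $dx = -V'(x)\,dt + \sqrt{2T}\,dW$ relaxes to the Boltzmann distribution $\rho \propto e^{-V(x)/T}$; for $V(x) = x^2/2$ the stationary variance equals $T$:

```python
import numpy as np

rng = np.random.default_rng(0)
T, dt, n_steps = 0.5, 1e-3, 10_000
x = np.zeros(5_000)                    # ensemble of independent walkers

# Euler-Maruyama for dx = -V'(x) dt + sqrt(2 T) dW with V(x) = x^2 / 2.
for _ in range(n_steps):
    x += -x * dt + np.sqrt(2.0 * T * dt) * rng.standard_normal(x.size)

# Stationary density is Boltzmann, rho ~ exp(-V/T): the variance should be T.
print(x.var(), T)
```

This is the stochastic-equation counterpart of the Fokker-Planck equation mentioned above; a histogram of the walkers reproduces $e^{-V/T}$ directly.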

Note that my answer is focused on Hamilton and Lagrangian dynamics in the classical sense where they were used to compute trajectories in time.

In classical statistical mechanics, you could find a Lagrangian approach akin to what is done in, say, QFT. This would be the Landau-Ginzburg approach to phase transitions and complex systems in general.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user gatsu
answered Mar 23, 2013 by (40 points)
Thanks for the answer, but my question is about whether there is a formalism that uses the least action principle applied to the microvariables to directly construct an ensemble over phase space paths, without first deriving Hamiltonian equations of motion and then a Fokker-Planck equation.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
It sounds difficult for at least one reason: the action is a functional of the generalized coordinates $\mathbf{q}(t)$ only, so there is no such thing as phase space to begin with. Moreover, as far as I know, the Lagrangian formalism is not suited to giving the evolution of an arbitrary function of $\mathbf{q}(t)$, so basically I don't quite see how it would work ... but maybe you have something more precise in mind. Note, however, that there is something called topological entropy that counts the number of possible paths in the system.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user gatsu
The idea is that you would start with a joint distribution of $\mathbf{q}(t_0)$ and $\mathbf{q}(t_1)$, which should then uniquely specify the distribution over paths taken to get from initial to final points. That much is clear, but I'm interested in knowing what follows from such a line of reasoning. One possible application might be to give a concise answer to this question.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user Nathaniel
Ok, I don't know very much about this question, but you might be interested in the work of Fabrice Debbasch. Publication 20, although maybe too simple for you, may give you some hints on the directions to take.

This post imported from StackExchange Physics at 2014-03-17 04:24 (UCT), posted by SE-user gatsu
