Entropy of geometric structures

 

Last updated: 22/Oct/2010

(This is the first draft of a note to be submitted to a special issue of the Bulletin of the Brazilian Mathematical Society)

Geometric structures

In this note, by a geometric structure we mean a normed vector bundle A \to M over a manifold M (i.e., each fiber carries a Banach norm, which varies continuously from fiber to fiber), together with a bundle map \sharp: A \to TM from A to the tangent bundle of M, which we will call the anchor map.

Many fundamental structures in geometry are indeed geometric structures.

Examples:

- A Riemannian metric is a geometric structure, where A = TM, the anchor map is the identity map, and the norm on A is given by the metric.

- Each vector field can be viewed as a geometric structure with A = M \times \mathbb{R} together with a norm (up to isomorphism such a norm is unique). The anchor map sends the section of norm 1 in A to the vector field in question. (See the computational sketch after this list.)

- A foliation or distribution on a manifold can also be viewed as a geometric structure: the bundle A is a subbundle of TM, and the anchor map is the inclusion map. The norm on A gives a metric on the foliation/distribution.

- More generally, a singular foliation or distribution can often be viewed as arising from a geometric structure (where one retains only the image of the anchor map and forgets about the original normed vector bundle A).

- If (M,\Pi) is a Poisson manifold, then one can take A = T^*M (the associated cotangent algebroid), with the anchor map being contraction with the Poisson tensor \Pi, and choose a norm on T^*M.
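To make the definition concrete, here is a minimal computational sketch (in Python; all names are ours, not from any existing library) of a geometric structure in a single coordinate chart: a fiber norm together with an anchor map, instantiated for the vector-field example above.

import numpy as np
from dataclasses import dataclass
from typing import Callable

@dataclass
class GeometricStructure:
    """A geometric structure in one coordinate chart (illustrative sketch only).

    fiber_dim : rank of the bundle A
    norm      : the Banach norm on the fiber A_x, allowed to vary with x
    anchor    : the anchor map A_x -> T_x M, linear in the fiber variable
    """
    fiber_dim: int
    norm: Callable[[np.ndarray, np.ndarray], float]         # (x, a) -> ||a||_x
    anchor: Callable[[np.ndarray, np.ndarray], np.ndarray]  # (x, a) -> #a

def vector_field_structure(X: Callable[[np.ndarray], np.ndarray]) -> GeometricStructure:
    """The example A = M x R: the anchor sends the section of norm 1 to X."""
    return GeometricStructure(
        fiber_dim=1,
        norm=lambda x, a: abs(float(a[0])),      # standard norm on the R fiber
        anchor=lambda x, a: float(a[0]) * X(x),  # a . X(x), linear in a
    )

# Usage: the structure attached to a hyperbolic linear vector field on R^2.
gs = vector_field_structure(lambda x: np.array([x[0], -x[1]]))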

Entropy

- Rudolf Clausius (circa 1865) coined the word entropy (for thermodynamics), from Greek roots: “en” meaning “internal” and “trope” meaning “transformation”. (Degree of mixing, chaos, …)

- Boltzmann (1870s): the formula S = k \ln \Omega for Clausius’ entropy (the entropy of a system equals a constant times the logarithm of the number of indistinguishable micro-states realizing the same macro-state of the system).

- Shannon (1940s): entropy in information theory, with a formula similar to Boltzmann’s. In Shannon’s theory, entropy measures the amount of information.

- Kolmogorov (1950s): defined measure-theoretic entropy (also called metric entropy) for dynamical systems, by a formula similar to those of Boltzmann and Shannon. Kolmogorov’s entropy may be viewed as the rate at which a (deterministic) dynamical system loses information due to its mixing behavior. Kolmogorov and Sinai used it to distinguish non-isomorphic Bernoulli shifts; the converse, that two Bernoulli shifts with the same metric entropy are isomorphic, is a famous theorem of Ornstein (1970).

- Adler, Konheim, and McAndrew (1965) introduced a notion of topological entropy for dynamical systems; Dinaburg, Bowen, and others studied it further (late 1960s-1970s).

- Ghys, Langevin, Walczak (1988): geometric entropy of foliations

- Bis (2004): entropy of regular distributions

There are many other definitions of entropy in physics, mathematics, etc.

The aim of this note is to define an invariant, called (geometric) entropy, for an arbitrary geometric structure, and to study some of its basic properties. Our notion extends those of Ghys-Langevin-Walczak and Bis in a natural way (to a more general situation, allowing singularities). In particular, for foliations with a metric, our entropy coincides with the geometric entropy of Ghys-Langevin-Walczak.

Definition of entropy

A-admissible paths of speed at most r. An A-admissible path on M is a (piecewise-smooth) path \gamma: [0,1] \to M which can be lifted to a piecewise-smooth path \hat{\gamma}: [0,1] \to A such that \pi(\hat{\gamma}(t)) = \gamma(t) for all t, where \pi: A \to M is the projection map, and such that \sharp \hat{\gamma} = d\gamma/dt (for all t except possibly finitely many points). The path is said to have speed at most r if its A-lift \hat{\gamma} can be chosen so that \|\hat{\gamma}\| \leq r (almost everywhere).

Remark. The above definition of A-admissible paths is similar to that of A-paths in the theory of Lie algebroids. The difference is that here we keep track of the speed, and we look at the paths on M instead of the lifted paths.

For each x \in M and r \geq 0 we will denote by P(x) the set of A-admissible paths starting at x, and by P_r(x) the set of A-admissible paths starting at x whose speed does not exceed r.
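For numerical experiments one can discretize the elements of P_r(x): choose a piecewise-constant lift of norm at most r and integrate the anchored velocity field by Euler steps. A rough sketch (building on the hypothetical GeometricStructure sketch above; the step count and the use of Euler integration are arbitrary choices):

import numpy as np

def admissible_path(gs, x0, controls, r, steps=100):
    """Approximate an A-admissible path in P_r(x0) by Euler integration.

    gs       : a GeometricStructure as in the earlier sketch
    controls : array of shape (k, fiber_dim); each row is a constant value of
               the lift hat-gamma on a subinterval, clipped to norm <= r
    Returns the sampled path on [0, 1], an array of shape (steps + 1, dim M).
    """
    dt = 1.0 / steps
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for i in range(steps):
        a = np.array(controls[i * len(controls) // steps], dtype=float)
        na = gs.norm(x, a)
        if na > r:                      # enforce speed at most r
            a = a * (r / na)
        x = x + dt * gs.anchor(x, a)    # d(gamma)/dt = #(hat-gamma)
        path.append(x.copy())
    return np.array(path)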

Metric d_r. Fix a metric d on M (the entropy that we define will not depend on the choice of d). For each r > 0, define a new metric d_r on M as follows:

d_r(x,y) = \sup_{\gamma \in P_r(x)} \inf_{\lambda \in P_r(y)} \sup_{t \in [0,1]} d(\gamma(t),\lambda(t)) + \sup_{\lambda \in P_r(y)} \inf_{\gamma \in P_r(x)} \sup_{t \in [0,1]}  d(\gamma(t),\lambda(t))

Intuitively, the metric d_r measures how far x and y can get away from each other along A-admissible paths of speed at most r.
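Given finite samples of P_r(x) and P_r(y) (e.g. produced by admissible_path above from random controls), the sup-inf-sup expression for d_r can be evaluated directly. A sketch, and only an approximation, since the true suprema and infima run over infinitely many paths:

import numpy as np

def d_r_estimate(paths_x, paths_y, d=lambda p, q: float(np.linalg.norm(p - q))):
    """Approximate d_r(x, y) from finite families of sampled admissible paths.

    paths_x, paths_y : lists of arrays of shape (T, dim), sampled from
                       P_r(x) and P_r(y) at the same times t_0, ..., t_{T-1}
    """
    def sup_t(g, l):
        # sup over t of d(gamma(t), lambda(t)) along the sampled times
        return max(d(p, q) for p, q in zip(g, l))

    def one_sided(ps, qs):
        # sup over gamma of inf over lambda of sup_t d(gamma(t), lambda(t))
        return max(min(sup_t(g, l) for l in qs) for g in ps)

    return one_sided(paths_x, paths_y) + one_sided(paths_y, paths_x)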

Lemma. d_r satisfies the axioms of a metric, and moreover d_r \geq d_s for any r \geq s \geq 0, and d_0 = 2d (for r = 0 the only admissible paths are the constant ones, so each of the two terms in the definition reduces to d(x,y)).

The proof is rather straightforward.

The compact case. Let us first assume that the manifold M is compact (to keep the definition simple). For each \epsilon > 0, denote by N_\epsilon(r) the maximal number of points in M which are pairwise \epsilon-separated with respect to the metric d_r. Then put:

h_\epsilon = \limsup_{r \to \infty} \frac{1}{r} \ln N_\epsilon(r)

and

h = \lim_{\epsilon \to 0+} h_\epsilon

By definition, the above number h is the entropy of our geometric structure.
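Numerically, N_\epsilon(r) can be bounded from below by greedily collecting points of M that are pairwise \epsilon-separated in (an approximation of) d_r, and h_\epsilon read off from the growth rate of \ln N_\epsilon(r) in r. A sketch under the same finite-sample caveats as above:

import numpy as np

def count_separated(points, eps, dist):
    """Greedy lower bound for the maximal eps-separated subset w.r.t. dist."""
    chosen = []
    for p in points:
        if all(dist(p, q) >= eps for q in chosen):
            chosen.append(p)
    return len(chosen)

def entropy_estimate(points, eps, dist_at_r, rs):
    """Estimate h_eps = limsup_{r -> infty} (1/r) ln N_eps(r).

    dist_at_r(r) should return a two-point distance function approximating
    d_r, e.g. built from sampled admissible paths as in the sketches above;
    the max over the finite list rs stands in for the limsup.
    """
    rates = [np.log(max(count_separated(points, eps, dist_at_r(r)), 1)) / r
             for r in rs]
    return max(rates)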

The noncompact case. If M is not compact, we put:

h(M) = \sup_U h(U)

where the supremum is taken over precompact subsets U of M.

The local case. Here B is a compact subset of M (for example, B may be a single point), and we put:

h_{local}(B) = \inf_U h(U)

where the infimum is taken over precompact neighborhoods U of B.

Comparison to previous notions of entropy

Theorem. In the case of a regular foliation with a Riemannian metric, the above definition is equivalent to the definition of geometric entropy by Ghys-Langevin-Walczak.

Theorem. In the case of vector fields, the above definition gives the topological entropy multiplied by 2. (The factor 2 should reflect the fact that A-admissible paths may follow the flow both forward and backward in time.)

Theorem. In the case of regular distributions, the above definition is equivalent to the definition given by Bis.

Zero vs positive entropy

Theorem. The entropy depends on the norm on A; in particular, if we multiply the norm by a positive constant c, then the entropy is divided by c. However, whether the entropy is zero or positive does not depend on the choice of the norm.
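As a consistency check for the scaling claim, note that if the norm is replaced by \|\cdot\|' = c\,\|\cdot\| for a constant c > 0, then a lift satisfies \|\hat{\gamma}\|' \leq r if and only if \|\hat{\gamma}\| \leq r/c, so that

P'_r(x) = P_{r/c}(x), \quad d'_r = d_{r/c}, \quad N'_\epsilon(r) = N_\epsilon(r/c),

and therefore

h'_\epsilon = \limsup_{r \to \infty} \frac{1}{r} \ln N_\epsilon(r/c) = \limsup_{s \to \infty} \frac{1}{cs} \ln N_\epsilon(s) = \frac{h_\epsilon}{c}.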

Theorem. If the anchor map is surjective, then the entropy is zero. In particular, symplectic structures have zero entropy.

Theorem. If a regular distribution satisfies the bracket-generating condition (and hence is controllable), then its entropy is zero. (The case of contact structures was proved by Bis.)

Proof: use the ball-box theorem in the general case.

Sub-additivity property

(to be added)

Entropy of Poisson structures

In dynamics, integrable systems often have zero entropy.

However, zero entropy and integrability are two different notions for Poisson manifolds.

Example: an integrable Poisson structure with positive entropy (take the wedge product of a hyperbolic vector field with a constant vector field in an additional dimension; see the formula below).
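Concretely (our reading of this construction): if X is a hyperbolic vector field on a compact manifold N, one may take on M = N \times \mathbb{R} the bivector

\Pi = X \wedge \frac{\partial}{\partial s},

where s is the coordinate on the \mathbb{R} factor. Since X commutes with \partial/\partial s, this bivector is Poisson, and the positive topological entropy of X is what should make the entropy of the associated geometric structure positive.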

Example: non-integrable Poisson structures whose leaves are spheres S^2 in \mathbb{R}^3.
