Last updated: 22/Oct/2010
(This is the first draft of a note to be submitted to a special issue of the Bulletin of the Brazilian Mathematical Society)
In this note, by a geometric structure, we mean a normed vector bundle $A \to M$ over a manifold $M$ (i.e. on each fiber there is a Banach norm, which varies continuously from fiber to fiber), together with a bundle map $\# \colon A \to TM$ from $A$ to the tangent bundle of $M$, which we will call the anchor map.
Many fundamental structures in geometry are indeed geometric structures.
- A Riemannian metric is a geometric structure, where $A = TM$, the anchor map is the identity map, and the norm on $A$ is given by the metric.
- Each vector field $X$ can be viewed as a geometric structure with $A = M \times \mathbb{R}$ (the trivial line bundle) together with a norm (up to isomorphism such a norm is unique). The anchor map sends the section of norm 1 in $A$ to the vector field $X$ in question.
- A foliation or distribution on a manifold $M$ can also be viewed as a geometric structure: the bundle $A$ is a subbundle of $TM$, and the anchor map is the inclusion map. The norm on $A$ gives a metric on the foliation/distribution.
- More generally, a singular foliation or distribution can often be viewed as arising from a geometric structure (one retains only the image of the anchor map, and forgets about the original normed vector bundle $A$).
- If $(M, \Pi)$ is a Poisson manifold, then one can take $A = T^*M$ (the associated cotangent algebroid), with the anchor map being the contraction map with the Poisson tensor $\Pi$, and choose a norm on $T^*M$.
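For concreteness, the anchor map of the cotangent algebroid can be written in local coordinates as follows (this is the standard formula for the contraction with $\Pi$, included here only as an illustration):

```latex
\[
  \Pi = \frac{1}{2} \sum_{i,j} \Pi_{ij}(x)\,
        \frac{\partial}{\partial x_i} \wedge \frac{\partial}{\partial x_j},
  \qquad
  \#(\mathrm{d}x_i) = \Pi(\mathrm{d}x_i, \cdot)
                    = \sum_j \Pi_{ij}(x)\, \frac{\partial}{\partial x_j}.
\]
```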
- Rudolf Clausius (circa 1865) coined the word entropy (for thermodynamics). Of Greek origin: “en” means “internal”, and “trope” means “transformation”. (Degree of mixing, chaos, …)
- Boltzmann (1870s): formula for Clausius’ entropy (entropy of a system is equal to a constant times the logarithm of the number of indistinguishable micro-states which have the same macro-state of the system).
- Shannon (1940s): entropy in information theory, with a formula similar to Boltzmann’s. In Shannon’s theory, entropy means the amount of information.
- Kolmogorov (1950s): defined measure-theoretic entropy (also called metric entropy) for dynamical systems, by a formula similar to those of Boltzmann and Shannon. Kolmogorov’s entropy may be viewed as the speed of loss of information in a (deterministic) dynamical system due to mixing phenomena of the system. Kolmogorov and Sinai used it to distinguish non-isomorphic Bernoulli shifts; the famous converse, that two Bernoulli shifts are isomorphic if and only if they have the same metric entropy, was later proved by Ornstein (1970).
- Adler, Konheim and McAndrew introduced a notion of topological entropy for dynamical systems; it was further studied by Dinaburg, Bowen, and others (1960s–1970s).
- Ghys, Langevin, Walczak (1988): geometric entropy of foliations
- Bis (2004): entropy of regular distributions
Many other definitions of entropy in physics, mathematics, etc.
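For reference, the classical formulas behind the notions listed above are (standard formulas, recalled here only for comparison; $W$ = number of micro-states, $p_i$ = probabilities, $\mathcal{P}$ = a finite measurable partition):

```latex
\[
  S = k_B \log W \ \text{(Boltzmann)}, \qquad
  H = -\sum_i p_i \log p_i \ \text{(Shannon)}, \qquad
  h_\mu(T) = \sup_{\mathcal{P}} \lim_{n \to \infty} \frac{1}{n}
             H_\mu\!\Big( \bigvee_{k=0}^{n-1} T^{-k}\mathcal{P} \Big)
  \ \text{(Kolmogorov–Sinai)}.
\]
```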
The aim of this note is to define an invariant, called (geometric) entropy, for an arbitrary geometric structure, and to study some of its basic properties. Our notion extends those of Ghys, Langevin, Walczak and Bis in a natural way (to a more general situation, with singularities). In particular, for foliations with a metric, our entropy coincides with the geometric entropy of G-L-W.
Definition of entropy
A-admissible paths of speed at most $s$. An A-admissible path on $M$ is a (piecewise-smooth) path $\gamma \colon [0,1] \to M$ which can be lifted to a piecewise-smooth path $a \colon [0,1] \to A$ such that $\pi(a(t)) = \gamma(t)$ for all $t$, where $\pi \colon A \to M$ is the projection map, and such that $\#(a(t)) = \dot\gamma(t)$ (for all $t$ except maybe a finite number of points). The path $\gamma$ is said to have speed at most $s$ if we can choose its A-lift $a$ so that $\|a(t)\| \le s$ (almost everywhere).
Remark. The above definition of A-admissible paths is similar to that of A-paths in the theory of Lie algebroids. The difference is that here we talk about speed, and look at the paths on $M$ itself instead of the lifted paths in $A$.
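To make the definition concrete, consider the vector-field example above (a sketch, assuming the description $A = M \times \mathbb{R}$ with the standard norm and anchor sending the unit section to $X$):

```latex
\[
  \#(x, c) = c\, X(x), \qquad
  \dot\gamma(t) = \sigma(t)\, X(\gamma(t)), \qquad
  |\sigma(t)| \le s \ \text{a.e.}
\]
```

So the A-admissible paths of speed at most $s$ are exactly pieces of flow lines of $X$, traversed forwards or backwards at a rate bounded by $s$; this reversibility is the source of the factor 2 in the comparison with topological entropy.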
For each $x \in M$ we will denote by $\mathcal{P}(x)$ the set of A-admissible paths starting at $x$, and for each $s > 0$ by $\mathcal{P}_s(x)$ the set of A-admissible paths starting at $x$ whose speed does not exceed $s$.

Metric $d_s$. Fix a metric $d$ on $M$ (the entropy that we define will not depend on the choice of $d$). For each $s > 0$ define a new metric $d_s$ on $M$ as follows:

$$d_s(x, y) = \max \left( \sup_{\gamma \in \mathcal{P}_s(x)} \inf_{\delta \in \mathcal{P}_s(y)} \sup_{t \in [0,1]} d(\gamma(t), \delta(t)),\ \sup_{\delta \in \mathcal{P}_s(y)} \inf_{\gamma \in \mathcal{P}_s(x)} \sup_{t \in [0,1]} d(\gamma(t), \delta(t)) \right),$$

i.e. $d_s(x, y)$ is the Hausdorff distance, with respect to the sup-metric on paths, between the families $\mathcal{P}_s(x)$ and $\mathcal{P}_s(y)$. Intuitively, the metric $d_s$ measures how far $x$ and $y$ can get away from each other by A-admissible paths with speed at most $s$.
Lemma. $d_s$ satisfies the axioms of a metric, and moreover $d_s(x, y) \ge d(x, y)$ for any $x, y \in M$ and $s > 0$.
The proof is rather straightforward.
The compact case. Let us first assume that the manifold $M$ is compact (to make the definition simple). For each $\epsilon > 0$ and $s > 0$ denote by $N_\epsilon (s)$ the maximal number of points in $M$ which are pairwise at least $\epsilon$-separated with respect to the metric $d_s$. Then put:

$$h(A) = \lim_{\epsilon \to 0^+} \limsup_{s \to +\infty} \frac{\log N_\epsilon (s)}{s}.$$

By definition, the above number $h(A)$ is the entropy of our geometric structure.
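The separated-set counting in the definition above parallels Bowen's definition of topological entropy for maps. As an illustration only, here is a toy numerical sketch of such a counting for the doubling map on the circle (this is Bowen's discrete-time setting rather than the metric of this note; all names and parameters are ad hoc choices):

```python
import math

def doubling(x):
    """One step of the doubling map x -> 2x (mod 1) on the circle R/Z."""
    return (2.0 * x) % 1.0

def circle_dist(x, y):
    """Distance on the circle R/Z."""
    d = abs(x - y) % 1.0
    return min(d, 1.0 - d)

def bowen_dist(x, y, n):
    """Bowen metric d_n(x, y) = max over 0 <= k < n of dist(f^k x, f^k y)."""
    m = 0.0
    for _ in range(n):
        m = max(m, circle_dist(x, y))
        x, y = doubling(x), doubling(y)
    return m

def count_separated(points, n, eps):
    """Greedy maximal eps-separated subset of `points` in the metric d_n."""
    chosen = []
    for p in points:
        # check the most recently chosen (hence nearest) points first
        if all(bowen_dist(p, q, n) >= eps for q in reversed(chosen)):
            chosen.append(p)
    return len(chosen)

grid = [i / 2048.0 for i in range(2048)]
eps = 0.1
counts = {n: count_separated(grid, n, eps) for n in (3, 6)}
# exponential growth rate of the separated-set counts
rate = (math.log(counts[6]) - math.log(counts[3])) / (6 - 3)
print(counts, rate)
```

For the doubling map the computed growth rate approaches $\log 2 \approx 0.693$ as the grid is refined and the time window is lengthened.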
The noncompact case. When $M$ is not compact, we put

$$h(A) = \sup_{K} \lim_{\epsilon \to 0^+} \limsup_{s \to +\infty} \frac{\log N_\epsilon (s, K)}{s},$$

where $K$ runs over the precompact subsets of $M$, and $N_\epsilon (s, K)$ denotes the maximal number of points in $K$ which are pairwise at least $\epsilon$-separated with respect to $d_s$.
The local case. Here $B$ is a compact subset of $M$ (for example $B$ is just a point), and we put

$$h(A, B) = \inf_{U} \lim_{\epsilon \to 0^+} \limsup_{s \to +\infty} \frac{\log N_\epsilon (s, \overline{U})}{s},$$

where $U$ runs over the precompact neighborhoods of $B$ in $M$, and $N_\epsilon (s, \overline{U})$ denotes the maximal number of points in $\overline{U}$ which are pairwise at least $\epsilon$-separated with respect to $d_s$.
Comparison to previous notions of entropy
Theorem. In the case of a regular foliation with a Riemannian metric, the above definition is equivalent to the definition of geometric entropy by Ghys–Langevin–Walczak.
Theorem. In the case of vector fields, the above definition gives the topological entropy (multiplied by 2).
Theorem. In the case of regular distributions, the above definition is equivalent to the definition given by Bis.
Zero vs positive entropy
Theorem. The entropy depends on the norm of $A$ (the bigger the norm, the bigger the entropy, and if we multiply the norm by a positive constant, then the entropy will be multiplied by the same constant). However, whether it is zero or non-zero does not depend on the choice of the norm.
Theorem. If the anchor map is surjective, then the entropy is zero. In particular, symplectic structures have zero entropy.
Theorem. If a regular distribution satisfies the bracket-generating condition (and hence is controllable), then its entropy is 0. (The case of contact structures was proved by Bis.)
Proof: use the ball-box theorem in the general case.
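The idea can be sketched as follows (assuming the standard ball-box estimate for a bracket-generating distribution of step $r$):

```latex
\[
  d_{CC}(x, y) \;\le\; C\, d(x, y)^{1/r},
\]
```

where $d_{CC}$ is the Carnot–Carathéodory distance of the distribution. Thus nearby points reach each other by short admissible paths, so an admissible path from $x$ can be shadowed by one from $y$ at a uniformly small extra cost; this keeps $d_s(x, y)$ small whenever $d(x, y)$ is small, uniformly in $s$, so $N_\epsilon(s)$ stays bounded as $s \to \infty$ and the entropy vanishes.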
(to be added)
Entropy of Poisson structures
In dynamics, integrable systems often have zero entropy.
However, zero entropy and integrability for Poisson manifolds are two different notions.
Example: an integrable Poisson structure with positive entropy (take the wedge product of a hyperbolic vector field with a constant vector field in an additional dimension).
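The example can be written explicitly (a sketch under the following assumptions: $X$ is a vector field with positive topological entropy, e.g. an Anosov vector field, on a compact manifold $N$, and $t$ is the coordinate on an additional circle factor):

```latex
\[
  \Pi = X \wedge \frac{\partial}{\partial t}
  \qquad \text{on } N \times \mathbb{S}^1 .
\]
```

Since $[X, \partial/\partial t] = 0$, the bivector $\Pi$ is Poisson; it is regular of rank 2 where $X \ne 0$, and integrable, while the dynamics of $X$ along the leaves forces positive entropy.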
Example: non-integrable Poisson structures whose leaves are … (to be completed).