Title: Generalized information-entropy measures and Fisher information
Abstract: We show how the well-known role of Fisher information as the fundamental information-geometric object, the metric tensor of a statistical differential manifold, can be derived in a relatively simple manner by directly applying a generalized logarithm and exponential formalism to generalized information-entropy measures. We first briefly describe how the generalization of information-entropy measures arises naturally when this formalism is employed, and recall how the relation between all these information measures is best understood in terms of a particular logarithmic Kolmogorov-Nagumo average. Subsequently, extending the Kullback-Leibler relative entropy to all these measures defined on a manifold of parametrized probability density functions, we obtain the metric, which turns out to be the Fisher information matrix elements multiplied by a real deformation parameter. The metric's independence from the non-extensive character of the system, and its proportionality to the rate of change of the multiplicity under a variation of the statistical probability parameter space, emerge naturally in the frame of this representation.
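The last step, that the metric induced by a deformed relative entropy is the Fisher information times the deformation parameter, can be illustrated numerically. The sketch below is a hypothetical check, not the paper's derivation: it assumes the Tsallis convention for the deformed logarithm, ln_q(u) = (u^(q-1) - 1)/(q-1), and a one-parameter Bernoulli family, neither of which is fixed by the abstract. It computes the second derivative of the generalized relative entropy at coinciding distributions by finite differences and compares it with the exact Fisher information.

```python
import numpy as np

def lnq(u, q):
    # deformed (Tsallis) logarithm; recovers ln(u) in the limit q -> 1
    if abs(q - 1.0) < 1e-12:
        return np.log(u)
    return (u**(q - 1.0) - 1.0) / (q - 1.0)

def rel_entropy_q(p, r, q):
    # generalized relative entropy D_q(p||r) = sum_x p(x) ln_q(p(x)/r(x));
    # reduces to the Kullback-Leibler divergence at q = 1
    return float(np.sum(p * lnq(p / r, q)))

def bernoulli(theta):
    # probability vector (p(x=0), p(x=1)) of a one-parameter Bernoulli family
    return np.array([1.0 - theta, theta])

def induced_metric(theta0, q, h=1e-4):
    # second derivative of theta' -> D_q(p_theta0 || p_theta') at theta' = theta0,
    # by central finite differences: the metric induced by the divergence
    p0 = bernoulli(theta0)
    d = lambda t: rel_entropy_q(p0, bernoulli(t), q)
    return (d(theta0 + h) - 2.0 * d(theta0) + d(theta0 - h)) / h**2

theta0 = 0.3
fisher = 1.0 / (theta0 * (1.0 - theta0))  # exact Fisher information of the family
for q in (0.5, 1.0, 1.7):
    print(q, induced_metric(theta0, q) / fisher)  # ratio is close to q
```

Under these assumptions the ratio of the induced metric to the Fisher information approaches the deformation parameter q, matching the abstract's statement that the metric is the Fisher information matrix times a real multiplicative deformation parameter.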