BIOLOGICAL SCIENCES

Bits and q-bits as versatility measures

José R.C. Piqueira

Escola Politécnica, Universidade de São Paulo, Av. Prof. Luciano Gualberto, trav. 3-158, 05508-900 São Paulo, SP, Brasil

ABSTRACT

Using Shannon information theory is a common strategy to measure any kind of variability in a signal or phenomenon. Some methods were developed to adapt information entropy measures to bird song data, trying to emphasize its versatility aspect. This classical approach, using the concept of the bit, produces interesting results. Now, the original idea developed in this paper is to use quantum information theory and the quantum bit (q-bit) concept in order to provide a more complete view of the experimental results.

Key words: bit, complexity, entropy, q-bit, measure, versatility.

RESUMO

Usar a teoria da informação de Shannon é uma estratégia comum para medir todo tipo de variabilidade em um sinal ou fenômeno. Alguns métodos foram desenvolvidos para adaptar a medida de entropia informacional a dados de cantos de pássaro, tentando enfatizar seus aspectos de versatilidade. Essa abordagem clássica, usando o conceito de bit, produz resultados interessantes. Agora, a idéia original desenvolvida neste artigo é usar a teoria quântica da informação e o conceito de q-bit, com a finalidade de proporcionar uma visão mais completa dos resultados experimentais.

Palavras-chave: bit, complexidade, entropia, q-bit, medida, versatilidade.

INTRODUCTION

The first papers suggesting informational entropy as a biological complexity measure appeared in the 1970s (Saunders and Ho 1976), and several applications of these ideas have been implemented with good results (Silva et al. 2000, Mazza et al. 2002). The main hypothesis of this approach is that the systems present probabilistic behavior, implying the construction of σ-algebras of events and of probability measures over these σ-algebras. It is a probabilistic view of nature, and it is supposed that, in spite of the probabilistic behavior, a search for regularities is possible. However, the hypothesis that the temporal evolution of natural systems happens in this way can be complemented with quantum mechanics concepts.

Under the quantum mechanics hypothesis, information is measured in quantum bits (q-bits) (Hirvensalo 2001). Quantum systems may present self-organized and complex behavior, and using q-bits to evaluate complexity seems to be an interesting way to study versatility.

This paper starts by showing how to use classical information theory to evaluate complexity, discussing some limitations of this approach. Then, preliminary ideas on q-bits as a versatility measure are presented. The commitment here is only to explaining the mathematical formulae and the way to use them to model a set of biological data. There are many papers on the application of classical techniques, but quantum information models are only beginning to appear (Fagali and Piqueira 2003 unpubl.).

CLASSICAL INFORMATIONAL APPROACH

The idea of using information theory to evaluate biological complexity developed strongly in the mid-1970s, when researchers tried to collect the relevant data about a process or system and to calculate the memory capacity, measured in bits, necessary to record them (Saunders and Ho 1976, Papentin 1980).

Thinking about a system or a process as a source of data, one can propose a probability measure by creating the σ-algebra of events over the data and, consequently, define an individual information measure, in bits, associated with the σ-algebra (Khinchin 1957).

In the case of bird songs, the repertoire is a natural atomic partition for constructing the σ-algebra of events (Silva et al. 2000) and, taking the mean value of the individual information over the possible events, the informational entropy is defined in bits per symbol. With this measure one can calculate the optimum code length, in bits, and, multiplying it by the total number of events, evaluate the memory capacity necessary to record the data related to the system or process (Khinchin 1957, Saunders and Ho 1976, Lint 1982, Piqueira 1994).

Therefore, observing the evolution of the informational entropy is equivalent to measuring how the necessary memory capacity varies over time, and one can estimate how complex the system or process is becoming, as suggested by some previous works (Piqueira 1994, Rieke et al. 1997, Mazza et al. 2002).

For a given process or system, the measured phenomenon has to be divided into several discrete bands, creating the σ-algebra of events. Taking the relative frequency $p(x_i)$ of each band $x_i$ as its probability, the individual information associated with each band is given by:

$$I(x_i) = -\log_2 p(x_i).$$

Consequently, the informational entropy, in bits/symbol, associated with the phenomenon or system is given by:

$$E = -\sum_{i=1}^{n} p(x_i)\,\log_2 p(x_i).$$
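As a concrete illustration of these definitions, the following Python sketch computes the individual information of each band, the entropy E, and the memory capacity estimate mentioned above from a list of band counts. It is a minimal sketch with hypothetical counts; the function and variable names are illustrative and not taken from the cited works.

```python
import numpy as np

def entropy_from_counts(counts):
    """Informational entropy E (bits/symbol) and individual information per band."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()          # relative frequencies taken as probabilities
    info = -np.log2(p)                 # individual information I(x_i), in bits
    return float(np.sum(p * info)), info

# Hypothetical repertoire: occurrence counts of each note type (band).
counts = [40, 25, 20, 10, 5]
E, info = entropy_from_counts(counts)
memory_bits = E * sum(counts)          # entropy times number of events ~ memory needed
print(f"E = {E:.3f} bits/symbol, memory ~ {memory_bits:.0f} bits")
```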

Considering bioacoustic measures, McCowan et al. (1999) studied the whistle repertoire of the Bottlenose Dolphin Tursiops truncatus, and the informational entropy was measured for several individuals in a population.

The same ideas were applied to bird songs by Silva et al. (2000), but some improvements were introduced. Considering that the notes belonging to the repertoire do not all have the same duration, the band of the song of each individual was estimated by defining the mean time of the notes (t) as:

Then, the bandwidth (H) of each individual was defined by:

This kind of approach therefore provides interesting ways, using the parameters E and H, of comparing the songs of several individuals belonging to a population.
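A minimal numerical sketch of how E and H could be obtained for one individual is given below. It assumes that the mean time of the notes (t) is the frequency-weighted mean of the note durations and that the bandwidth H is the corresponding note rate 1/t; these are illustrative assumptions, not necessarily the exact definitions adopted by Silva et al. (2000), and the data are hypothetical.

```python
import numpy as np

# Hypothetical data for one individual: counts and durations (s) of each note type.
counts = np.array([40, 25, 20, 10, 5], dtype=float)
durations = np.array([0.12, 0.20, 0.15, 0.30, 0.25])   # seconds

p = counts / counts.sum()              # relative frequencies
E = float(-np.sum(p * np.log2(p)))     # informational entropy (bits/symbol)
t = float(np.sum(p * durations))       # assumed: frequency-weighted mean note time
H = 1.0 / t                            # assumed: bandwidth taken as note rate (notes/s)
print(f"E = {E:.3f} bits/symbol, t = {t:.3f} s, H = {H:.2f} notes/s")
```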

QUANTUM INFORMATIONAL APPROACH

Although approaches using classical information theory are useful and produce plausible interpretations of biological data, a complementary approach, based on the quantum bit (q-bit) concept, can model how the several parts of the whole system depend on each other.

The idea is to consider that the process or system presents quantum behavior over the same measurement bands discussed in the classical model. Each band is associated with a pure quantum state (Hirvensalo 2001), determining a basis in a Hilbert space.

From now on, the tools for analyzing the biological phenomena are developed by defining the state vector and the unitary operator, i.e. a linear operator represented by a matrix whose conjugate transpose is its inverse.

The dynamics determined by these operators will be used to describe the temporal evolution of the system under study.

Then, considering an n-dimensional Hilbert space with a basis $(|x_1\rangle, |x_2\rangle, |x_3\rangle, \dots, |x_n\rangle)$, each state is represented by:

$$|x\rangle = \sum_{i=1}^{n} a_i\,|x_i\rangle.$$

The complex numbers $a_i$ are the amplitudes of the state representation, and the product $a_i a_i^{*} = |a_i|^2$ is the probability of the state x being represented by the pure state $|x_i\rangle$. Consequently:

$$\sum_{i=1}^{n} |a_i|^2 = 1.$$

Following this procedure, a quantum system of n levels can be represented and its dynamics is given by how the state x varies as time passes.
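As an illustration, the state of an n-level system can be stored as a complex amplitude vector. The sketch below builds such a vector from measured band probabilities, taking real amplitudes $a_i = \sqrt{p_i}$ (one of many possible choices, assumed only for this example), and checks the normalization condition.

```python
import numpy as np

def state_from_probabilities(p):
    """Amplitude vector a with |a_i|^2 = p_i (real, non-negative amplitudes assumed)."""
    p = np.asarray(p, dtype=float)
    return np.sqrt(p / p.sum()).astype(complex)

a = state_from_probabilities([0.4, 0.25, 0.2, 0.1, 0.05])
probs = np.abs(a) ** 2                 # probabilities of each pure state |x_i>
assert np.isclose(probs.sum(), 1.0)    # sum of |a_i|^2 equals 1
print(probs)
```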

Considering that, between two consecutive instants, the state varies from x to x', given by:

$$|x'\rangle = \sum_{i=1}^{n} a_i'\,|x_i\rangle,$$

the state transition can be represented by an n × n unitary matrix A, such that:

$$|x'\rangle = A\,|x\rangle.$$
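The transition between consecutive instants can then be simulated by multiplying the amplitude vector by a unitary matrix. In the sketch below the matrix is a random unitary obtained from a QR decomposition, used only as a stand-in; in practice A would have to be estimated from the measured dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    """A random n x n unitary matrix via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))          # adjust column phases; result is still unitary

n = 5
A = random_unitary(n)
x = np.zeros(n, dtype=complex)
x[0] = 1.0                              # start in the pure state |x_1>
x_next = A @ x                          # |x'> = A |x>
assert np.isclose(np.linalg.norm(x_next), 1.0)   # unitarity preserves normalization
```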

If two quantum systems are simultaneously represented using the bases $(|x_1\rangle, |x_2\rangle, |x_3\rangle, \dots, |x_n\rangle)$ and $(|y_1\rangle, |y_2\rangle, |y_3\rangle, \dots, |y_m\rangle)$, the composed Hilbert space $H_{nm}$, with dimension $n \cdot m$, has its states K represented in the tensor product basis (Schouten 1951):

$$|K\rangle = \sum_{i=1}^{n} \sum_{j=1}^{m} c_{ij}\,|x_i\rangle \otimes |y_j\rangle.$$

If a composed state K can be written as a product of individual states,

$$|K\rangle = |x\rangle \otimes |y\rangle,$$

it is decomposable. If not, it is called entangled.
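Numerically, whether a composite state is decomposable or entangled can be checked with a Schmidt (singular value) decomposition: reshaping the n·m amplitude vector into an n x m matrix, the state is a product state exactly when that matrix has rank one. The sketch below uses two hypothetical two-level states as examples.

```python
import numpy as np

def is_decomposable(K, n, m, tol=1e-10):
    """True if the composite state K (length n*m) factors as |x> tensor |y>."""
    s = np.linalg.svd(np.asarray(K, dtype=complex).reshape(n, m), compute_uv=False)
    return int(np.sum(s > tol)) == 1    # Schmidt rank 1 <=> decomposable

x = np.array([1, 1], dtype=complex) / np.sqrt(2)
y = np.array([1, 0], dtype=complex)
product = np.kron(x, y)                                      # |x> tensor |y>: decomposable
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)    # maximally entangled example
print(is_decomposable(product, 2, 2))   # True
print(is_decomposable(bell, 2, 2))      # False
```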

This is a new way of describing experiments in bioacoustics which: 1) considers a population of several individuals and, using the same techniques of classical information theory, writes down the evolutionary state of their measures; 2) considers the population as a whole and verifies whether the state of the population is entangled or decomposable. This reasoning may provide new ways of interpreting experimental results.
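One possible concrete reading of this procedure, sketched below under the same assumptions as the previous examples (probabilities turned into real amplitudes, hypothetical data and names), is: build a state vector for each individual from its measured band probabilities, form the population state, and test a jointly estimated population state for decomposability with the Schmidt-rank criterion above.

```python
import numpy as np

def state_from_probabilities(p):
    p = np.asarray(p, dtype=float)
    return np.sqrt(p / p.sum()).astype(complex)

# Hypothetical band probabilities measured for two individuals of a population.
bird_a = state_from_probabilities([0.5, 0.3, 0.2])
bird_b = state_from_probabilities([0.6, 0.4])

population = np.kron(bird_a, bird_b)   # decomposable by construction (tensor product)
# A joint state estimated from simultaneous recordings would replace `population`
# here and could then be tested with is_decomposable(population, 3, 2).
```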

ACKNOWLEDGMENTS

Supported by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico).

Manuscript received on January 15, 2004; accepted for publication on February 5, 2004.

E-mail: piqueira@lac.usp.br

  • HIRVENSALO M. 2001. Quantum Computation. Berlin: Springer Verlag.
  • KHINCHIN AI. 1957. Mathematical Foundations of Information Theory. New York: Dover.
  • LINT JH VAN. 1982. Introduction to Coding Theory. Berlin: Springer Verlag.
  • MAZZA M, PINHO M, PIQUEIRA JRC AND ROQUE AC. 2002. Using information theory for the analysis of cortical reorganization in a realistic computational model of the somatosensory system. Neurocomputing 46: 923-928.
  • MCCOWAN B, HANSER SF AND DOYLE LR. 1999. Quantitative tools for comparing animal communication systems: information theory applied to Bottlenose Dolphin whistle repertoires. Anim Behav 57: 409-419.
  • PAPENTIN F. 1980. On order and complexity, I: general considerations. J theor Biol 87: 421-456.
  • PIQUEIRA JRC. 1994. Structural and functional complexity: an informational approach. San Antonio, Texas: Proc IEEE Conf on Systems, Man and Cybernetics, p. 1974-1978.
  • RIEKE F, WARLAND D, STEVENINCK RR AND BIALEK W. 1997. Spikes: Exploring the Neural Code. Cambridge, Mass.: MIT Press.
  • SAUNDERS PT AND HO MW. 1976. On the increase of complexity in evolution. J theor Biol 63: 375-384.
  • SCHOUTEN JA. 1951. Tensor Analysis for Physicists. Oxford: Clarendon Press.
  • SILVA ML, PIQUEIRA JRC AND VIELLIARD JME. 2000. Using Shannon entropy on measuring the individual variability in the Rufous-bellied Thrush Turdus rufiventris vocal communication. J theor Biol 207: 57-64.
