
The notion of process in nonstandard theory and in Whiteheadian metaphysics




Messologgiou 66, 26222. Patras. GREECE. livadas@math.upatras.gr

ABSTRACT

In this article I intend to show that certain aspects of A.N. Whitehead's philosophy of organism, and especially his epochal theory of time as mainly exposed in his well-known work Process and Reality, can serve to clarify the underlying assumptions that shape nonstandard mathematical theories as such and also as metatheories of quantum mechanics. Concerning the latter issue, I point to an already significant body of research on nonstandard versions of quantum mechanics; two of these approaches are chosen to be critically presented in relation to the scope of this work. The main point of the paper is that, insofar as we can refer a nonstandard mathematical entity to a kind of axiomatical formalization essentially 'codifying' an underlying mental process indescribable as such by analytic means, we can possibly apply certain principles of Whitehead's metaphysical scheme focused on the key notion of process, which is generally conceived as the becoming of actual entities. This is done in the sense of a unifying approach to provide an interpretation of nonstandard mathematical theories as such and also, in their metatheoretical status, as a formalization of the empirical-experimental context of quantum mechanics.

Keywords: Actual entity. Concrescence. Coordinate division. Divisibility relation. Genetic division. Infinitesimal. Nonstandard hull. Prehension. Process. Standard. Ultra eigenvector.

1 Introduction - Preliminaries

This work is an original attempt to provide some clues to a connection, on the interpretational level, between A.N. Whitehead's philosophy of organism mainly exposed in his opus Process and Reality, Whitehead (1978), and the underlying roots of the axiomatical foundation of nonstandard mathematical analysis taken as such and also as a supportive metatheory in certain nonstandard alternatives of quantum mechanical theory. It should be noted that the relation between Whitehead's philosophy of organism and quantum mechanics in general has already been the object of research with various claims as to their mutual relevance in Epperson (2004), Folse (1974) and Shimony (1965); M. Epperson's work in Epperson (2004) will be a major reference source in this respect.

My guiding motivation will be the key notion of the Whiteheadian philosophy of organism which is that of process, in the Category of the Ultimate, described as the becoming of actual entities (termed also actual occasions). In M. Epperson's interpretation of quantum mechanics, this is characterized as a clear ontological principle, in the sense that "every fact is a determinant in the becoming of every new fact, such that the evolution of any fact entails both temporally prior facts and logically prior potentia as data, and an integration of these data that is unique to that evolution." (Epperson (2004), p. 120). To the extent that the Whiteheadian process entails a metaphysical character inasmuch as it is associated with an actual entity as the outcome of a real concrescence of a multiplicity of potentia, otherwise indescribable except in its outcome (or in terms of the coordinate division of 'satisfaction'), it will be associated on the interpretational level with the underlying assumptions in axiomatizing the existence of nonstandard entities.

In doing so, except for a brief but (hopefully) meaningful reference to some basic principles of the Whiteheadian cosmological scheme below, there will also be a brief reference to the theoretical context of nonstandard mathematical analysis in its two main ramifications, the extensional part (A. Robinson, E. Zakon) and the intensional part (mainly E. Nelson's Internal Set Theory), in section 3. Further, in section 4, I will employ certain notions of the Whiteheadian cosmological scheme to provide an interpretation of two versions of nonstandard quantum mechanics, namely those presented in the pioneering work of M. O. Farrukh in Farrukh (1975) and in A. Raab's work in Raab (2004). As a matter of fact, sections 3 and 4 contain some nonstandard formalism and terminology, considered vital in grounding my overall arguments, which is nevertheless not absolutely necessary for comprehending the ensuing discussion should the reader lack the relevant background. In the build-up of my arguments I will make some parallel references to corresponding notions of Husserlian phenomenology, while in section 2, I will associate A.N. Whitehead's theory of extension, an organic part of his overall doctrine, with the question of incommensurability of events in quantum mechanics, based on Y. Tanaka's work in Tanaka (2004).

I will mainly rely on the following principles of A.N. Whitehead's philosophy of organism to the extent that they can be linked with the 'underlying semantics' of nonstandard mathematical theory as such and also in its merit as a formal metatheory of quantum mechanics. A greater emphasis will be given to his theory of extension, commonly referred to as Whitehead's epochal theory of time, leading to an inexpressibility of the genetic division in the process of becoming of an actual entity within the world (in the sense of its becoming concrete) as opposed to the 'phase' of its having become concrete (coordinate division).

These principles form part of the categorial scheme of Whitehead's philosophy of organism which branches into four distinct categories:

(I) The Category of the Ultimate, (II) The Categories of Existence, (III) The Categories of Explanation, and (IV) The Categoreal Obligations. One should keep in mind that the guiding motivation behind the Whiteheadian categorial scheme is that philosophy should be explanatory of abstraction and not of concreteness. As Whitehead himself put it, "Each fact is more than its forms, and each form 'participates' throughout the world of facts [...] but the individual fact is a creature, and creativity is the ultimate behind all forms, inexplicable by forms, and conditioned by its creatures" (Whitehead (1978), p. 20).

From the Category of the Ultimate, I rely on the notion of creativity, akin in its fundamentality to the Aristotelian category of 'primary substance', which is the ultimate principle by which the 'many' conceived of as the universe taken in disjunction, become each time an actual occasion, thereby constituting the universe taken in conjunction; in a sense, this is the underlying principle abridging plurality to unity.

From the eight Categories of Existence and the Categories of Explanation, I mostly rely on the following:

(i) The actual entities (also termed actual occasions), which are the last irreducible constituent 'things' of which the world is constituted and which are associated with the primary notion of process (or creativity) inasmuch as the latter is the becoming of actual entities.

(ii) The prehensions, or concrete facts of relatedness, which are thought of by Whitehead as being a generalization of Descartes' 'mental cogitations' and Locke's 'ideas' and are associated with a fundamental analysis of an actual entity into its most concrete elements. Prehensions are defined as relational properties associated with a process of becoming (concrescence) and point to a subjective factor, namely the actual entity in which the prehension in question is a concrete element. This kind of analysis discloses the actual entity to be a concrescence of prehensions originating in the process of its becoming. Analysis in terms of prehensions is termed in the Whiteheadian scheme 'division' and is subsequently analyzed into the complementary notions of genetic and coordinate division. Every prehension consists of three factors: (a) the 'subject' that prehends, that is, the actual entity in which that prehension is a concrete element; (b) the 'datum' which is prehended; (c) the 'subjective form' which is the mode by which the subject prehends that datum.

(iii) The nexūs (plural of nexus), which are sets of actual entities in the unity of the relatedness constituted by their prehensions of each other, that is, constituted in the objectifications of each other.

(iv) The eternal objects or pure potentialities 'applied' for the specific determination of facts, which are thought of as pure potentialities realized in the becoming of an actual entity and contributing to its definiteness. It should be noted here that prehensions of actual entities are termed 'physical prehensions', whereas prehensions of eternal objects are termed 'conceptual prehensions'.

I also retain the fundamental notion of concrescence (from the Category of the Ultimate) which may be associated with the process of becoming of an actual entity in which the potential unity of many entities (actual and non-actual) in disjunctive diversity acquires the real unity of one actual entity in its having become. Here lies a metaphysical foundation of Whitehead's categorial scheme in that "the potentiality for being an element in a real concrescence of many entities into one actuality is the one general metaphysical character attaching to all entities, actual and non-actual" (Whitehead (1978), p. 22). In Whitehead's ontological notion of the world of actuality, the Category of the Ultimate is the most fundamental inasmuch as it is based on the notion of process which is meant as an 'advancing progress' (or concrescence) of actual entities by which they acquire their real unity from a plurality of potentia in disjunctive diversity. In this categorial scheme the mode in which an actual entity becomes constitutes what that actual entity is; its 'being' is constituted by its 'becoming'. This conditions Whitehead's

Ontological Principle, termed also the 'principle of efficient and final causation', to the extent that the process of becoming has its reason either in the character of some actual entity in the actual world of that concrescence or in the character of the subject which is in the process of concrescence (Whitehead (1978), p. 24).

Dealing with the notion of the extensive continuum, Whitehead regarded extension in abstraction, defined as a relational scheme grounding the possibility of integrating a plurality of objects within the real unity of experience, as a given 'substratum' susceptible of contemporary actualisations of multiplicities of definite actual or non-actual entities. In that sense it is divisible but not divided, and through its real division by each occurrence of actual entities the notions associated with the epochal theory of time, and also that of the spatialisation of corresponding actual entities, come into play. Actual entities in the sense of real objectifications are evident presentations (cf. the Husserlian Gegenwärtigungen) to the experience of a prehending subject, in which case "they are only directly relevant to the subject in their character of arising from a datum which is an extensive continuum. They do, in fact, atomize this continuum." (Whitehead (1978), p. 62). The extensive continuum is, in this regard, a unique relational complex in which all potential objectifications find their actualisations and in which there are always actual entities beyond actual entities, since non-entities necessarily imply absence of relations (prehensions). Whitehead considered this continuum in its proper generality as independent of any historicity and also as not implying any shapes, dimensions or measurability, which are thought of "as additional determinations of real potentiality arising from our cosmic epoch." (ibid. p. 66).

It seems justifiable, at this point, to call attention to a parallel between the impredicative notion of process at work in the axiomatical foundation of non-standard theory, as such and in its reformulation as a supportive metatheory of quantum mechanics, and the content of Whitehead's notion of process in Process and Reality. In both approaches we can clearly see an underlying subjective (and non-objectifiable) factor in shaping respectively standard mathematical objects and concrete actualities in objective reality. It may be helpful here to cite A.N. Whitehead's view of the interrelation of coordinate versus genetic analysis in describing the passage from real potentiality to actuality.

"Physical time makes its appearance in the 'coordinate' analysis of the 'satisfaction'. The actual entity is the enjoyment of a certain quantum of physical time. But the genetic process is not the temporal succession: such a view is exactly what is denied by the epochal theory of time. Each phase in the genetic process presupposes the entire quantum, and so does each feeling in each phase. The subjective unity dominating the process forbids the division of that extensive continuum which originates with the primary phase of the subjective aim. The problem dominating the concrescence is the actualization of the quantum in solido.[...] There is a spatial element in the quantum as well as a temporal element. Thus the quantum is an extensive region. This region is the determinate basis which the concrescence presupposes.[... ] The concrescence presupposes its basic region, and not the region its concrescence. Thus the subjective unity of the concrescence is irrelevant to the divisibility of the region. In dividing the region we are ignoring the subjective unity which is inconsistent with such division."(Whitehead (1978), pp. 283-284).

Further, the subjective form of the coordinate division is associated with the emergence of conceptual feelings which are related to the totality of the region (of an actual entity) and are not restricted to the divided subregions, these remaining merely potential coordinate subdivisions; this is equivalent to saying that conceptual feelings are related to the actual entity in its entireness and not to its 'coordinate subdivisions' (Whitehead (1978), pp. 286-287). We should point here to the fact that A.N. Whitehead, in evident divergence from the Husserlian subjectivist approach, is led to an assumption of an extensive connection serving as the foundational ground for consecutive actualisations, those running e.g. from an antecedent actual entity A through to a next actual entity B. Thus, a fundamental scheme of extensive connection is assumed to articulate on a uniform plan: 1) the general conditions corresponding to the bonds that unite the atomic actualisations in a unique nexus; 2) the general conditions corresponding to the bonds that unite the infinite number of coordinate subdivisions of the satisfaction of an actual entity (Whitehead (1978), p. 286). In short, common extensiveness provides for the possibility to treat an atomic actuality as if it were a multiplicity of coordinate actualities and, in reverse, to treat a nexus of many actualities as if it were one actuality. There are no meaningful physical relations outside the extensiveness scheme, in the sense that any actual occasion in the physical world cannot but be a correlate of a concrescence within this extensive connection scheme. Taken in a restricted sense, common extensiveness may be linked with the classical ontological question of whole and parts inasmuch as it may be taken as equivalent to the notion of an extensive whole and extensive parts.

On this account, even in adopting T. de Laguna's more general notion of extensive connection (ibid., p. 287), a major deficiency of the corresponding Whiteheadian approach, which initially led to a confusing notion of 'point' defined in terms of a theory of durations, may be found in its suspension of the question of a subjectivity that underlies extensive connection, the latter being merely thought of as an objectified state of things. In this respect, Whitehead admitted in his earlier works, The Principles of Natural Knowledge, Whitehead (2007a), and The Concept of Nature, Whitehead (2007b), a certain inaptitude of the extensive abstraction method to define a 'point' without entering a theory of duration, whereas his ambiguous re-evaluation of the notion of point later in Process and Reality does not seem to clarify things much (Whitehead (1978), pp. 297-301). He also seems to lapse into circularity with respect to the problem of time, as he was further driven to admit that space and time are aspects of nature that presuppose the extensiveness scheme, whereas extensiveness as such cannot at all determine by itself the special processes that relate to physical time and space (Whitehead (1978), pp. 68-69).

Of course, Whitehead admitted, in the Categories of Explanation, the autonomous internal real constitution of an actual entity within the process of creative advance. In such an interpretation, the actual entity is the 'subject' of its own 'immediacy', in the sense of the completion of a process of transformation from a decoherence (state) within concrescence to a unique coherence (state) upon satisfaction. However, there is a certain deficiency of the descriptive means to account for the constitutional capacities of an actual entity in the process of its own self-creation, which may seem all the more evident in taking into account the constitution of time as an objective process within the actual world and, even deeper, taken as a reflexion of an entity's ever in-act self-constituting subjectivity. In response, A.N. Whitehead applied certain principles of his categorial scheme (e.g. the Ontological Principle, the Relativity Principle, the notions of process and prehension) to account for this, principles which to one or the other extent may ultimately be taken to imply a constituting subjectivity associated with concrescent processes within the actual world.

It should be noted that in a Whiteheadian sense the term extensiveness refers to something more fundamental than epistemic spatio-temporality and can be thought of as the general scheme of relations that permit a plurality of objects to 'fuse' into the real unity of a unique experience. However, the act of becoming, though it may concern something having a temporal extension, is nonetheless not itself extensive, in the sense that it is not divisible into anterior and posterior acts of becoming corresponding to the extensive divisibility of what has become (Whitehead (1978), pp. 67, 69).

In conclusion, Whitehead's suspension of the role of constituting subjectivity led him in the first place to a reduction of the relational extensiveness to the classical question of extensive whole and extensive parts, whereas his eventual attempt to achieve a satisfactory treatment of the resulting circularity, by defining a spatial point through the notion of an abstractive ensemble, ran up against the problem of an infinitely regressing sequence of connecting regions (Whitehead (1978), pp. 297-300). In this respect, I comparatively refer to Husserl's general notion of intentionality and its a priori directedness towards an (intentional) object, by which we can reduce the intentional perception of a 'thingness'-individual to the abstract form of an empty-something (Leersubstrat) without any material content, even without a temporal, analytically describable one (Husserl (1977), Husserl (1974), resp. pp. 33-34 & p. 211). Moreover, on a constitutional level associated with the phenomenology of temporal consciousness, Husserl reduced the notion of a spatiotemporal point to that of a specious present conceived of as a non-point-like temporal unit within the immanence of consciousness in the a priori connection: protention - original impression - retention (Husserl (1966), pp. 76-83).

Generally, in a formal-mathematical context, points in the sense of irreducible individuals of standard mathematical theories are associated with zero-level elements within a general cumulative structure (mainly by means of the Foundation Axiom of ZFC Theory), whereas in the version of non-standard theories by ultrapower construction they are associated with a definition of infinitesimals of various orders in which the infinitesimals of a given order appear to be atoms without inner structure to the immediately higher order until we unravel their own structure in a kind of Russian doll game and reveal a class of elements of a lower order playing provisionally the part of indecomposable atoms-points.

1 In nonstandard analysis this leads to a view of points-elements, e.g. of the nonstandard extension R* of the set of real numbers R, as having an inner structure to the extent that they are formally defined as equivalence classes of infinite sequences modulo an ultrafilter F over the set of natural numbers. In this context, the standard real numbers of R are represented in R* as the equivalence classes of constant sequences.
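
As a minimal formal sketch of this ultrapower construction (standard textbook notation, not drawn from any of the works cited here), one may write, for a free ultrafilter F on the set of natural numbers:

\[ {}^{*}\mathbb{R} \;=\; \mathbb{R}^{\mathbb{N}}/\!\sim_{\mathcal{F}}, \qquad (a_n) \sim_{\mathcal{F}} (b_n) \;\Longleftrightarrow\; \{\, n \in \mathbb{N} : a_n = b_n \,\} \in \mathcal{F}, \]

with each standard real r embedded as the class of the constant sequence (r, r, r, ...). The class of (1, 1/2, 1/3, ...), for instance, is then a positive infinitesimal, since for every standard ε > 0 the set {n : 1/n < ε} is cofinite and hence belongs to F.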


2 Some prompts from questions of quantum measurement

As already stated in section 1, there have been various approaches as to the relevance of Whitehead's cosmological scheme to quantum mechanics in general, either in the so-called 'old' version associated with the early work of Planck and Einstein and applied to Bohr's 1913 atomic model, or in the 'new' version associated with the work of Heisenberg, Schrödinger, Bohr et al., commonly referred to as the Copenhagen Interpretation. Here are some correlated features of quantum theory taken in its 'historical entirety', as referred to in Epperson (2004) (pp. 129-132), and of the Whiteheadian philosophy of organism, beyond the most common correlation which is that of the quantum state evolution in the former and the concept of concrescence in the latter. These are:

(i) The presupposition of a world of existing, mutually interrelated facts. This presupposition grounds the logical necessity of nonlocal correlations of physical objects taken as serial historical routes of quantum actual occasions such as those encountered in EPR-type experiments; it grounds, as well, the possibility of an actual infinity in presentational immediacy in forming in abstraction infinite extensions of finitistic in scope predicate formulas, as we will see in the next section.

(ii) The inclusion of some of these facts in the state specification or in the act of measurement and the exclusion of the rest of the facts with their potentia. The exclusions relate to a process of negative selection productive of the decoherence effect.

(iii) The evolution of a system of a multiplicity of facts to the unity of a novel fact.

(iv) The requirement that this evolution proceeds relative to a particular fact, belonging to a subsystem of facts, referred to in quantum mechanics respectively as the 'indexical eventuality' and the 'measuring apparatus'; Whitehead's equivalent term is the prehending subject as given in his Ontological Principle and the Category of Subjective Unity.

(v) The measurement or state specification that entails the actualization (or concrescence) of one novel fact/actual entity from a multiplicity of valuated potential facts/entities which themselves arise from antecedent facts (data); in the quantum mechanical description this non-unitary evolution terminates in a matrix of probability valuations, anticipative of a final unitary reduction to a single actuality/quantum state. Ultimately then, concrescence/state evolution is a unitary evolution from multiplicities of potentia of actualities to a unique actuality. In a yet alternative quantum mechanical description, there is a vector projection of the actual, evolving multiplicity of facts onto a vector (or subspace) representing a potential 'formally integrated' eigenstate. The Whiteheadian analog of the vector projection onto a potential integration is the ingression, where a potential formal (in the sense of applying a form to the facts) integration arises from the ingression of a specific 'potentiality of definiteness' via a 'conceptual prehension' of that specific potentiality (the term 'potentiality of definiteness' is used as equivalent to that of eternal object, see p. 211). Though Whiteheadian 'ingression' and the quantum mechanical vector 'projection' are thought of as conceptually equivalent, a certain divergence is pointed out as to the 'primacy' of each one of the two in the process of actualization (Epperson (2004), p. 131).

Further, there are two characteristics shared by both the quantum mechanical and Whiteheadian notions of potentia. First, there is a sense of pure potentia, meaning that an eternal object, in the Whiteheadian approach, 'is a pure potential in the universe' which, conceptually felt, is itself neutral as to the fact of its physical ingression in any particular actual entity of the temporal world. In quantum mechanics, this pure potentiality is reflected in that the state vector |Ψ⟩ can be expressed as the infinite sum of vectors belonging to an infinite number of subspaces of infinite dimension, representing an infinite number of potential states or 'potentialities of definiteness' referent to no specific actual occasions and potentially referent to all. Second, as quantum mechanical projections are 'inherited' from the facts constituting the initial state of the system, similarly, in the Whiteheadian scheme antecedent facts, when prehended, are typically 'objectified' by one of their own historical 'potential forms of definiteness' (Epperson (2004), pp. 131-132).
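
The 'pure potentiality' just described can be rendered in standard textbook notation (a schematic illustration for an observable with discrete spectrum, not a formula taken from Epperson (2004)):

\[ |\Psi\rangle \;=\; \sum_{i} c_i\,|\varphi_i\rangle, \qquad c_i = \langle \varphi_i | \Psi \rangle, \qquad \sum_i |c_i|^2 = 1, \]

where the |φ_i⟩ form an orthonormal family of eigenstates of the measured observable and each |c_i|² is the probability valuation attached to the corresponding 'potentiality of definiteness' prior to its actualization in measurement.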

Next, I will point to a possibility of interpretation of the incommensurability condition in quantum measurement in terms of Whitehead's epochal theory of time, something that leaves aside the question of the divisibility of a complex quantum event into atomic component events. As it stands, the adoption of the indeterminacy principle in the Copenhagen interpretation of quantum mechanics is linked with the necessity of a statistical treatment, as one cannot predict the future behavior of an individual quantum entity due to the indivisibility of the relation of the observer and the observed. In an attempt to provide a common interpretational framework for N. Bohr's complementarity and Whitehead's epochal theory of time, Y. Tanaka's approach in Tanaka (2004) can be essentially summed up in that the incommensurability condition

'The event a has the character of individuality'

is a way of disconnecting the individuality of a quantum event from a notion of atomism in the sense of the divisibility of a complex event into atomic component events. Instead, what comes up here is a notion of the 'individuality' of an actual occasion (or entity) in the process of concrescence, insofar as this is associated with the genetic division of the actual occasion in question taken in its entirety, as it is described in Whitehead's epochal theory of time (Tanaka (2004), p. 3). In a certain sense, the character of the 'individuality' of a quantum event a is associated with its indivisibility in relation to any event b, which may be further reduced to Whitehead's metaphysical distinction between the coordinate and genetic division of an actual occasion inasmuch as genetic division entails also the principle of causality within the 'internal' development of an actual occasion; moreover, it is not associated with physical time, as each 'phase' of the genetic process presupposes the entire quantum (actual entity). In contrast with the atomistic view of events implicitly presupposed in classical physics, 'indivisibility' (or 'individuality') of an event in quantum mechanics is associated with an irreducible contingency in the respective context of measurement. This special character of quantum 'individuality' is shown to formally result in the breakdown of Bell's inequality (Tanaka (2004), pp. 14-15). In relation to this sort of quantum 'individuality', I note that in Whitehead's view each instance of concrescence "is itself the novel individual 'thing' (clar. of the author: actual entity) in question. There are not 'the concrescence' and 'the novel thing': when we analyze the novel thing we find nothing but the concrescence." (Whitehead (1978), p. 211).

There is a deeper question here, as Whitehead's genetic division may ultimately be rooted in a kind of 'internal' temporal transcendence in the shaping of a quantum measurement along the triangle quantum object - measuring apparatus - conscious observer, and it will be touched on again in the final conclusions of this paper. As a matter of fact, Heisenberg (in The Physical Principles of the Quantum Theory, Heisenberg (1930)) had notably refrained from talking about the objective reality of the intermediate state of a quantum system between the experimental preparation and measurement in the sense of a transition from the 'possible' to the 'actual', as he believed that the description of the intermediate development between two objectively measured or measurable states does not correspond to a physical reality. This non-objectifiability is echoed in Whitehead's Principle of Relativity according to which every actuality is a potential determinant in the becoming of every new actuality, in a way that the potentiality of being an element of a real concrescence of many entities into a unique actuality 'is the one general metaphysical character attaching to all entities, actual and non-actual' (Whitehead (1978), p. 22). This allegedly metaphysical characteristic within each real concrescence implies the implicit presence of a transcendental factor underlying the process of becoming of a real entity into actuality, which is 'self-annulled' upon actualization (in terms of being) of the real entity in question.

In any case and on a metatheoretical level, the jump of truth values in the process of measurement, which is the formal result of the absence of an isomorphism between Boolean and non-Boolean structures - assuming that a quantum object, considered as an objective existence, is the non-distributive (i.e., non-Boolean) lattice of its properties - forces, for a Boolean observer, the need for the existence of an objective time in which he must 'move' (Grib (1993), p. 2396). I note that the question of formally 'filling-in' the existing gap in the process of quantum measurement is associated with J. von Neumann's projection postulate (or 'the reduction of the wave function' postulate), which assigns to the mathematical translation τ(s(t)) of the physical state s(t) of a quantum quantity Qi upon a first-kind measurement at time t the same eigenvector ψk as to the translation τ(s(ti)) of the state s(ti) of the quantity Qi at a time ti soon after the measurement (dalla Chiara (1977), p. 334).
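
Von Neumann's projection postulate can itself be displayed schematically (a common textbook formulation, assumed here rather than quoted from dalla Chiara (1977)):

\[ |\psi\rangle \;\longmapsto\; \frac{P_k\,|\psi\rangle}{\|\,P_k\,|\psi\rangle\,\|} \qquad \text{upon a first-kind measurement yielding the eigenvalue } \lambda_k, \]

where P_k is the orthogonal projection onto the eigenspace of λ_k; an immediate repetition of the same measurement then returns λ_k with probability 1, which is precisely the 'filling-in' of the measurement gap referred to above.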

As a matter of fact, even if we assume von Neumann's projection postulate or van Fraassen's modal interpretation of quantum mechanics as 'external' metatheoretical conditions in a purely logical way, we cannot be led by any analytic linguistic means to a complete description of the change of states that takes place during the measurement process in the compound system 'quantum system + measuring apparatus'.

3 Nonstandardness within formal-mathematical theories as such

In case we characterize the predicate standard in non-standard mathematical analysis as referring to a notion of 'fixedness' in informal mathematical discourse, then we can see in the following how this formal notion acquires, by its axiomatical underpinning in E. Nelson's Internal Set Theory (Nelson (1986)), a meaning that can be taken under certain assumptions as analogous to the meaning of an actual entity in the Whiteheadian sense. Moreover, it will at the same time become clear to what extent the axiomatical foundation of non-standard numbers is conditioned on the implicit assumption of a standard universe meant as the 'outcome' of 'fixed horizon' processes concerning well-meant mathematical objects. In the present context, by the designation well-meant mathematical objects I characterize those formal objects which may be taken as finitistic outcomes of complete, reproducible, finite-time, discrete mental processes. It is notable that historically the development of the theory of infinitesimal and infinite entities within classical mathematics was always facing questions concerning their objective existence, as they entered as shadowy entities in definite mathematical problems (e.g. the area of curved surfaces, the instantaneous velocity of a moving body, etc.) while, nevertheless, their approximative status in calculations was leading to empirically sound results. As a matter of fact, even in discarding the infinitesimal quantities of the type of the 17th century calculus and adopting the famous Weierstrass ε-δ criterion for the definition of a function (or a sequence) limit, no one could still give an articulate description of the sort of infinitesimal numbers involved, all the more so provide a recursive process to produce, for instance, a number x such that |x| < ε for any (standard) number ε > 0. In what follows, we will see that non-standard theories, even though they qualify as axiomatically consistent theories in generating non-standard entities, cannot define them in terms, for instance, of a recursively enumerable process over sets and functions explicitly defined over natural numbers, but must instead employ principles that are conditioned on the assumption of an impredicative (i.e., one whose definiens cannot be specified except in terms of the definiendum) boundless actual infinity.
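
Schematically, and in standard textbook notation rather than that of any of the cited sources, the contrast at issue is between the Weierstrass-style criterion, which quantifies only over standard ε, and the nonstandard characterization of an infinitesimal:

\[ \lim_{n\to\infty} a_n = L \;\Longleftrightarrow\; \forall \varepsilon>0\ \exists N\ \forall n\geq N\ \ |a_n - L| < \varepsilon, \qquad \text{versus} \qquad x \text{ is infinitesimal} \;\Longleftrightarrow\; \forall^{st}\varepsilon>0\ \ |x| < \varepsilon. \]

The right-hand condition is external (it involves the predicate standard), and the only standard real number satisfying it is 0, which is in line with the remark above that no explicitly definable standard process yields a nonzero infinitesimal.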

A standard application of the (upward) Skolem-Löwenheim Theorem over the class P of Peano structures is that there exist nonstandard models of Peano arithmetic PA for every cardinality κ > ℵ0, models which are not isomorphic to the standard model of arithmetic N = ⟨N, +, ·, s, 0⟩. It is critical to see, though, that this is a result basically reduced to mathematical statements conditioned on the assumption of an impredicative infinity in (mental) presentational immediacy. For one thing, the Skolem-Löwenheim Theorem is not only conditioned on the Model Existence Lemma but also on the compactness theorem, whose well-known formulation is that an arbitrary set of sentences Γ has a model iff each finite subset of Γ has a model.
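
The dependence on the compactness theorem can be made explicit by the usual argument (a routine sketch in standard notation, not tied to any particular presentation). Consider

\[ \Gamma \;=\; \mathrm{PA} \,\cup\, \{\, c > \underline{n} \;:\; n \in \mathbb{N} \,\}, \]

where c is a fresh constant symbol and \underline{n} the numeral of n. Every finite subset of Γ is satisfied in the standard model (interpret c by a sufficiently large natural number), so by compactness Γ has a model; in any such model the interpretation of c exceeds every standard numeral and is therefore a nonstandard element.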

Let us consider, in the context of Henkin theories,

It is notable too in the above that in proving the compactness theorem one has to rely on the Model Existence Lemma (whose classical formulation is that if Γ is a consistent set of sentences, then Γ has a set-theoretical model) and consequently on the assumption of the Axiom of Choice, or alternatively on certain weaker logical forms (e.g. the Boolean prime ideal theorem). The common denominator in all logical variants of the Axiom of Choice is that they are not formally derivable from Zermelo-Fraenkel Set Theory and that on a metatheoretical-cognitive level they are constrained (as is the Axiom of Choice itself) by the possibility of extension of a well-meant process of choice over an indefinite, unbounded infinity. The pivotal role of the Axiom of Choice may be seen in its direct application in the proof of the Model Existence Lemma in van Dalen (2004) (pp. 106-109); as a matter of fact, the implicit, at least, application of this actual infinity axiom or of certain logically equivalent forms is necessary in all known proofs of the Skolem-Löwenheim Theorem. For instance, a major key to the proof of the Skolem-Löwenheim Theorem is the lemma that if a set Γ of sentences in a language L is consistent, then Γ has a model of cardinality at most the cardinality of the language L. In the first place, the construction of a model U of Γ is based on the definition by recursion of a Henkin extension Tω of a theory T which is conservative over T, and then on the application of Zorn's lemma for the construction of a maximal extension Tm of Tω. As is well known, Zorn's lemma is proved to be logically equivalent to the Axiom of Choice.
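
The chain of constructions just mentioned can be summarized as follows (a compressed sketch along the lines of van Dalen (2004), in my own notation):

\[ T \;\subseteq\; T_{\omega} \;\subseteq\; T_{m}, \qquad |\mathfrak{U}| \;=\; \{\text{closed terms of the Henkin language}\}/\!\sim, \qquad t \sim s \;\Longleftrightarrow\; T_{m} \vdash t = s, \]

where Tω is obtained from T by recursively adding Henkin witnesses, Tm is a maximal consistent extension of Tω supplied by Zorn's lemma, and the model U is the term model read off from Tm; it is at the passage from Tω to Tm that a choice principle enters.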

In what follows, I will review the way in which non-standard mathematical theories are built as consistent extensions of standard theories provided with an extra axiomatical construction that essentially 'projects' the universe of standard processes (these meant in a nonrigorous fashion as distinct, finite-time operations carried out within the objective world) over an unbounded, indefinite horizon. Taking into account that in the Whiteheadian approach the ultimate acts of immediate actual experience are actual entities, prehensions and nexūs, and all the rest are (with regard to our experience) derived abstractions, we can see in the following how a particular branch of non-standard analysis, the Internal Set Theory (IST), formalizes, by three extra ad hoc axioms added to the corpus of axioms of Zermelo-Fraenkel theory, the possibility of forming indefinite collections, in the sense of wholes, of formal objects taken in the sense of actual entities. This is done in a way which may be associated on the one hand with a concrescence of prehensions associated with the subjective unity of a performing subject and on the other hand with the actualization of corresponding entities in the coordinate division of 'satisfaction'.

The Internal Set Theory is generally considered, if properly interpreted, as an intensional part of nonstandard analysis along with other nonstandard and non-Cantorian theories such as the Alternative Set Theory (AST) (Vopěnka (1979)), ultrafinitist theories (J. Hjelmslev, S. Lavine, A. S. Yessenin-Volpin) and, more recently, Nonstandard Class Theory (Andreev & Gordon (2001)) and the Theory of Hyperfinite Sets (Andreev & Gordon (2006)).

In a non-Cantorian sense, infinitesimals and infinitely large numbers do not exist in an objective way as in the extensional part of nonstandard mathematics (e.g. Robinson's nonstandard analysis); their meaning is indirectly related to the subjective limitations of an 'observer' performing his 'observations' in a local and non-Cantorian way (roughly meant, not over a pre-established actual infinity) in his witnessed universe.

From a syntactical point of view, E. Nelson introduced into the classical ZFC theory a new unary undefined predicate standard together with three axioms, the Transfer (T), the Idealization (I) and the Standardization (S) principles (Nelson (1986), pp. 3-11). The ad hoc axiomatical equipment of the new predicate standard consists precisely of these three axioms which, in spite of their syntactical role in the theory, induce 'in rem' a nonstandard extension in the domain of 'fixed' objects, where the term 'fixed' (in a broad sense finitistic) can serve as the intuition behind the new predicate standard in informal mathematical discourse.
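
For reference, the three schemes can be displayed in the usual shorthand, with ∀^st read 'for all standard'; this is a compressed paraphrase of Nelson (1986) rather than a quotation (A and B are internal formulas, C an arbitrary formula, and z in (I) ranges over standard finite sets):

\[ (\mathrm{T})\ \ \forall^{st} t_1 \ldots \forall^{st} t_k\,\bigl(\forall^{st} x\, A(x, t_1, \ldots, t_k) \rightarrow \forall x\, A(x, t_1, \ldots, t_k)\bigr) \]
\[ (\mathrm{I})\ \ \forall^{st\,\mathrm{fin}} z\ \exists x\ \forall y \in z\ B(x, y) \;\longleftrightarrow\; \exists x\ \forall^{st} y\ B(x, y) \]
\[ (\mathrm{S})\ \ \forall^{st} x\ \exists^{st} y\ \forall^{st} z\ \bigl(z \in y \leftrightarrow (z \in x \wedge C(z))\bigr) \]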

One of the simple consequences of the Idealization Principle (I) is the very existence of nonstandard elements, in particular the fact that every infinite set contains a nonstandard element, a result de facto invalidating the induction theorem of standard arithmetic. But this result is conditioned on the assumption, encoded in the Idealization Principle, that predicating a property formalized by an existential internal formula A for any standard element x (of an indefinite collection of such elements) is equivalent to predicating the same property for an element x of any standard finite set of such x's. This is associated again, as in the proof of the compactness theorem (p. 223), with the subjectively generated possibility of 'concretizing' an indefinite aggregation of formal objects, conditioned on the existence of an impredicative continuous substratum in presentational immediacy, into a 'tangible' finite ensemble of such objects associated in turn with the concrete intuition of discrete finite-time acts. In a certain sense, both the transfer and idealization principles may be viewed as essentially an axiomatical means of formalizing the passage from the indefiniteness associated with concrescent processes to the 'fixedness' associated with actual entities upon their actualization.
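
As an elementary illustration of how the Idealization Principle yields nonstandard elements (a routine derivation sketched in my own words), take the internal formula B(x, y) := x ∈ N ∧ (y ∈ N → x ≥ y). For every standard finite set z there plainly exists an x ∈ N satisfying B(x, y) for all y ∈ z, so (I) gives

\[ \exists x \in \mathbb{N}\ \ \forall^{st} y \in \mathbb{N}\ \ x \geq y, \]

i.e. a natural number at least as large as every standard natural number; such an x cannot itself be standard, and a fortiori the infinite set N contains a nonstandard element.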

Concerning Robinson's introduction of nonstandard elements by the construction of B-enlargements of standard models in Robinson (1966), or Zakon's introduction of nonstandard numbers by the non-constructive version of the space of equivalence classes of infinite sequences modulo an ultrafilter over the set of natural numbers in Robinson & Zakon (1969), one has to eventually apply some form of the Axiom of Choice or of its logically equivalent Zorn's lemma. In Zakon's version, Zorn's lemma is applied to guarantee the existence of an ultrafilter extending the Fréchet filter of all cofinite subsets of natural numbers, whereas in Robinson's nonstandard analysis the Axiom of Choice and Zorn's lemma are both applied through the compactness theorem in the construction of an appropriate ultraproduct as a model for a certain set K of sentences (Robinson (1966), pp. 13-19).
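
The point at which Zorn's lemma intervenes can be displayed in a single step (a standard construction, in my notation):

\[ \mathcal{F}_0 \;=\; \{\, A \subseteq \mathbb{N} \;:\; \mathbb{N} \setminus A \text{ is finite} \,\} \;\subseteq\; \mathcal{F}, \]

where F is a maximal element, guaranteed by Zorn's lemma, of the inclusion-ordered set of proper filters on N containing the Fréchet (cofinite) filter F0; any such maximal filter is a free ultrafilter, which is exactly what Zakon's construction requires.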

In any case, the application of the Axiom of Choice or of its various logically equivalent versions, by the very assumption of a Cantorian-type infinity in presentational immediacy conditioning this principle (which may be ultimately reducible to a continuous subjectively constituted temporal substratum), attests to the fact that, no matter the formal ways to represent a nonstandard mathematical entity, there is no way to suppress the residuum of a 'midway' non-predicative process in reaching the nonstandard entity in question as a concrete objectivity. It is reasonable to assume that in such a case a nonstandard entity should be taken as the final phase towards the actualization, or 'satisfaction', of a concrescent process in the Whiteheadian sense. A clue from standard mathematics may help to better comprehend the place of the Whiteheadian notion of concrescence in formal mathematical discourse: taking a mathematical entity in its 'entirety', e.g. the open unit interval (0,1), as an actual entity in the sense of a fact associated with an immediate subjective experience, then a divisibility up to exhaustion into its constituent elements may be founded on the genetic division in the process of 'satisfaction' of the actual entity in question. The actualization of this entity is, in fact, tied up with the emergence of prehensions associated with the totality of the corresponding region which are not confinable to any of its subregions, these latter ones meant as already complete actualizations. Therefore the open unit interval may not be conceivable in a Whiteheadian sense except as an actualization of an otherwise impredicative concrescence of prehensions. As is well known, an open real interval (a, b), in a kind of circularity in definition, cannot be defined except in terms of subintervals of the same genus as the original interval (a, b).

4 A Whiteheadian approach to the construction of a nonstandard quantum mechanics

In recent decades there has been pioneering research activity aimed at applying nonstandard mathematical theory as a formal metatheory of quantum mechanics. On this account, it is hoped that nonstandard mathematics will play an important role in the interpretation of a number of results in number theory, in quantum physics and also in their relation (Fesenko (2006), p. 2).

An important methodological tool in applying nonstandard mathematical methods is the hyperfinite-dimensional Hilbert space, introduced in view of the problem of treating the discrete and continuous spectra of observables within a single formalism.

There have been several approaches to this problem in conventional formalism (e.g. via the rigged-Hilbert-space formalism or the partial-inner-product spaces) but with limited success, mainly due to a difficulty in the definition of the scalar product. However, hyperfinite constructions seem to correspond well to the need for a combination of both the discrete and continuous properties of hyperfinite objects; for example, hyperfinite objects are ideally suited to describe the peculiarity of wave-particle behavior in quantum physics through shadow images of nonstandard objects (Fesenko (2006), p. 8). Moreover, several 'exotic' objects of mathematical metatheory, e.g. divergent integrals common in field theories (Dirac's delta function in particular), viewed as hypercomplex unlimited numbers, can well have a nonstandard interpretation. Some works in this orientation are Albeverio et al. (1986), Francis (1981), Friedman (1994) and Yamashita (2002).

In the following, I will take into consideration two major nonstandard approaches to quantum mechanics, those of M.O. Farrukh in Farrukh (1975) and of A. Raab in Raab (2004), which are mainly motivated by the aforementioned problematic, to discuss the conceptual connection between the nonstandard approach in quantum mechanical theory and Whitehead's notions of genetic and coordinate division in his theory of extension. In Farrukh (1975), the nonstandard construction of ultra eigenvectors corresponding to all spectral points of internal self-adjoint operators makes it possible to set up a formalism valid for both the discrete and continuous spectra, whereas in Raab (2004), the introduction of nonstandard hulls in the calculus of nonstandard extensions of self-adjoint operators is motivated by the assumption of the indistinguishability of infinitesimally different states in performing a measurement (Raab (2004), p. 5). In the following, a certain amount of nonstandard quantum theory formalism is deemed necessary, which the reader may wish to skip, proceeding directly to the conclusions on pages 238-239.

Concerning the mathematical formalism in Raab (2004), it is proved that the spectrum of a self-adjoint operator A is well approximated by the eigenvalues of its nonstandard extension B, i.e. given a point λ of the spectrum of the standard operator A, there exists an eigenvalue λ' of B such that λ' ≈ λ; this nonstandard proximity relation ≈ between any two elements x, y is roughly defined as equivalent to the statement that their normed difference || x - y || is infinitesimal. Then, as the nonstandard operator B has a complete set of normed eigenvectors, there exists a vector x belonging to the hyperfinite-dimensional Hilbert space (externally containing the standard Hilbert space H by a certain nonstandard extension) such that Bx = λ'x. However, as stated above, the nonstandard proximity relation between the standard eigenvalues λ and the nonstandard ones λ' is based on the existence of infinitesimals and on their definition by an external syntactical formula as the one below:

x is infinitesimal in case ∀st ε > 0 we have || x || < ε.

Further, the possibility to deduce a well-defined Loeb measure

A remark also to be made regarding nonstandard extensions of standard self-adjoint operators is that the set of Borel functions contains all functions we need to retrieve standard results by the relation g(A)x = °(f(B)*x) = °f(B)x (where *x is the nonstandard copy of x, A is a standard self-adjoint operator and B its nonstandard extension, g(A) a real-valued Borel function and f(B) its nonstandard extension; Raab (2004), p. 15). In this respect, it is noteworthy that Borel functions, to the extent that they refer to σ-algebras generated by the open sets of the respective topology, essentially incorporate all that can be said, on the formal level, relative to the underlying continuous spatiotemporal configuration as a whole, and for that reason they can be taken to 'encode' all that can be standardized in this respect. These open (or respectively closed) sets can be taken as formally representing the possibility of existence of a field of genetic division with regard to a Whiteheadian-type concrescence in the actual world.

Concerning the nonstandard approach in Farrukh (1975), there is, at first, a review of P. Dirac's efforts to extend the principles of quantum measurement for an observable whose corresponding operator has a discrete spectrum to those for an observable whose operator has a continuous spectrum. In those equations, Dirac had to invoke the Dirac δ-function δ(ξ - ξ') for all ξ, ξ' in the continuous spectrum of the corresponding operator, which, as is well known, fails to be a function in the conventional sense of a unique-valued mapping for each element of its domain. One of the remedies to this awkward situation is the introduction by Gelfand of an extension of the standard Hilbert space K, namely the rigged Hilbert space (Φ, K, Φ'), where Φ ⊂ K is a dense subset endowed with a finer topology than that of K and Φ' is the dual space of Φ (the space of continuous linear forms on Φ) equipped with the strong dual topology (Farrukh (1975), p. 178).

Yet, the difficulty with this formal approach lies in the fact that there is no possibility to enlarge the Hilbert space K of physical states to include eigenstates of the continuous spectrum, that is, to include transition eigenstates induced by the measuring process itself. As a matter of fact, the space Φ' fails to be a true enlargement of K in the sense of giving an admissible result for any eigenvalue of a measured observable, because it lacks the definition of a scalar product (Farrukh (1975), pp. 178-179). On this account, M.O. Farrukh's nonstandard approach envisages a nonstandard extension K* of a Hilbert space K possessing, by the Transfer Axiom, all the standard properties of K. The extension K* can moreover give a definite meaning to a well-known property of standard Hilbert spaces, namely that if an eigenvalue λ belongs to the continuous spectrum of a self-adjoint operator A densely defined on K, then for any ε > 0 there exists a vector f ∈ K (corresponding to a state of the system) with || f || = 1 such that || Af - λf || < ε (Farrukh (1975), p. 178). Due to the absence of infinitesimal quantities in standard theory, this property cannot determine a unique eigenvalue λ for a given eigenvector f even in taking an upper bound of the errors || Af - λf ||.
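
In the nonstandard extension, this property is sharpened by taking the bound infinitesimal (a schematic rendering in my notation, anticipating the construction described next):

\[ \| f_{\lambda} \| = 1, \qquad \| A f_{\lambda} - \lambda f_{\lambda} \| < \varepsilon \approx 0, \]

so that such an f_λ, an ultra eigenvector, is 'infinitesimally close' to being an eigenvector of A for the standard spectral value λ, whether λ lies in the discrete or in the continuous part of the spectrum.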

On the contrary, by postulating ε as infinitesimal we can guarantee the existence of at least one ultra eigenvector f (associated with an ultra eigenstate) corresponding to a unique standard value of λ, irrespective of whether λ belongs to the discrete or continuous spectrum of an operator, in a way that no distinction between the discrete and continuous spectra is warranted. Therefore, we may have a sound definition of Dirac's 'exotic' δ-function in terms of st(⟨fλ, fλ'⟩) = δλλ', for any λ, λ' in the standard part of the spectrum of a self-adjoint operator A whose family {fλ} of ultra eigenvectors normalized to unity satisfies: || Afλ - λfλ || ≈ 0 (Farrukh (1975), p. 184). Moreover, in a nonstandard Hilbert space *K, the necessity of definition of a standard probability function, so as to have a proper physical interpretation within nonstandard quantum theory, implies the explicit acceptance of only the standard values of the inner product ⟨f, Eλ(A)f⟩.

To sum up Farrukh's nonstandard construction: a nonstandard formulation of quantum mechanics defines the set of physical states of a quantum system as the quotient space S(*K) = U(*K)/↔*K, where U(*K) is the set of unit vectors of the nonstandard Hilbert space *K and ↔*K is an equivalence relation on U(*K) putting in the same equivalence class all unit vectors f, g that differ by a phase factor e^iφ (φ ∈ *R) up to an infinitesimal, i.e. || e^iφ · f - g || ≈ 0. Accordingly, it is proved that those vectors f, g ∈ *K for which f ↔*K g, i.e. those representing the same physical state and therefore corresponding to the same standard self-adjoint operator, generate the same probability function νf,A = νg,A (Farrukh (1975), p. 191).

It is also important, in view of my scope, to take into account one of the axioms of Farrukh's nonstandard formulation of quantum mechanics, namely that, "The result of any measurement of an observable can only be one of the standard spectral values of the corresponding operator. As a result of the measurement, the physical system finds itself in a state represented by an ultra eigenvector of the operator representing the measured observable, corresponding to the measured spectral value". This axiom together with another axiom stating that:

"If a system makes a transition between the state represented by the vector f i and the state represented by the vector f2, then the transition probability is given by:'

tran prob(f1, f2) = st(|⟨f2, f1⟩|²)

points to a standardized form of the probability of getting the value λ as a result of a measurement on a quantum system (in a state represented by f), subject also to the condition that the system may undergo a transition to one of a countable subclass of states {gi : i ∈ J} (Farrukh (1975), Axioms 3 & 5, pp. 191-192).

It is notable that the very existence of a nonstandard Hilbert space in the nonconstructive version of Farrukh (1975) (akin to Zakon's non-constructive definition of a nonstandard extension of real numbers, p. 227) rests upon the acceptance of the Axiom of Choice by relying on the existence of a free ultrafilter F on the set of natural numbers N. In this way, one can induce an equivalence relation of nonstandard proximity ≈ (not directly associated with the notion of infinitesimals as in Raab (2004)) on the set G of all sequences of natural numbers by:

for a, b ∈ G, a ≈ b iff {n : n ∈ N & an = bn} ∈ F

and thus define a nonstandard structure, based on the Transfer Axiom and the definition of relations holding for almost every n ∈ N, that is, holding on a set of natural numbers belonging to the ultrafilter F.

At this point one might rightfully wonder why there is a need to go that far into the formalism of nonstandard mathematical theory, as such and also as a metatheory of quantum mechanics, to see a possible connection with the core of the Whiteheadian cosmological scheme in general and his theory of extension in particular. My view is that, as long as we consider mathematical operations as abstract forms of subjective acts carried out within the objective domain of spatiotemporality, then we can interpret the formal axiomatical means applied in the construction of nonstandard mathematical structures as simply 'filling-in' a gap in the description 'generated' by the genetic division corresponding to a concrescence within the actual world, which evidently cannot be a description in terms of the coordinate division of an already carried out process. Then, irrespective of whether one is based on the intensional part of nonstandard analysis or on the extensional part, the way to formalize the genetic division as referent to a concrescent immediacy, where the primary fact is the dative phase of the actual occasion in question, is either by applying an actual infinity principle in an abstraction of the real-world infinity given in mental presentational immediacy or by applying standardized norms to the formal-axiomatical description of that concrescent process. (Here the term standardized is used in the informal sense of fixedness, but it nevertheless retains a deep underlying sense of 'finite' expressibility in relation to the fulfilled phase of satisfaction in the becoming of an actual entity.) In the extensional nonstandard approach this is essentially done by applying the Axiom of Choice or its logical equivalents, while in E. Nelson's intensional approach it is carried out mainly by axiomatically representing a concrescent process in abstraction by the standardized form of an internal formula bounded by universal quantification (Transfer Principle) and also by an internal formula whose bound variables are reducible to a standard finite ensemble (Idealization Principle).

Admittedly, it may at first seem to go too far to draw a parallel between the Whiteheadian notions of genetic and coordinate division in the process of becoming of an actual entity and even a vague notion of fixedness in the context of a formal theory. Yet such a connection can be soundly established under the sole assumption that we regard mathematical objects, especially the objects of formal-axiomatical theories, as abstractions of a special kind from perceptual objects, implying a subjectivity of certain constitutive modes. In any case, the role of subjective unity in the completeness of the phases of a process of concrescence was explicitly stated in Whitehead's categoreal scheme, specifically in the Category of Subjective Unity and in that of Freedom and Determination, among his nine Categoreal Obligations. Concerning the Category of Freedom and Determination, he stated that "This category can be condensed into the formula, that in each concrescence whatever is determinable is determined, but that there is always a remainder for the decision of the subject-superject of that concrescence" (Whitehead (1978), pp. 26-28).

At this point, it seems worthwhile to turn to Whitehead's limited reference to infinitesimals in the chapter of Process and Reality dealing with measurement. There, addressing the problem of measurement in a classical sense, Whitehead argued that measurement depends 'upon counting and upon permanence'. For instance, inches can be counted on a metal rod taken as a yard-measure, but only on the condition that the rod is permanent both in its internal relations and with respect to some of its extensive relations to the geometry of the world. In the first place, counting depends on the rod's straightness, and straightness, in turn, on its place in the space-time geometry of the world. Whitehead's answer to those who might argue that measurement reduces to a comparison of infinitesimals, or to an approximation by infinitesimals, is simply negative, for in his view there are no infinitesimals (Whitehead (1978), p. 328). In this light his next statement seems quite interesting, namely that "In mathematics, all phraseology about infinitesimals is merely a disguised statement about a class of finites", inasmuch as it is related to the problem of measurement: e.g., taking the yard-measure as a reference unit, any measurement entails an approximation as to its straightness. This, in turn, may be further reduced to purely subjective factors that guarantee the invariability of the relevant circumstances independently of the exactitude reached by a constant improvement of the physical conditions in the set-up of measurement. In Whitehead's view there is a final dependence upon direct intuitions, in such a way that the relevant circumstances remain unchanged in the sense that there is an appearance of invariability of relevant circumstances which is 'always a perception in the mode of presentational immediacy' (Whitehead (1978), p. 329). Moreover, this presentational mode of perception, which reminds us of the Husserlian notion of Gegenwärtigung (presentification), can in no sense be private, as it would then be valid only for a particular observer, a claim which also indirectly points to the Husserlian notion of intersubjectivity.

In the final account, even though Whitehead explicitly denied the existence of infinitesimals as a theoretical interpretation of the question of approximation in the measurement of magnitudes, he nevertheless alluded to an associated finitistic content and, most importantly, reduced the question of minute approximation, and indirectly that of infinitesimals, to constitutional processes associated with a subjectivity whose presentational perception is intersubjectively the same within the world. Moreover, he seems to go further towards a convergence with the Husserlian notion of a unity-constituting ego by reducing the feelings (or rather quasi-feelings, in his wording) involved in the coordinate division of actual entities 'to subjective forms which are only explicable by categoreal demands arising from the unity of the subject'. Whitehead concluded that the coordinate division of an actual entity produces feelings whose subjective forms are partially eliminated and partially inexplicable (Whitehead (1978), p. 292).

In view of the above, one can reach a two-fold conclusion. First, insofar as the question of infinitesimals, and therefore of nonstandard magnitudes, is linked to a process of constantly closer approximations in measurement, the 'existence' of nonstandard magnitudes is ultimately associated with subjective forms of feelings which may be reduced to the unity of the subject. It seems that a vagueness concerning the notion of the immanent unity of a subject (leading to an atemporal subjectivity in Husserl's phenomenology of temporal consciousness) lies behind Whitehead's claim, namely, that the coordinate division of an actual entity produces feelings whose subjective forms are partially eliminated and partially inexplicable. Second, insofar as the subjective forms of feelings are associated with the unity of the subject, objects/actual occasions must always be given in a perception in the mode of presentational immediacy, and therefore at the stage of 'satisfaction' they should be registered as actual occasions in the character of concrete objects (or states of affairs), which entails their 'fixedness' or, in other words, their 'finitistic' objectification.

5 Conclusion

As it stands, actual entities, to the extent that they are taken as ultimate facts of immediate actual experience and are thus associated with coordinate division, are presented as complete, finitistic objects at the stage of 'satisfaction' and for this reason acquire a 'fixedness' as temporal forms, in contrast with the metaphysical character of the genetic division associated with the potentia of actual occasions as multiplicities of real-world concrescences. In the latter case, we face the persistent question of the impossibility of penetrating into the stage of concrescence as such by any formal linguistic means, including those of a nonstandard theory, which means that the genetic division is exempt, as a process, from any kind of objectification. This may help us better assess, from Whitehead's own viewpoint, his claim that 'in mathematics all phraseology about infinitesimals is merely a disguised statement about a class of finites'.

The whole approach seems to lead to a kind of transcendence, at the stage of a real concrescence, towards the conjunctive unity of a new actual entity out of a disjunctive multiplicity of potentialities, which may be further reduced, in a phenomenologically motivated view, to the transcendental character of an absolute subjectivity conceived of as a constituting factor behind A.N. Whitehead's advancing process, the latter taken as an objectivity. A further discussion could touch on the possibility of laying a common foundation, especially on an immanent transcendental level, between the Whiteheadian philosophy of organism and certain aspects of Husserlian phenomenology.

Received: 10.03.2012

Revised: 07.09.2012

Accepted: 21.12.2012

  • ALBEVERIO, S. et al. Nonstandard Methods in Stochastic Analysis and Mathematical Physics, Orlando: Academic Press, 1986.
  • ANDREEV, P., GORDON, E. "An Axiomatics for Nonstandard Set Theory, Based on Von Neumann-Bernays-Gödel Theory". The Journal of Symbolic Logic, 66, 3, pp. 1321-1341, 2001.
  • _____. "A Theory of Hyperfinite Sets". Annals of Pure and Applied Logic, 143, 1-3, pp. 3-19, 2006.
  • dalla CHIARA, M.L. "Logical Self Reference, Set Theoretical Paradoxes and the Measurement Problem in Quantum Mechanics". Journal of Philosophical Logic, 6, pp. 331-347, 1977.
  • EPPERSON, M. Quantum Mechanics and the Philosophy of A.N. Whitehead. New York: Fordham University Press, 2004.
  • FARRUKH, O.M. "Application of nonstandard analysis to quantum mechanics". Journal of Mathematical Physics, 16, 2, pp. 177-200, 1975.
  • FESENKO, I. "Several nonstandard remarks". AMS/IP Advances in the Mathematical Sciences, AMS Transl. Series 2, 217, pp. 37-50, 2006.
  • FOLSE, H. "The Copenhagen Interpretation of Quantum Theory and Whitehead's Philosophy of Organism". Tulane Studies in Philosophy, 23, 33, 1974.
  • FRANCIS, E.C. "Application of non-standard analysis to relativistic quantum mechanics". Journal of Physics A: Mathematical and General, 14, 10, pp. 2539-2551, 1981.
  • FRIEDMAN, A. "Nonstandard extension of quantum logic and Dirac's bra-ket formalism of quantum mechanics". International Journal of Theoretical Physics, 33, 2, pp. 307-38, 1994.
  • GOLDBLATT, R. Lectures on the Hyperreals: An Introduction to Nonstandard Analysis, New York: Springer-Verlag, 1998.
  • GRIB, A.A. "Quantum Logical Interpretation of Quantum Mechanics: The Role of Time". International Journal of Theoretical Physics, 32, 12, pp. 2389-2400, 1993.
  • HEISENBERG, W. The Physical Principles of the Quantum Theory, Chicago: The University of Chicago Press, 1930.
  • HUSSERL, E. Ideen zu einer reinen Phänomenologie und phänomenologischen Philosophie, Erstes Buch, Hua Band III/I, hrsg. K. Schuhmann, Den Haag: M. Nijhoff, 1977.
  • _____. Formale und Transzendentale Logik, Hua Band XVII, hrsg. P. Janssen, Den Haag: M. Nijhoff, 1974.
  • _____. Zur Phänomenologie des inneren Zeitbewusstseins, Hua Band X, hrsg. R. Boehm, Den Haag: M. Nijhoff, 1966.
  • NELSON, E. Predicative Arithmetic. Mathematical notes, Princeton: Princeton Univ. Press, 1986.
  • RAAB, A. "An approach to nonstandard quantum mechanics". Journal of Mathematical Physics, 45, 12, pp. 4791-4809, 2004.
  • ROBINSON, A. Non-standard Analysis, Amsterdam: North-Holland Pub, 1966.
  • ROBINSON, A. & ZAKON, E. A set-theoretical characterization of enlargements, in: Applications of Model Theory to Algebra, Analysis and Probability, (ed. W.A.J. Luxembourg), pp. 109-122, New York: Holt, Rinehart & Winston, 1969.
  • SHIMONY, A. Quantum Physics and the Philosophy of Whitehead, in: Boston Studies in the Philosophy of Science, (eds. R. Cohen & M. Wartofsky), 2:308, New York: Humanities Press, 1965.
  • TANAKA, Y. The Individuality of a Quantum Event: Whitehead's Epochal Theory of Time and Bohr's Framework of Complementarity in: Physics and Whitehead: Quantum, Process and Experience, (eds. Timothy Eastman & Hank Keeton), pp. 164-179, New York: State University of New York Press, 2004.
  • van DALEN, D. Logic and Structure, Berlin: Springer-Verlag, 2004.
  • VOPENKA, P. Mathematics in the Alternative Set Theory, Teubner-Texte zur Mathematik, Leipzig: Teubner Verlag, 1979.
  • WHITEHEAD, A.N. Process and Reality. An Essay in Cosmology, New York: The Free Press, 1978.
  • _____. An Enquiry Concerning the Principles of Natural Knowledge, (originally published in 1919), New York: Cosimo, Inc, 2007.
  • _____. The Concept of Nature, (originally published in 1920), New York: Cosimo, Inc, 2007.
  • YAMASHITA, H. "Nonstandard methods in quantum field theory I: a hyperfinite formalism of scalar fields". International Journal of Theoretical Physics, 41, pp. 511-527, 2002.
  • 1 In nonstandard analysis this leads to a view of points-elements, e.g. of the nonstandard extension R* of the set of real numbers R, as having an inner structure, to the extent that they are formally defined as equivalence classes of infinite sequences modulo an ultrafilter F over the set of natural numbers. In this context, the standard real numbers of R are represented as the constant equivalence classes of R*.
  • 2 The incommensurability condition is defined as the indivisibility in both directions, ~aDb and ~bDa, of two quantum events a and b.
  • 3 A theory T is called a Henkin theory if for each sentence ∃xφ(x) there is a constant c such that ∃xφ(x) → φ(c) ∈ T; in that case c is called a witness for ∃xφ(x) (van Dalen (2004), p. 104).
  • 4 The classical formulation of the Model Existence Lemma is that if Γ is a consistent set of sentences, then Γ has a set-theoretical model.
  • 5 Roughly speaking, the notion of a hyperfinite set (or space) is an extension of the corresponding standard notion of finiteness in a nonstandard environment. For details on the definition and properties of hyperfinite sets by ultrapower construction see Goldblatt (1998), p. 188.
  • 6 As with the hyperfinite spaces (sets) encountered already, and taking account of the scope of this paper, I prefer to inclusively describe all variations of Loeb measure as referring to a σ-additive probability measure specifically defined for a nonstandard environment; for details, see Albeverio et al. (1986).
  • 7 The nonstandard hull °H of the hyperfinite-dimensional Hilbert space H is defined as the quotient space of the set finH of vectors of finite norm by the equivalence relation ~ which defines two vectors as equivalent iff their difference has infinitesimal norm. This equivalence relation determines the set H0 by H0 = {x ∈ H; ||x|| ≈ 0}, Raab (2004).
  • 8 Here the term standardized is used in the informal sense of fixedness, but it nevertheless retains a deep underlying sense of 'finite' expressibility in relation to the fulfilled phase of satisfaction in the becoming of an actual entity.