# ABSTRACT

In addition to radio navigation, optical navigation has been used successfully in deep-space missions since the launch of the Voyager spacecraft in the 1970s. In the 1990s, NASA’s Deep Space 1 mission successfully tested an autonomous optical navigation system, which allowed a great reduction in mission costs and maximized scientific results. The ASTER mission, the first Brazilian deep space mission, shall count on the support of optical navigation in all its phases. The platform of the probe is the Russian Pilgrim spacecraft, developed by the Russian Space Research Institute for the Finnish-Russian mission to Mars (MetNet). As such, besides the scientific camera on board (which may also be used for navigation purposes), the probe will also carry a navigation camera (NAVCAM). This study formulates a general proposal of optical navigation for the ASTER mission, taking into account the equipment available on board, especially the NAVCAM, along with tracking software suitable for the conduction of optical navigation. An appropriate navigation algorithm is described and successfully applied to simulated and real images (from NASA’s New Horizons mission).

Keywords

# INTRODUCTION

Planetary navigation as performed today includes radio navigation, conventional optical navigation and autonomous optical navigation, the latter encompassing different methods and approaches applied to the multiple phases of a deep space mission. In what follows, a brief introduction to the methods applied to the navigation of spacecraft in deep space is presented.

## Deep Space Navigation of Spacecraft

Deep space missions involve enormous distances (of the order of astronomical units; 1 AU ≈ 150 million km). Navigation is performed mainly using standard radio navigation techniques, which are used for orbit determination and trajectory prediction. To make this possible, a network of deep space antennas, such as ESTRACK of the European Space Agency (ESA), is required as part of the ground segment of the mission. In conjunction with radio navigation, optical navigation is also used for orbit determination and trajectory calculation. Images of solar system bodies (planets, moons, asteroids and others) against the background of stars, taken by the onboard camera during approach phases, are processed and directional vectors are calculated. This information, combined with standard radio navigation data, enables orbit determination and trajectory calculation.

In the conventional approach, the calculations are made on the ground and the corrections and maneuvers to be performed are transmitted to the spacecraft, where they are executed. ESA (2014) shows a series of navigation images of comet 67P/Churyumov-Gerasimenko against the background of stars, as seen by Rosetta’s NAVCAM on June 3, 2014. The information extracted from such images is usually fed into the orbit determination process to confirm the location of the comet and to help refine the trajectory of the spacecraft during the approach.

Autonomous optical navigation occurs when acquired images are processed on board, in real time, for the extraction of information to be used in the autonomous navigation of the spacecraft. In the case of deep space missions, with the huge distances involved, autonomy in navigation is very important for the success of the mission, mainly in the stages where navigation solutions need to be found and implemented quickly. This is the case of missions where there are one or more upcoming encounters (rendezvous). Additionally, as the mission operations segment represents a significant part of the cost of a mission, the space agencies have included autonomy explicitly in the guidelines for deep space missions with the clear purpose of reducing total costs.

The Deep Space Mission ASTER (Macau et al. 2011; Sukhanov et al. 2010) is a Russian-Brazilian cooperation for the investigation of the near-Earth (triple) asteroid 2001 SN263. The probe for the mission is based on the Russian Pilgrim spacecraft, developed by the Russian Space Research Institute for the Finnish-Russian mission to Mars, MetNet. This spacecraft uses solar electric propulsion, and the ASTER mission intends to test electric thrusters developed in Brazil. The scientific instrumentation will collect data for approximately 6 months, and the main scientific objectives include determining the size, mass, volume, gravity field and rotation of the bodies in the triple system, in conjunction with the identification of the composition, morphology and topography of the surface of each body. An investigation of the system dynamics is also planned, in order to obtain evidence of the system formation. According to Brum et al. (2021), the nearest launch opportunities include June 2022 (with insertion into heliocentric orbit in February 2023, arrival in December 2024, and end of investigations in April 2025), and June 2025 (with insertion into heliocentric orbit in February 2026, arrival in September 2027, and end of investigations in January 2028).

The knowledge base about this asteroid dates back to 2001, the year of its discovery (by the Lincoln Near-Earth Asteroid Research, LINEAR). However, only in 2008 was it revealed that the asteroid is in fact a triple system. More recent information about the system can be obtained in the works of Becker et al. (2015) and Winter et al. (2020). The most complete set of updated data on this system is found at the Small-Body Database Browser (NASA 2020). There is also a Wikipedia page dedicated to general information about this triple asteroid system (Wikipedia 2020). Figure 1 illustrates (artistically) the system together with some known data about it.

Figure 1
Artistic illustration of the triple asteroid system 2001-SN263.

As ASTER is a low-budget mission, the use of autonomous navigation is clearly mandatory. In this case, autonomous optical navigation is a good option for supplementary navigation and, as a minimum requirement, for ephemeris estimation. A navigation system of the type required in similar missions has been tested with success on the Deep Space 1 mission (along with the Deep Impact mission; Cangahuala et al. 2012). Since then, many other missions have used autonomous navigation systems, for example the Hayabusa 1 and 2 missions (Tsuda et al. 2013; Uo et al. 2006) and the ESA Rosetta mission (ESA 2014). For this reason, it is appropriate to devote efforts to the development of a kind of autonomy similar to that made available for these missions, starting with a simpler version.

Although the complete project aims at the creation of software to navigate the probe in all phases of the deep space mission (cruise, encounter, approach, proximity and touchdown), the present study presents a proposal of autonomous optical navigation software to be tested in the first phase of the ASTER mission (cruise, when the spacecraft is in the heliocentric transfer orbit).

In general terms, a proposal for similar space missions has already been presented earlier by Brum et al. (2013b). This work advances that proposal, discussing in detail the type of optical navigation adequate for the first phase of the mission, cruise, taking into consideration the instrumentation expected to be available on board and existing techniques that can be used for this purpose. Other mission phases shall be the subject of future works.

# FOUNDATIONS OF CONVENTIONAL OPTICAL NAVIGATION

ESA (2014) shows many examples of the sort of image used for this purpose. Basically, almost all the proposed techniques make use of the angles measured between two solar system bodies (planets, moons, asteroids, bodies with well-known ephemerides) and two or more stars to determine the vehicle position (Scull 1966). An example of this situation is shown in Fig. 2, along with the related geometry. In Fig. 2a, the angle between star 1, S1, and planet A localizes the probe on a cone with origin in A, axis in the direction of S1, and semiangle α1; similarly, the angle between star 2 and planet A localizes the probe on a cone with origin in A, axis in the direction of S2, and semiangle α2. The two cones with origin in A intersect in two lines, one of which contains the spacecraft position. Suppose this straight line is l1, which passes through the spacecraft position and whose direction is given by the versor $\hat{L}_1$, as in Eq. 1.

Figure 2
(a) Conventional optical navigation scheme; (b) Geometry of a specific case of optical navigation with 2 planets and 3 stars in the camera FOV.
$\vec{r} = \vec{R}_1 + \rho\,\hat{L}_1$ (1)

The procedure described is reapplied to planet B. This way, another line passing through B and the spacecraft is determined. The intersection of these two lines determines the vehicle position.

The ambiguity related to the two lines can be resolved using sparse knowledge about the vehicle position or, when available, from the image of a third star, S3, with semiangle α3. The equations for this specific case are presented below.

$\hat{S}_1 \cdot \hat{L}_1 = \cos(\alpha_1)$ (2)
$\hat{S}_2 \cdot \hat{L}_1 = \cos(\alpha_2)$ (3)
$\hat{S}_3 \cdot \hat{L}_1 = \cos(\alpha_3)$ (4)

where $\hat{S}_1$, $\hat{S}_2$ and $\hat{S}_3$ are the sighting directions of the three identified stars, given by the star catalogue.

Solving this system of equations, one determines $\hat{L}_1$, the direction vector of line l1, which is common to the three cones, passes through planet A, and follows toward the spacecraft. Equations 2–4 compose the system of equations to be solved. The process is repeated for planet B, identifying $\hat{L}_2$, the vector connecting planet B to the probe. Thus, $\hat{L}_1$ and $\hat{L}_2$ are determined as the versors of lines l1 and l2, which pass through the spacecraft and planets A and B, respectively, and whose intersection marks the vehicle position. To determine the vehicle position, however, one needs to calculate the distance between the spacecraft and a planet, given by the scalar ρ, as in Eq. 1. Figure 2b shows the geometry of this situation. The heliocentric equatorial inertial system (J2000) is taken for obtaining the ephemerides of the planets, $\vec{R}_1$ and $\vec{R}_2$, and the star directions, $\hat{S}_1$ and $\hat{S}_2$.

Equations 5a and 5b follow from Fig. 2b. The solution of Eq. 5b provides the values of ρ1 and ρ2, the distances from the vehicle to planets A and B, respectively. With the value of ρ1 or ρ2, from Eq. 5a, one can obtain the heliocentric position of the vehicle. Equation 5b is solved by forming a system of three linear equations in which ρ1 and ρ2 are the unknowns. The solution of this system requires that the best pair (ρ1, ρ2) satisfying the system be identified, so a suitable method must be used (in this work, the least squares method). Once those parameters are calculated, Eq. 5a offers the vehicle position, as determined with the use of this method, together with an estimate of the error made.

$\vec{r} = \vec{R}_1 + \rho_1\hat{L}_1 = \vec{R}_2 + \rho_2\hat{L}_2$ (5a)
$\rho_1\hat{L}_1 - \rho_2\hat{L}_2 = \vec{R}_2 - \vec{R}_1$ (5b)
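The two steps above (Eqs. 2–4 for the versor, Eq. 5b for the distances) can be sketched numerically as follows. This is a minimal illustration in Python/NumPy, with function and variable names of our own choosing; it is not the mission software:

```python
import numpy as np

def line_direction(s1, s2, s3, a1, a2, a3):
    """Solve Eqs. 2-4 for the versor L1: each star versor s_i satisfies
    s_i . L1 = cos(alpha_i), a 3x3 linear system."""
    S = np.vstack([s1, s2, s3])            # rows are the star versors
    c = np.cos([a1, a2, a3])
    L = np.linalg.solve(S, c)
    return L / np.linalg.norm(L)           # renormalize against numerical error

def triangulate(R1, R2, L1, L2):
    """Least-squares solution of Eq. 5b for (rho1, rho2), then the
    vehicle position from Eq. 5a."""
    A = np.column_stack([L1, -L2])         # 3 equations, 2 unknowns
    b = R2 - R1
    (rho1, rho2), *_ = np.linalg.lstsq(A, b, rcond=None)
    # Eq. 5a gives two estimates of the position; average them.
    r = 0.5 * ((R1 + rho1 * L1) + (R2 + rho2 * L2))
    return r, rho1, rho2
```

The difference between the two position estimates averaged in the last line gives a rough indication of the error made, in the spirit of the text above.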

The algorithm created for this research uses the heliocentric (or geocentric) equatorial inertial position (J2000) of the targets as input data, and the accuracy of its results depends directly on the accuracy of this information, which, therefore, should be available on board. Other dependencies exist with regard to the image processing and the extraction of information.

To determine the trajectory, the velocity also needs to be calculated. Its direct measurement, with the use of stellar spectral deviation or of relativistic methods, does not result in sufficiently precise values (± 100 m·s–1) due to fluctuations and to the nonuniformity of star brightness. The most suitable method seems to be the calculation of the velocity from two or more positions computed over time. The combination with the navigation data is used to improve the state estimate. A review of the foundations of orbit determination with the use of optical data is found in Bhaskaran et al. (1996). An application example of this methodology is presented in the next sections, with a view to the ASTER mission. As this work involves a NAVCAM (to be modeled), an explanation of its operation and parameters is offered.
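The two-position velocity calculation mentioned above amounts to a finite difference; a minimal sketch (our own naming, not the mission software) is:

```python
import numpy as np

def velocity_from_positions(r_prev, r_curr, dt):
    """Finite-difference velocity (km/s) from two position fixes (km)
    separated by dt seconds. A navigation filter would normally combine
    several such estimates to reduce the effect of position errors."""
    return (np.asarray(r_curr, dtype=float) - np.asarray(r_prev, dtype=float)) / dt
```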

# PROPOSAL OF OPTICAL NAVIGATION FOR THE ASTER MISSION – PHASE 1: CRUISE

Currently, on-board cameras dedicated primarily to navigation are used. For example, the NAVCAM of the Rosetta spacecraft was used during the mission to perform the maneuvers required for the flybys of asteroids Steins and Lutetia and, additionally, for the tracking of comet 67P/Churyumov-Gerasimenko, the target of the mission (ESA 2014).

According to Macau et al. (2011), who present a list of the devices expected to compose the future probe (based on the Russian spacecraft Pilgrim), the ASTER spacecraft shall count, among others, on a NAVCAM (narrow field of view [FOV]), a pair of star sensors and a solar sensor, all fixed. Additionally, as part of the mission instrumentation, a scientific camera (which could eventually be used for navigation purposes) and a laser altimeter are expected to be on board. Considering the availability of these devices, a proposal of an optical navigation strategy is presented.

Initially, the proposal is linked to the NAVCAM, whose imaging acquisition procedure is modeled aiming at the extraction of information useful for vehicle navigation. It is important to remark that it is not uncommon for this camera to have a dual function. In the case of the Rosetta mission, the device designed by Officine Galileo played a dual role, NAVCAM and star tracker (Suetta et al. 1999).

The following paragraphs discuss some necessary developments for the testing of an optical navigation system during the ASTER mission. The modeling and simulations that were carried out with a view to this possibility are also presented.

## Identification/definition of the software to be used in the NAVCAM operation

Because of the similarity between their operations, the software commonly used in star sensors is considered as a starting point for the developments to come. The algorithms involved are listed:

• Centroid calculation, related corrections, obtaining of the corrected centroids, and calculation of their respective directions (in the body frame).

• Star identification (Star ID software). The image processing provides a list of stars identified in the obtained and processed image, along with their respective directions (versors in the body frame). In addition to these, other software for specific use in optical navigation is needed and will be targeted for development.

• Target ID software – SIRA (capable of recognizing targets: planets, moons, asteroids, comets). This software is needed for target identification and tracking. Its operation will provide useful information for (optical) navigation in the cruise phase. The targets to be sought in the images are part of a list of targets previously selected for the specific mission. The simplest case involves only the recognition of the triple asteroid, the mission target. However, the testing of the navigation software, suggested to be conducted during the cruise phase, will need a list with a larger number of targets. In each case, knowledge of the vehicle attitude, as well as of the vehicle and target positions (in the inertial frame), obtained from the available measurements and with the use of dynamic modeling, will be essential for the correct pointing and taking of images by the NAVCAM, which should occur in time intervals prescheduled for this activity. The acquired image, processed by the Star ID software, shall also be processed by the Target ID software (SIRA), which should identify the specific target (or targets) sought in it. The starting point for the development of the SIRA software is the Star ID software usually present in autonomous star trackers. The experience with the Autonomous Star Tracker (AST), developed by the Brazilian Institute of Space Research (INPE), shall be useful (Brum et al. 2013a). A possibility to follow here is the creation of an additional operating mode in the software developed for the AST. In this new mode, the target will be identified and tracked against the background of stars.
In this case, after the image is processed by both Star ID and SIRA, that is, once the stars and the target in the image have been identified, a list will be created with their specific directions (directional vectors in the body frame) within the image under analysis. This information will be used as input by the optical navigation algorithm.

• Orbit estimator with optical data algorithm (EODA). This algorithm receives the information described in the previous paragraph and uses it to estimate the inertial position of the vehicle along the current path. To do so, it must perform the calculations described in section 2 (see Fig. 3), studied and described earlier by Bhaskaran et al. (1996). The optical navigation software described above will have its use compromised as the spacecraft approaches the target. At a distance of about 50 km from the target asteroid Alpha, for example, the aperture angle of sight equals approximately 3.2°. In this case, the centroid of the image of the target can still be calculated, but errors then interfere with the quality of the obtained navigation estimate. When the encounter occurs, D ≈ 50 km, the orbital maneuvers that take the vehicle to this stage of the mission are applied, and the cruise phase of the mission ends. With it, the first phase of optical navigation discussed in this paper also ends. After that, phase 2 of optical navigation begins, in which the primary function will be target tracking. At this stage, the laser altimeter is already functional and available for use in the measurement of spacecraft-target distances.
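The aperture angle quoted above can be checked with a one-line angular-size computation. A mean diameter of about 2.8 km for Alpha is assumed here purely for illustration (consistent with published radar models of the system), not as a mission value:

```python
import math

def apparent_angular_diameter_deg(diameter_km, distance_km):
    """Full apparent angular diameter (degrees) of a body of the given
    diameter seen from the given distance."""
    return math.degrees(2.0 * math.atan(0.5 * diameter_km / distance_km))

# An ~2.8 km body (assumed size) seen from 50 km subtends roughly 3.2 degrees.
```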

Figure 3
Coordinate systems associated with the electronic array. (a) Non-normalized system of coordinates of the electronic array; (b) Normalized electronic array reference system. Versor û indicates the sensor pointing direction.

## NAVCAM Modeling

This geometric modeling involves camera parameters (number of pixels in the electronic array, FOV, focal length of the optical system, and others). This modeling also involves the deviations inherent to the imaging process, and corrections for these deviations are made (as occurs in star sensors). An example of imaging instrument modeling can be seen in Brum and Pilchowski (1999). Because of the similarity between the operations of a NAVCAM and of a star tracker, much of the experience gained in the development of software for the latter will be useful in this study (Brum et al. 2013a). The modeling of an imaging system for a star tracker, along with the identification of the main sources of errors and proposals of corrections for them, is discussed in Fialho (2003), Albuquerque (2005) and Albuquerque and Fialho (2005). The study by Albuquerque and Fialho (2005) on the existing sources of errors in the autonomous star tracker in development at the Brazilian Institute of Space Research (INPE) is written in Portuguese and constitutes an internal publication of the Aerospace Electronics Division (DEA-EO-005/05).

As a result of this modeling, starting from the centroid coordinates (xc, yc) of the objects present in the acquired image, the pointing directions (versors) of each of these objects, in the camera coordinate system (body frame), are obtained. The results of this modeling are presented in the following paragraphs.

## Sensor frame

Regarding the electronic sensor array (Fig. 3), the uvw system, fixed on the sensor body, has its u-axis perpendicular to the array plane, through its center, with positive direction pointing out of the camera; the v and w axes define a plane parallel to the electronic array (w-axis up).

## Coordinate systems associated with the electronic array

### Non-normalized coordinate system

In this system (Fig. 3a), the coordinates are represented in terms of pixels down and to the right, relative to the upper-left corner of the image. With the image width in pixels denoted imgw and the height in pixels imgh, xe varies from 0 to imgw and ye varies from 0 to imgh. The coordinates of a generic point in this coordinate system are given by the ordered pair (xe, ye). As the sensor works with subpixel accuracy (because of the algorithm that calculates centroids), both xe and ye can assume fractional values. In this coordinate system, the center of the upper-left pixel is (0.5, 0.5), and the center of the lower-right pixel is (imgw − 0.5, imgh − 0.5). This coordinate system is used in the early stages of image processing.

### Normalized electronic array frame

Let Ŝ be the versor that indicates the position of a star on the celestial sphere. Imagining a plane perpendicular to the axis of the sensor at a unit distance from the origin of the uvw frame, the extension of the versor Ŝ crosses this perpendicular plane at the position (Sx΄, Sy΄), as illustrated in Fig. 3b. In this reference frame, the star coordinates, represented by versor Ŝ, are given by the ordered pair (Sx΄, Sy΄). This coordinate system is used in an intermediate step between the image processing algorithm and the Star ID algorithm, being used to convert star coordinates from the non-normalized electronic array coordinate system to the body-fixed vectorial Cartesian frame (uvw frame) (Fig. 3b). In this way, according to Fialho (2003), the target object identified in the image has (Eq. 6):

• Its coordinates in the sensor array frame calculated as the centroid coordinates (xc, yc);

• From (xc, yc), the coordinates of the identified object in the normalized electronic array reference frame are calculated as Ŝtgt = (Sx΄, Sy΄). These values depend on the height and width of the camera FOV (fov_h and fov_w, in degrees), and on the height and width of the electronic array, in terms of its number of pixels, imgw and imgh;

$S_x' = \frac{0.5\,img_w - x_c}{img_w/2}\,S'_{x\,max}$ (6a)
$S_y' = \frac{0.5\,img_h - y_c}{img_h/2}\,S'_{y\,max}$ (6b)

where $S'_{x\,max} = \tan(fov\_w/2)$ and $S'_{y\,max} = \tan(fov\_h/2)$.

• Here, Ŝ = (Sx΄, Sy΄) is used for the calculation of star directions in the sensor body frame, Ŝuvw = (Su, Sv, Sw), according to Fialho (2003) (Eq. 7);

$S_u = \frac{1}{\sqrt{1 + S_x'^2 + S_y'^2}}; \quad S_v = S_x' \cdot S_u; \quad S_w = S_y' \cdot S_u$ (7)

• This way, one calculates Ŝuvw for the target, and also for two or more identified stars in the image, ŜAuvw being the target direction vector, and Ŝ1uvw, Ŝ2uvw the direction vectors of two stars in the sensor frame.

• With vectors ŜAuvw, Ŝ1uvw and Ŝ2uvw, the dot product is used to calculate the cosines of the angles α1 and α2 between these vectors, which will be used by the navigation algorithm to determine the spacecraft ephemeris at the time of image capture, as previously described.
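The conversion chain of Eqs. 6 and 7, followed by the dot-product angle calculation, can be sketched as below. This is a plain reading of the equations in Python/NumPy (our own naming), not the flight code:

```python
import numpy as np

def centroid_to_versor(xc, yc, img_w, img_h, fov_w_deg, fov_h_deg):
    """Eqs. 6-7: from centroid coordinates (pixels, non-normalized frame)
    to the pointing versor (Su, Sv, Sw) in the uvw sensor frame."""
    sx_max = np.tan(np.radians(fov_w_deg) / 2.0)
    sy_max = np.tan(np.radians(fov_h_deg) / 2.0)
    sx = (0.5 * img_w - xc) / (img_w / 2.0) * sx_max   # Eq. 6a
    sy = (0.5 * img_h - yc) / (img_h / 2.0) * sy_max   # Eq. 6b
    su = 1.0 / np.sqrt(1.0 + sx**2 + sy**2)            # Eq. 7 (unit norm)
    return np.array([su, sx * su, sy * su])

def angle_between(v1, v2):
    """Angle (rad) between two versors, via the dot product."""
    return np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0))
```

A centroid at the center of the array maps to the boresight direction (1, 0, 0) in the uvw frame, which provides a quick sanity check of the sign conventions.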

# APPLICATION OF THE OPTICAL NAVIGATION ALGORITHMS TO A SIMULATED NAVIGATION IMAGE

Figure 4 is a simulated image generated by the Solar System Simulator (an online tool created by NASA/JPL – Jet Propulsion Laboratory), with a resolution of 800 × 800 pixels and a FOV of 5°. It is possible to observe Jupiter and its moons Io, Europa, Ganymede and Callisto on the 5th of May 2015, at midnight (00:00 UTC). It represents how the sky would be viewed from Mars at that epoch. The goal was to calculate the position vector of Mars using the image coordinates (image frame) of the targets (the moons of Jupiter) and those of the stars identified in the image, using the methods described in the theoretical foundation. The expected value for the position of Mars at the given date is given by vector $\vec{R}_M$, collected from the Horizons online tool (JPL/NASA) in Geocentric Equatorial Inertial J2000 coordinates (Eq. 8).

(8)
Figure 4
Simulated image of the sky viewed from Mars on the 5th of March 2015, at midnight (00:00 UTC). In it: Jupiter and its moons (Io, Europa, Ganymede and Callisto) with some stars in the background.

The targets and the stars in Fig. 4 were manually identified. For the algorithm test, Jupiter (target A), Callisto (target B), Europa (target C), 34 Leo (star 1), HIP 50950 (star 2), HIP 51179 (star 3), HIP 50473 (star 4) and Regulus (star 5) were selected. The coordinates (xic, yic) of each target or star in the image were calculated using a centroid calculation algorithm based on the weighted average of the position and brightness of each pixel in the group of pixels defining the image of each object. The results (centroid coordinates) are presented in Table 1.
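A brightness-weighted centroid of the kind described can be computed as below. This is a generic NumPy sketch under the pixel-center convention of the non-normalized frame (center of the upper-left pixel at (0.5, 0.5)), not necessarily the exact algorithm used in this work:

```python
import numpy as np

def centroid(pixels):
    """Brightness-weighted centroid of a group of pixels (subpixel accuracy).

    pixels: 2D array of brightness values for the region containing
    the object. Returns (xc, yc), with the center of the upper-left
    pixel taken as (0.5, 0.5)."""
    img = np.asarray(pixels, dtype=float)
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]            # row/column index of each pixel
    total = img.sum()
    xc = ((xs + 0.5) * img).sum() / total  # weight pixel centers by brightness
    yc = ((ys + 0.5) * img).sum() / total
    return xc, yc
```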

Table 1
Calculated centroids for selected targets and stars in Fig. 4 (image/pixel coordinates).

The ephemerides of the stars presented in Table 2 were obtained with the use of the software Stellarium (https://stellarium-web.org), which uses coordinates from star catalogues, like Gaia and Hipparcos, generally used for astronomy and engineering applications (Chéreau 2002). The positions of the targets in Table 3 were obtained with the use of the online software Horizons (JPL/NASA), expressed as position vectors (km) in Geocentric Equatorial Inertial coordinates (at the J2000 epoch).

Table 2
Ephemerides of the stars in Fig. 4 in Geocentric Inertial Equatorial J2000 coordinates.
Table 3
Position vectors of Jupiter, Europe and Callisto at the given epoch (Geocentric Inertial Equatorial J2000 coordinates).

Using Fig. 4, six tests were performed with the identified targets combined in pairs, together with two groups of three stars. Table 4 presents the results obtained for the calculated position of Mars.

From Table 4, one can observe that the obtained values are close to the expected ones. Moreover, the mean relative error of the results is 0.5%. Thus, the method can be considered valid when applied to a simulated image.
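The position fix underlying these tests (two targets with known ephemerides, each observed along a line-of-sight direction derived from the image and the surrounding stars) can be posed as a small least-squares problem. The sketch below is one plausible formulation under assumed geometry, not the paper's exact equations; the function name and interface are illustrative.

```python
import numpy as np

def observer_position(targets, los):
    """Least-squares observer position from two or more targets.

    targets: known target position vectors R_i (km, inertial frame);
    los: unit line-of-sight vectors u_i from observer to each target.
    Solves the stacked system  R + rho_i * u_i = R_i  for the observer
    position R and the (unknown) ranges rho_i.
    """
    n = len(targets)
    A = np.zeros((3 * n, 3 + n))
    b = np.zeros(3 * n)
    for i, (Ri, ui) in enumerate(zip(targets, los)):
        A[3*i:3*i+3, 0:3] = np.eye(3)          # coefficients of R
        A[3*i:3*i+3, 3 + i] = np.asarray(ui, float)  # coefficient of rho_i
        b[3*i:3*i+3] = np.asarray(Ri, float)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]  # observer position; sol[3:] would be the ranges

# Synthetic check: observer at (1, 2, 3), two targets along known directions
R = np.array([1.0, 2.0, 3.0])
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
t1 = R + 5.0 * u1
t2 = R + 7.0 * u2
print(observer_position([t1, t2], [u1, u2]))
```

With noisy line-of-sight directions (as from real centroids), the least-squares solution degrades gracefully, which is why combining targets in pairs, as in the six tests above, gives a useful spread of estimates.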

Table 4
Calculated Mars’ position vectors based on the processing of Fig. 4 (Geocentric Inertial Equatorial J2000 coordinates).

Once the method was validated on simulated images, tests with real images were performed. The data set used in this case was acquired during NASA's New Horizons mission by the Long-Range Reconnaissance Imager (LORRI) instrument. These images were chosen for their quality, their availability in grayscale (pixel values range from 0 to 255, from black to white), and the clear, accessible documentation of the image data. The images have a FOV of 0.29° × 0.29° and a resolution of 256 × 256 pixels (NASA/New Horizons).

The identification of the stars was made with the help of the Astrometry software (http://nova.astrometry.net/), while the targets were identified manually. The ephemerides of the stars were obtained from the Tycho-2 catalogue (ESA) through the VizieR system of the Centre de Données Astronomiques de Strasbourg (CDS), and are presented in Table 5. The position vectors of Pluto and Charon and the expected position vector of the New Horizons spacecraft were obtained from the Horizons tool (NASA/JPL).

Table 5
Ephemerides of the stars obtained from the real images in Tycho-2 stars catalogue.

In this work, five tests are presented using images taken from the 25th to the 27th of January 2015. They are shown in Fig. 5 and their respective data are presented in Table 6. They were collected from the PDS Image Atlas (NASA 2019). Since the images are provided in grayscale, no additional image processing was necessary, thus avoiding new error sources. However, in Fig. 5, a luminous tail can be observed on the right side of Pluto. This occurs because Pluto's brightness exceeds the saturation capacity of the LORRI detector. Although this factor may be considered a source of error, the tests were successfully completed.

Figure 5
Images of the New Horizons mission (NASA). In the images, Pluto, Charon, and some stars are identified. The images were used during the tests of the algorithms. Images correspondent of tests (a) 1, (b) 2, (c) 3, (d) 4 and (e) 5.
Table 6
Image data used in the tests with real navigation images.

Table 7 presents the results of the five tests. The mean relative error for these tests amounts to 2.06%; that is, the errors obtained are close to those obtained in the tests with simulated images. These results show that the method is also valid for real images and, therefore, its use can be considered as an element of an autonomous navigation system in a space mission.
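The text does not spell out its error metric; a common definition of the relative error of a position estimate, used here for illustration only, is the norm of the difference between the estimated and expected vectors divided by the norm of the expected vector.

```python
import numpy as np

def relative_error(estimated, expected):
    """Relative error of a position vector estimate (norm ratio)."""
    estimated = np.asarray(estimated, dtype=float)
    expected = np.asarray(expected, dtype=float)
    return np.linalg.norm(estimated - expected) / np.linalg.norm(expected)

# A 1 km error on a 100 km baseline is a 1% relative error
print(relative_error([101.0, 0.0, 0.0], [100.0, 0.0, 0.0]))  # -> 0.01
```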

Table 7
Results of five tests made with real images (km).

# CONCLUSION

A set of algorithms useful for optical navigation in the cruise phase of a space mission was described, with a view to their testing in the ASTER mission.

The modeling of a NAVCAM was performed and an optical navigation algorithm was created and successfully applied to a simulated navigation image and to real navigation images. Obtained results confirm the validity of the method, with a relative error of the order of 0.5% for the simulated image and 2% for the real images.

Clearly, the method is highly dependent on the quality of the images used and on the precision of the information extracted from them (that is, on the quality of the image processing employed). For this reason, the first error source is related to the selected images, since they present a luminous tail to the right of Pluto. Ideally, an image without this particularity would be preferred, because the centroid calculation used does not apply any correction for this error. One solution would be to build an image database using only high-quality real images. However, flawed images are common in space missions. Thus, a more interesting solution consists of searching the literature for algorithms that can deal with such flaws (of common use in star trackers).

Another suggestion for future studies is that these algorithms be implemented in a NAVCAM which, given the similarity between the devices, could be built by adapting a star tracker. This could be done by inserting an additional operation mode, the optical navigation mode, into the adapted star tracker control software. A critical point here is the target identification software, since star trackers are built to identify stars only. A solution could be to embed additional software in this system, like SIRA (the target identification software described in section 3.1), capable of recognizing targets (planets, moons, asteroids, comets) from the available measurements (images) and with the use of dynamic modeling. Star trackers usually have a wide FOV (the star tracker cited in section 3.2, for example, has a 20° × 20° FOV), so ensuring that enough images with at least two targets are provided to the navigation module is also a critical point.

# ACKNOWLEDGEMENTS

The authors express their gratitude to Marcus Vinícius Cardoso Macêdo for his contribution to a previous unpublished study related to this topic. We would also like to express our gratitude for the scholarship provided by the Programa de Educação Tutorial (PET) of the Brazilian Ministry of Education (MEC).

Peer Review History: Single Blind Peer Review.

# DATA AVAILABILITY STATEMENT

The data will be available upon request.

# FUNDING

Not applicable.

# REFERENCES

• [ESA] European Space Agency (2014) Comet 67P/C-G in Rosetta’s navigation camera. ESA. [accessed Aug 10 2020]. http://blogs.esa.int/rosetta/2014/06/25/comet-67pc-g-in-rosettas-navigation-camera/
• [NASA] National Aeronautics and Space Administration (2019) PDS Image Atlas. NASA. [accessed June 3 2019]. https://pds-imaging.jpl.nasa.gov/search
• [NASA] National Aeronautics and Space Administration (2020) Small-Body Database Lookup. NASA. [accessed Jul 29 2020]. https://ssd.jpl.nasa.gov/sbdb.cgi?sstr=153591
• Albuquerque BFC (2005) Estudo dos erros sistemáticos inerentes a um sensor de estrelas de cabeça fixa, quanto à localização relativa de estrelas (master’s thesis). São José dos Campos: Instituto Nacional de Pesquisas Espaciais. In Portuguese.
• Albuquerque BFC, Fialho MAA (2005) Estudo das fontes de erro existentes no sensor de estrelas autônomo em desenvolvimento no INPE. Aerospace Electronics Division: DEA-EO-005/05.
• Becker T, Howell E, Nolan M, Magri C, Pravec P, Taylor PA, Oey J, Higgins D, Világi J, Kornoš L, et al (2015) Physical modeling of triple near-Earth Asteroid (153591) 2001 SN263 from radar and optical light curve observations. Icarus 248:499-515. https://doi.org/10.1016/j.icarus.2014.10.048
• Bhaskaran S, Riedel JE, Synnott SP (1996) Autonomous optical navigation for interplanetary missions. Proceedings of the SPIE’s 1996 International Symposium on Optical Science, Engineering, and Instrumentation; 1996 Oct 28; Denver, CO. Bellingham: SPIE. https://doi.org/10.1117/12.255151
• Brum AGV, Fialho MAA, Selingardi ML, Borrego ND, Lourenço J (2013a) The Brazilian autonomous star tracker – AST. WSEAS Trans Syst 10(12):459-470.
• Brum AGV, Hussmann H, Wickhusen K, Stark A (2021) Encounter trajectories for deep space mission Aster to the triple near Earth asteroid 2001-SN263. The laser altimeter (ALR) point of view. Adv Space Res 67(1):648-661. https://doi.org/10.1016/j.asr.2020.10.042
• Brum AGV, Pilchowiski HU (1999) SPOT imaging instrument (HRV) modeling for interaction with the attitude determination and control system of the imaging satellite. Paper presented XV Congresso Brasileiro de Engenharia Mecânica. COBEM; Lindóia, São Paulo, Brazil. [accessed Aug 10 2020]. http://www.abcm.org.br/app/webroot/anais/cobem/1999/pdf/aaache.pdf
• Brum AGV, Pilchowski HU, Faria SD (2013b) Autonomous navigation of spacecraft in deep space missions. Paper presented 22nd International Congress of Mechanical Engineering. COBEM; Ribeirão Preto, São Paulo, Brazil. [accessed Aug 10 2020]. http://www.abcm.org.br/app/webroot/anais/cobem/2013/PDF/754.pdf
• Cangahuala A, Bhaskaran S, Owen B (2012) Science benefits of onboard spacecraft navigation. Eos Trans AGU 93(18):177-178. https://doi.org/10.1029/2012EO180001
• Chéreau F (2002) Stellarium. Poole: Noctua Software.
• Fialho MAA (2003) Ambiente de simulações e testes de algoritmos para sensores de estrelas autônomos (undergraduate thesis). São José dos Campos: Instituto Tecnológico de Aeronáutica. In Portuguese.
• Jones T, Lee P, Bellerose J, Fahnestock E, Farquhar R, Gaffey M, Heldmann J, Lawrence D, Nolan M, Prettyman T, et al (2011) Amor: A lander to investigate a C-type triple near-Earth asteroid system: 2001 SN263. Paper presented 42nd Lunar and Planetary Science Conference. Texas, United States of America. [accessed Aug 10 2020]. http://www.lpi.usra.edu/meetings/lpsc2011/pdf/2695.pdf
• Macau EEN, Winter OC, Velho HFC, Sukhanov AA, Brum AGV, Ferreira JL, Hetem, Sandonato GM, Sfair R (2011) The Aster mission: Exploring for the first time a triple system asteroid. Paper presented 62nd International Astronautical Congress. Cape Town, South Africa. [accessed Aug 10 2020]. https://repositorio.unesp.br/handle/11449/72980
• Scull JR (1966) Space Technology Volume IV: Spacecraft Guidance and Control. Washington, DC: NASA.
• Ning X, Huang P, Fang J (2013) A new celestial navigation method for spacecraft on gravity assist trajectory. Math Probl Eng 2013:950675. https://doi.org/10.1155/2013/950675
• Suetta E, Cherubini G, Mondello G, Piccini G (1999) Four cover mechanisms for Rosetta mission. Proceedings of the 8th European Space Mechanisms and Tribology Symposium; Toulouse, France. [accessed Aug 10 2020]. http://adsabs.harvard.edu/full/1999ESASP.438..127S
• Sukhanov AA, Velho HFC, Macau EE, Winter OC (2010) The Aster project: Flight to a near earth asteroid. Cosmic Res 48:443-450. https://doi.org/10.1134/S0010952510050114
• Tsuda Y, Yoshikawa M, Abe M, Minamino H, Nakazawa S (2013) System design of the Hayabusa 2 — Asteroid sample return mission to 1999 JU3. Acta Astronaut 91:356-362. https://doi.org/10.1016/j.actaastro.2013.06.028
• Uo M, Shirakawa K, Hasimoto T, Kubota T, Kawaguchi J (2006) Hayabusa touching-down to Itokawa - Autonomous guidance and navigation. J Space Technol Sci 22(1):32-41. https://doi.org/10.11230/jsts.22.1_32
• Wikipedia (2020) (153591) 2001 SN263. Wikipedia. [accessed Jul 8 2020]. https://en.wikipedia.org/wiki/(153591)_2001_SN263
• Winter OC, Valvano G, Moura TS, Borderes-Motta G, Amarante A, Sfair R (2020) Asteroid triple-system 2001 SN263: Surface characteristics and dynamical environment. Monthly Notices of the Royal Astronomical Society 492(3):4437-4455. https://doi.org/10.1093/mnras/staa097

# Publication Dates

• Publication in this collection
01 Apr 2022
• Date of issue
2022