## Why look for periodic orbits?

Periodic orbits play a central role in the description and analysis of dynamical systems, as they represent the skeleton of the dynamics: some important dynamical properties can be deduced from these simple orbits. A fair amount of information is provided by the linear stability properties of these invariant structures. For example, cycle expansions ordered by the length, stability, or action of these orbits are carried out to describe long-time behavior, such as averages of observables.

Various algorithms have been developed for finding periodic orbits. Among them, some are commonly used in atomic systems with a small number of electrons: the Newton-Raphson algorithm, or modified versions of it such as the damped Newton-Raphson algorithm, offers fast convergence provided one has a sufficiently accurate initial guess for the periodic orbit. The precision of this initial guess is a thorny problem, since the basin of attraction shrinks exponentially with the instability and the length of the orbit. Overall, the goal is to set up a deterministic dissipative dynamical system which has the periodic orbit as its attractor.

Independently of the chosen algorithm, the chance that the algorithm converges strongly depends on the basin of attraction of the periodic orbit in the deterministic dynamical system. Often there is a trade-off between the size of the basin of attraction and the speed of convergence of the algorithm. This is in particular the case for the Newton-Raphson method.

In [CNSNS **16**, 2845 (2011)] we proposed a method which extends this basin of attraction by means of a systematized trial-and-error converging procedure. To do so, we combine a deterministic method with a simulated annealing algorithm to refine initial guesses for the periodic orbit search. In other words, the goal of this stochastic method is to determine initial guesses accurate enough to lie in the basin of attraction of a fast converging algorithm such as the Newton-Raphson algorithm. As a consequence of the underlying stochastic nature of the algorithm, launching it several times yields several different periodic orbits of the considered dynamical system.

## Simulated annealing algorithm

The simulated annealing algorithm is a metaheuristic designed for constrained optimization, which generalizes ideas developed from the Metropolis algorithm. The main idea of the algorithm is to automate a trial-and-error search for a better solution within a controlled neighborhood whose diameter is successively reduced.

For a given function $F$, called the cost function, the goal is to approximate the global minimum of $F$ (within a given set). First a random initial condition $x_0$ is chosen to initiate the simulated annealing algorithm, and we define two parameters: $T_0$, the initial temperature, and $\varepsilon$, which serves as an error tolerance. The simulated annealing algorithm proceeds in successive stages of annealing and cooling steps.

- During the **annealing** process, a random trial guess $x'$ is chosen in the neighborhood of the current guess $x$ (the size of the neighborhood is controlled by the temperature $T$; for simplicity one usually chooses a ball with diameter $T$). It corresponds to a melting of the system with a controlled temperature $T$.
- If $F(x') < F(x)$ then $x'$ replaces $x$ in the following step.
- If not, $x'$ replaces $x$ with a probability $e^{-(F(x') - F(x))/T}$.
- Otherwise, the same $x$ is kept in the next step.

The process of melting for a fixed temperature $T$ is iterated until the system reaches a steady state.

- The **cooling** consists in decreasing the temperature $T$ to a new temperature $T' < T$ at the end of the annealing process. Then a new iteration of annealing, with the updated temperature $T'$, is performed.

The succession of annealing and cooling steps is iterated until the system reaches a global steady state, which is the resulting approximation of the global minimum. Since the size of the neighborhood in which the perturbation is performed is controlled by the temperature, it shrinks with successive cooling iterations. In the limit of zero temperature, the control neighborhood is reduced to a single point. We give a draft of the simulated annealing algorithm in the following figure.
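The annealing/cooling loop described above can be sketched as follows. This is a minimal, self-contained Python illustration, not the paper's code: the function names, the uniform trial-move rule, and the geometric cooling schedule are our own illustrative choices.

```python
import math
import random

def simulated_annealing(F, x0, T0=1.0, cooling=0.9, eps=1e-6,
                        steps_per_temp=200, seed=0):
    """Approximate the global minimum of the cost function F.

    T0 is the initial temperature, eps the stopping tolerance on T;
    all parameter values are illustrative, not the paper's settings.
    """
    rng = random.Random(seed)
    x, Fx = list(x0), F(x0)
    T = T0
    while T > eps:
        # Annealing: trial moves inside a ball whose diameter is set by T.
        for _ in range(steps_per_temp):
            trial = [xi + T * (rng.random() - 0.5) for xi in x]
            Ft = F(trial)
            # Accept downhill moves; accept uphill moves with the
            # Metropolis probability exp(-(F(x') - F(x)) / T).
            if Ft < Fx or rng.random() < math.exp(-(Ft - Fx) / T):
                x, Fx = trial, Ft
        # Cooling: shrink the temperature (and hence the neighborhood).
        T *= cooling
    return x, Fx

# Example: minimize F(x) = (x1 - 1)^2 + (x2 + 2)^2, global minimum at (1, -2).
best, cost = simulated_annealing(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2,
                                 x0=[0.0, 0.0])
```

Because the acceptance rule occasionally takes uphill moves at high temperature, the search can escape local minima before the shrinking neighborhood freezes it near the global one.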

## Simulated annealing algorithm for finding periodic orbits

Periodic orbits correspond to loops in phase space. As a consequence, if one takes the cost function $F$ to be the distance between the initial and final points of a given trajectory, the vanishing global minima of $F$ correspond to periodic orbits, which may thus be investigated using the simulated annealing algorithm. In the following figure, we give a draft of the simulated annealing algorithm adapted to the search for periodic orbits (user-defined constants are labeled by "#", and "%" starts comments; the function **ReturnDist** plays the role of the cost function $F$, and the function **rand** is a random generator uniformly distributed over $[0, 1]$).
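To make this cost function concrete, here is a sketch of the idea in Python, with a hand-written Runge-Kutta integrator and the harmonic oscillator standing in for the atomic dynamics; the names `rk4_flow` and `return_dist` are our own, with the latter playing the role the text assigns to **ReturnDist**.

```python
import math

def rk4_flow(field, x, t_final, n_steps=1000):
    """Integrate dx/dt = field(x) from t = 0 to t_final with classical RK4."""
    h = t_final / n_steps
    x = list(x)
    for _ in range(n_steps):
        k1 = field(x)
        k2 = field([xi + 0.5 * h * ki for xi, ki in zip(x, k1)])
        k3 = field([xi + 0.5 * h * ki for xi, ki in zip(x, k2)])
        k4 = field([xi + h * ki for xi, ki in zip(x, k3)])
        x = [xi + h / 6.0 * (a + 2 * b + 2 * c + d)
             for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]
    return x

def return_dist(field, x0, period):
    """Cost function: distance between the initial and final points
    of the trajectory started at x0 and integrated for one period."""
    xT = rk4_flow(field, x0, period)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xT, x0)))

# Harmonic oscillator q' = p, p' = -q: every orbit is periodic with
# period 2*pi, so the cost vanishes there and is nonzero otherwise.
osc = lambda x: [x[1], -x[0]]
d_good = return_dist(osc, [1.0, 0.0], 2 * math.pi)  # essentially zero
d_bad = return_dist(osc, [1.0, 0.0], 1.0)           # clearly nonzero
```

A minimizer driving `return_dist` toward zero over initial conditions (and, for non-isochronous systems, over the period as well) is then searching precisely for closed loops in phase space.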

To improve the convergence, it is more efficient to combine the simulated annealing algorithm with a second method, based for instance on a Newton-Raphson scheme. In the programs we use the function **fsolve** of **Matlab**, which is a Gauss-Newton minimizing scheme.
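As a rough illustration of this polishing stage (not the paper's code: a hand-written Newton iteration with a forward-difference Jacobian standing in for Matlab's **fsolve**, applied to a simple two-dimensional algebraic system):

```python
def newton_refine_2d(G, x, tol=1e-12, max_iter=50, h=1e-7):
    """Polish a rough 2D guess x by Newton iterations on G(x) = 0,
    using a forward-difference Jacobian. Illustrative stand-in for the
    fsolve refinement step, not the paper's actual implementation."""
    x = list(x)
    for _ in range(max_iter):
        f1, f2 = G(x)
        if abs(f1) < tol and abs(f2) < tol:
            break
        # Forward-difference Jacobian J = [[a, b], [c, d]] of G at x.
        f1x, f2x = G([x[0] + h, x[1]])
        f1y, f2y = G([x[0], x[1] + h])
        a, b = (f1x - f1) / h, (f1y - f1) / h
        c, d = (f2x - f2) / h, (f2y - f2) / h
        det = a * d - b * c
        # Newton step: solve J * (dx, dy) = -(f1, f2) explicitly (2x2).
        dx = (-d * f1 + b * f2) / det
        dy = (c * f1 - a * f2) / det
        x[0] += dx
        x[1] += dy
    return x

# A rough guess, of the quality simulated annealing would provide, for a
# root of G(x, y) = (x^2 + y^2 - 1, y - x); exact root: (1/sqrt(2), 1/sqrt(2)).
root = newton_refine_2d(lambda v: (v[0] ** 2 + v[1] ** 2 - 1.0, v[1] - v[0]),
                        [0.6, 0.8])
```

The stochastic stage only needs to land the guess inside the (small) basin of this fast iteration; the Newton stage then supplies the remaining digits of accuracy.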

The parameter #**DistError** is used to stop the simulated annealing algorithm when a sufficiently accurate approximation of the periodic orbit is reached (i.e., when the cost function is smaller than #**DistError**). It should be noted that even when the simulated annealing algorithm has not converged (the cost function never reached #**DistError**), the final output of the algorithm is worth investigating with a fast converging method, as it may nevertheless lie in the basin of attraction of this method.

In [CNSNS **16**, 2845 (2011)] we give evidence of the feasibility of the method by applying it to the helium atom in the ground state in one to three spatial dimensions.

## Reference and program