This paper deals with the problem of numerically computing the roots of the polynomials $p_k(x)$, $k=1,2,\ldots$, of degree $n=2^k-1$, recursively defined by $p_1(x)=x+1$, $p_k(x)=xp_{k-1}(x)^2+1$. An algorithm is provided based on the Ehrlich-Aberth simultaneous iteration, complemented by the Fast Multipole Method and by a fast search for near neighbors in a set of complex numbers. The algorithm has a cost of $O(n\log n)$ arithmetic operations per step. A Fortran 95 implementation is given and numerical experiments are carried out. Experimentally, it turns out that the number of iterations needed to reach numerical convergence is $O(\log n)$. This allows us to compute the roots of $p_k(x)$ up to degree $n=2^{24}-1$ in about 16 minutes on a laptop with 16 GB RAM, and up to degree $n=2^{28}-1$ in about one hour on a machine with 256 GB RAM. The case of degree $n=2^{30}-1$ would require more memory and higher precision to separate the roots. With a suitable adaptation of the FMM to the 256 GB RAM limit, and by performing the computation in extended precision (i.e., with a 10-byte floating-point representation), we were able to compute all the roots for $n=2^{30}-1$ in about two weeks of CPU time. From the experimental analysis, explicit asymptotic expressions for the real roots of $p_k(x)$ and an explicit expression for $\min_{i\ne j}|\xi_i^{(k)}-\xi_j^{(k)}|$, where $\xi_i^{(k)}$ are the roots of $p_k(x)$, are deduced. The approach is extended to classes of polynomials defined by a doubling recurrence.
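To illustrate the ingredients named in the abstract, here is a minimal Python sketch of the Ehrlich-Aberth iteration, with the Newton corrections $p_k/p_k'$ evaluated directly through the doubling recurrence rather than from the coefficients. It is not the paper's Fortran 95 code: the pairwise sums are computed naively in $O(n^2)$ instead of with the FMM, and the function names (`eval_pk`, `ehrlich_aberth`), the tolerance, and the unit-circle starting points are illustrative assumptions.

```python
import numpy as np

def eval_pk(k, z):
    """Evaluate p_k and p_k' at the points z via the doubling recurrence
    p_1(x) = x + 1,  p_k(x) = x*p_{k-1}(x)**2 + 1,  hence
    p_k'(x) = p_{k-1}(x)**2 + 2*x*p_{k-1}(x)*p_{k-1}'(x)."""
    p = z + 1.0
    dp = np.ones_like(z)
    for _ in range(k - 1):
        p, dp = z * p**2 + 1.0, p**2 + 2.0 * z * p * dp
    return p, dp

def ehrlich_aberth(k, z, maxit=200, tol=1e-13):
    """Ehrlich-Aberth simultaneous iteration with naive O(n^2) pairwise
    sums (the paper accelerates this step with the FMM to O(n log n))."""
    for _ in range(maxit):
        p, dp = eval_pk(k, z)
        newton = p / dp                         # Newton corrections p/p'
        diff = z[:, None] - z[None, :]
        np.fill_diagonal(diff, 1.0)             # placeholder to avoid 0/0
        inv = 1.0 / diff
        np.fill_diagonal(inv, 0.0)              # drop the j = i terms
        s = inv.sum(axis=1)                     # sum_{j != i} 1/(z_i - z_j)
        corr = newton / (1.0 - newton * s)      # Ehrlich-Aberth correction
        z = z - corr
        if np.max(np.abs(corr)) <= tol * np.max(np.abs(z)):
            break
    return z

# Degree n = 2^k - 1; unit-circle starting points (an illustrative choice).
k = 3
n = 2**k - 1
z0 = np.exp(2j * np.pi * (np.arange(n) + 0.5) / n)
print(np.sort_complex(ehrlich_aberth(k, z0)))
```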