When an optical beam propagates through a turbulent medium such as the atmosphere or the ocean, it becomes distorted. It is therefore natural to seek the beam that is least distorted under a given metric, such as received intensity or scintillation. We seek to maximize the light intensity at the receiver, using the paraxial wave equation under the weak-fluctuation approximation as the model. In contrast to classical results, which typically restrict the initial laser beam to a special class, we allow the beam to be general; this leads to an eigenvalue problem for a large matrix whose entries are multi-dimensional integrals. Evaluating these integrals directly is an expensive, and in many practically reasonable settings infeasible, computational task. To overcome this, we employ an asymptotic expansion and carry out the derivation in Fourier space, which allows us to incorporate optional assumptions on the turbulence, such as homogeneous statistics, a small-length-scale cutoff, and the Markov assumption, to reduce the dimension of the numerical integrals. The proposed methods provide a computationally feasible strategy, and results are demonstrated in several numerical examples.
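As a minimal sketch of how maximizing the received intensity leads to a matrix eigenvalue problem (the notation $\varphi_j$, $c$, $A$, and $U_j$ below is introduced here for illustration and is not taken from the paper), one may expand a general initial field in an orthonormal basis and write the mean intensity at the receiver as a quadratic form:
\[
u_0(\mathbf{x}) = \sum_{j=1}^{N} c_j\,\varphi_j(\mathbf{x}), \qquad \|c\|_2 = 1,
\]
\[
\big\langle I(L,\mathbf{x}_R) \big\rangle
  = \big\langle |u(L,\mathbf{x}_R)|^2 \big\rangle
  = \sum_{j,k=1}^{N} c_j\,\overline{c_k}\,
    \big\langle U_j(L,\mathbf{x}_R)\,\overline{U_k(L,\mathbf{x}_R)} \big\rangle
  =: c^{H} A\, c,
\]
where $U_j$ denotes the paraxial solution launched from initial data $\varphi_j$ and $L$ is the propagation distance. Maximizing $c^{H} A c$ subject to $\|c\|_2 = 1$ is a Hermitian eigenvalue problem: the optimal coefficient vector is the eigenvector of $A$ associated with its largest eigenvalue. Each entry $A_{jk}$ is a second moment of the random field, i.e. a multi-dimensional integral over the transverse variables, which is the computational bottleneck that the asymptotic expansion and Fourier-space reductions are intended to tame.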