The Hybrid Genetic Optimisation framework (HyGO) is introduced to meet the pressing need for efficient, unified optimisation frameworks that support both parametric and functional learning in complex engineering problems. Evolutionary algorithms are widely employed as derivative-free global optimisation methods but often suffer from slow convergence, especially during late-stage learning. HyGO integrates the global exploration capabilities of evolutionary algorithms with an accelerated local search for robust solution refinement. The key enabler is a two-stage strategy that balances exploration and exploitation. For parametric problems, HyGO alternates between a genetic algorithm and targeted refinement through a degradation-proof Downhill Simplex Method (DSM). For function optimisation tasks, HyGO alternates between genetic programming and the DSM. Validation is performed on (a) parametric optimisation benchmarks, where HyGO demonstrates faster and more robust convergence than standard genetic algorithms, and (b) function optimisation tasks, including control of a damped Landau oscillator. Practical relevance is showcased through aerodynamic drag reduction of an Ahmed body via Reynolds-Averaged Navier-Stokes simulations, yielding consistently interpretable results and drag reductions exceeding 20% through controlled jet injection at the rear of the body, which promotes flow reattachment and shrinks the separation bubble. Overall, HyGO emerges as a versatile hybrid optimisation framework suited to a broad spectrum of engineering and scientific problems involving parametric and functional learning.
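To make the two-stage strategy concrete, the sketch below illustrates, under stated assumptions, how a genetic algorithm (exploration) can be interleaved with degradation-proof Downhill Simplex refinement (exploitation) for a parametric problem. This is not the authors' implementation: the population size, mutation scale, refinement budget, and the Rosenbrock test function are illustrative assumptions, and the local search uses SciPy's Nelder-Mead routine as a stand-in for the DSM.

```python
# Minimal illustrative sketch (not the HyGO implementation) of alternating a
# genetic algorithm with degradation-proof Downhill Simplex (Nelder-Mead)
# refinement on a parametric benchmark. All parameter choices are assumptions.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Standard parametric benchmark, used here only for illustration.
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def ga_generation(pop, fitness, rng, mutation_scale=0.1):
    # One GA generation: tournament selection, blend crossover, Gaussian mutation.
    n, d = pop.shape
    new_pop = np.empty_like(pop)
    for i in range(n):
        a, b = rng.choice(n, 2, replace=False)
        p1 = pop[a] if fitness[a] < fitness[b] else pop[b]
        a, b = rng.choice(n, 2, replace=False)
        p2 = pop[a] if fitness[a] < fitness[b] else pop[b]
        w = rng.uniform(size=d)
        child = w * p1 + (1.0 - w) * p2                     # blend crossover
        child += rng.normal(scale=mutation_scale, size=d)   # mutation
        new_pop[i] = child
    return new_pop

def hybrid_optimise(f, dim=4, pop_size=40, generations=20, n_refine=3, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-2.0, 2.0, size=(pop_size, dim))
    for _ in range(generations):
        fitness = np.array([f(x) for x in pop])
        # Exploitation stage: refine the current elite with Nelder-Mead.
        # "Degradation-proof": keep the refined point only if it is no worse.
        for idx in np.argsort(fitness)[:n_refine]:
            res = minimize(f, pop[idx], method="Nelder-Mead",
                           options={"maxiter": 50, "xatol": 1e-6, "fatol": 1e-6})
            if res.fun <= fitness[idx]:
                pop[idx], fitness[idx] = res.x, res.fun
        # Exploration stage: evolve a new generation, retaining the best-so-far.
        best = pop[np.argmin(fitness)].copy()
        pop = ga_generation(pop, fitness, rng)
        pop[0] = best  # elitism
    fitness = np.array([f(x) for x in pop])
    i_best = np.argmin(fitness)
    return pop[i_best], fitness[i_best]

if __name__ == "__main__":
    x_best, f_best = hybrid_optimise(rosenbrock)
    print("best x:", x_best, "f(x):", f_best)
```

For function optimisation tasks the same alternation applies, with the GA step replaced by genetic programming over control-law expressions and the DSM acting on the numerical constants of the best candidate laws.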