We introduce optHIM, an open-source library of unconstrained continuous optimization algorithms implemented in PyTorch for both CPU and GPU. By leveraging PyTorch's autograd, optHIM seamlessly integrates function, gradient, and Hessian information into flexible line-search and trust-region methods. We evaluate eleven state-of-the-art line-search and trust-region variants on benchmark problems spanning convex and non-convex landscapes. Through a suite of quantitative metrics and qualitative analyses, we demonstrate each method's strengths and trade-offs. optHIM aims to democratize advanced optimization by providing a transparent, extensible, and efficient framework for research and education.
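To make the autograd claim concrete, the sketch below shows in plain PyTorch (not optHIM's own API, which the abstract does not specify) how function, gradient, and Hessian information can be obtained for a benchmark objective and combined into the Newton-type direction that line-search and trust-region methods build on; the `rosenbrock` objective and variable names are illustrative assumptions.

```python
import torch

def rosenbrock(x):
    # Classic non-convex benchmark: f(x) = 100*(x1 - x0^2)^2 + (1 - x0)^2
    return 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2

x = torch.tensor([-1.2, 1.0], requires_grad=True)

f = rosenbrock(x)                                     # function value
(g,) = torch.autograd.grad(f, x)                      # gradient via autograd
H = torch.autograd.functional.hessian(rosenbrock, x)  # Hessian via autograd

# Newton direction H d = -g, the building block of Newton-type
# line-search steps and of trust-region subproblem solvers
d = torch.linalg.solve(H, -g)
print(f"f = {f.item():.4f}, direction d = {d}")
```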