quest for processing speed potential. In fact, we always get a fraction of the technically available computing power (the so-called {\em theoretical peak}), and the gap is likely to grow hand in hand with the hardware complexity of the target system. Among the key aspects of this complexity, we have: the {\em heterogeneity} of the computing units, the {\em memory hierarchy and partitioning} including the non-uniform memory access (NUMA) configuration, and the {\em interconnect} for data exchanges among the computing nodes. Scientific investigations and cutting-edge technical activities should ideally scale up with respect to sustained performance. The special case of quantitative approaches for solving (large-scale) problems deserves a particular focus. Indeed, most common real-life problems, even when considering the artificial intelligence paradigm, rely on optimization techniques for the main kernels of their algorithmic solutions. Mathematical programming and pure combinatorial methods are not easy to implement efficiently on large-scale supercomputers because of {\em irregular control flow}, {\em complex memory access patterns}, {\em heterogeneous kernels}, and {\em numerical issues}, to name a few. We describe and examine our thoughts from the standpoint of large-scale supercomputers.