State minimization of combinatorial filters is a fundamental problem that arises, for example, in building cheap, resource-efficient robots. Exact minimization, however, is known to be NP-hard. This paper conducts a more nuanced analysis of this hardness than has been available to date, and uncovers two factors that contribute to the complexity. We show that each factor is a distinct source of the problem's hardness, thereby shedding light on the roles played by (1) the structure of the graph that encodes compatibility relationships, and (2) determinism-enforcing constraints. Just as a line of prior work has sought to introduce additional assumptions and to identify sub-classes that admit practical state reduction, we next use this new, sharper understanding to explore special cases for which exact minimization is efficient. We introduce a new constraint-repair algorithm that applies to a large sub-class of filters, subsuming three distinct special cases for which polynomial-time optimal minimization was previously known to be possible. While the efficiency in each of these three cases previously appeared to stem from dissimilar properties, their commonality becomes clear when seen through the lens of the present work. We also provide entirely new families of filters that are efficiently reducible.
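To make the two factors concrete, the following is a hedged sketch of one standard formalization (our notation, not necessarily the definitions used in this paper): minimizing a filter with state set $V$ is commonly cast as a covering problem over a compatibility graph, subject to additional constraints that keep the merged filter deterministic.
\[
\begin{aligned}
 & G = (V, E), \qquad E = \bigl\{\,\{u,v\} \subseteq V : u \text{ and } v \text{ produce identical outputs on every observation sequence both can process}\,\bigr\},\\
 & \text{minimize } |K| \text{ over clique covers } K \text{ of } G, \quad \text{subject to: merging the states within each clique of } K \text{ yields a deterministic filter.}
\end{aligned}
\]
Under this reading, factor (1) concerns the structure of $G$, while factor (2) concerns the determinism requirement imposed on the cover.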