The desire to apply machine learning techniques in safety-critical environments has renewed interest in the learning of partial functions for distinguishing between positive, negative, and unclear observations. We contribute to the understanding of the hardness of this problem. Specifically, we consider partial Boolean functions defined by a pair of Boolean functions $f, g \colon \{0,1\}^J \to \{0,1\}$ such that $f \cdot g = 0$ and such that $f$ and $g$ are defined by disjunctive normal forms or binary decision trees. We show: Minimizing the sum of the lengths or depths of these forms while separating disjoint sets $A$ and $B$ with $A \cup B = S \subseteq \{0,1\}^J$, such that $f(A) = \{1\}$ and $g(B) = \{1\}$, is inapproximable to within $(1 - \epsilon) \ln (|S|-1)$ for any $\epsilon > 0$, unless P = NP.
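To make the setup concrete, here is a minimal sketch in Python of how a pair $(f, g)$ with $f \cdot g = 0$, each given as a DNF, induces a partial function that labels an observation positive when $f$ accepts it, negative when $g$ accepts it, and unclear otherwise. The representation of DNFs as lists of term dictionaries and the example formulas are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# A DNF is a list of terms; each term maps a variable index to its required value (0/1).
# Illustrative example over J = {0, 1, 2}, not from the paper:
f_dnf = [{0: 1, 1: 1}]          # f(x) = x0 AND x1
g_dnf = [{0: 0}, {1: 0, 2: 1}]  # g(x) = (NOT x0) OR ((NOT x1) AND x2)

def eval_dnf(dnf, x):
    """Evaluate a DNF: true iff some term is satisfied by the assignment x."""
    return any(all(x[i] == v for i, v in term.items()) for term in dnf)

def classify(x):
    """Partial Boolean function defined by the pair (f, g) with f * g = 0:
    positive if f(x) = 1, negative if g(x) = 1, otherwise unclear."""
    if eval_dnf(f_dnf, x):
        return "positive"
    if eval_dnf(g_dnf, x):
        return "negative"
    return "unclear"

# Sanity check that f * g = 0, i.e. no input satisfies both DNFs.
assert not any(eval_dnf(f_dnf, x) and eval_dnf(g_dnf, x)
               for x in product((0, 1), repeat=3))

for x in product((0, 1), repeat=3):
    print(x, classify(x))
```

In the separation problem of the abstract, one would additionally require $f(a) = 1$ for all $a \in A$ and $g(b) = 1$ for all $b \in B$ while minimizing the total length or depth of the two forms; the sketch only illustrates the three-way labeling, not the optimization.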