Jitendra Malik once said, "Supervision is the opium of the AI researcher." Most deep learning techniques rely heavily on vast amounts of human-annotated labels to work effectively. In today's world, however, the rate of data creation greatly surpasses the rate of data annotation, and full reliance on human annotations is only a temporary means of solving current, closed problems in AI. In reality, only a tiny fraction of available data is ever annotated. Annotation-Efficient Learning (AEL) is the study of algorithms that train models effectively with fewer annotations. To thrive in AEL settings, we need deep learning techniques that rely less on manual annotations (e.g., image-level, bounding-box, and per-pixel labels) and instead learn useful information from unlabeled data. In this thesis, we explore five different techniques for AEL.