Current deep learning techniques for style transfer are not well suited to design support because their "one-shot" transfer does not fit exploratory design processes. To close this gap, we propose parametric transcription, which transcribes an end-to-end style transfer effect into parameter values of specific transformations available in an existing content editing tool. With this approach, users can imitate the style of a reference sample in a tool they are already familiar with and can thus easily continue exploring by manipulating the parameters. To enable this, we introduce a framework that uses an existing pretrained style transfer model to compute a perceptual style distance to the reference sample and applies black-box optimization to find the parameters that minimize this distance. Our experiments with various third-party tools, such as Instagram and Blender, show that our framework can effectively leverage deep learning techniques for computational design support.
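The core loop can be illustrated with a minimal sketch. Here `apply_edit` is a hypothetical stand-in for an editing tool's parametric transformation (brightness and contrast on a grayscale image), and `style_distance` is a hypothetical stand-in for the perceptual distance that a real system would compute from a pretrained style-transfer model's deep features; the black-box optimizer is off-the-shelf Nelder-Mead rather than any specific method from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def apply_edit(image, params):
    # Hypothetical editing-tool transformation: contrast scale + brightness shift.
    brightness, contrast = params
    return np.clip(contrast * image + brightness, 0.0, 1.0)

def style_distance(a, b):
    # Hypothetical perceptual style distance: squared difference of simple
    # image statistics. A real system would compare features from a
    # pretrained style-transfer network instead.
    stats = lambda x: np.array([x.mean(), x.std()])
    return float(np.sum((stats(a) - stats(b)) ** 2))

def transcribe(content, reference, x0=(0.0, 1.0)):
    # Black-box optimization: search for edit parameters that minimize the
    # style distance between the edited content and the reference sample.
    objective = lambda p: style_distance(apply_edit(content, p), reference)
    return minimize(objective, x0, method="Nelder-Mead").x

rng = np.random.default_rng(0)
content = rng.random((32, 32))
reference = np.clip(0.5 * rng.random((32, 32)) + 0.3, 0.0, 1.0)
params = transcribe(content, reference)
```

The returned `params` can then be applied inside the editing tool itself, so the user keeps a fully editable, parametric starting point instead of an opaque one-shot result.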