Advancing artificial intelligence for the physical sciences requires representations that are both interpretable and compatible with the underlying laws of nature. We introduce METASTRINGS, a symbolic language for photonics that expresses nanostructures as textual sequences encoding materials, geometries, and lattice configurations. Analogous to molecular textual representations in chemistry, METASTRINGS provides a framework connecting human interpretability with computational design by capturing the structural hierarchy of photonic metasurfaces. Building on this representation, we develop Meta-GPT, a transformer-based foundation model trained on METASTRINGS and fine-tuned with physics-informed supervised, reinforcement, and chain-of-thought learning. Across various design tasks, the model achieves <3% mean-squared spectral error and maintains >98% syntactic validity, generating diverse metasurface prototypes whose experimentally measured optical responses match their target spectra. These results demonstrate that Meta-GPT can learn the compositional rules of light-matter interactions through METASTRINGS, laying a rigorous foundation for AI-driven photonics and representing an important step toward a metasurface genome project.
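The abstract does not reproduce the METASTRINGS grammar itself, so the following Python toy is a purely illustrative sketch of the idea: a string built from a few bracketed fields (material, geometric primitive, lattice) together with a regex-based syntactic-validity check of the kind implied by the reported >98% validity metric. The token names, field order, and grammar below are invented for illustration and are not the paper's actual syntax.

```python
import re

# Hypothetical METASTRINGS-like unit cell (illustrative only): a material token,
# a geometry token with dimensions in nm, and a lattice token with pitch in nm.
EXAMPLE = "[TiO2][cyl:d=250,h=600][sq:p=400]"

# One bracketed token per field: either a bare material name or a keyword
# followed by comma-separated numeric parameters.
TOKEN = r"\[(?:[A-Za-z0-9]+|[a-z]+:(?:[a-z]+=\d+)(?:,[a-z]+=\d+)*)\]"
GRAMMAR = re.compile(rf"^(?:{TOKEN}){{3}}$")

def is_syntactically_valid(s: str) -> bool:
    """Return True if the string matches the toy three-token grammar."""
    return GRAMMAR.fullmatch(s) is not None

print(is_syntactically_valid(EXAMPLE))           # True
print(is_syntactically_valid("[TiO2][cyl:d=]"))  # False: malformed geometry token
```

In this toy setting, syntactic validity is simply the fraction of generated strings accepted by the grammar; the paper's validity figure would be computed against the real METASTRINGS specification rather than this placeholder.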