We propose a simple yet novel approach to improve completion in domain modeling activities. Our approach exploits the power of large language models through few-shot prompt learning, without the need to train or fine-tune those models on large datasets, which are scarce in this field. We implemented our approach and tested it on the completion of static and dynamic domain diagrams. Our initial evaluation shows that such an approach is effective and can be integrated in different ways into modeling activities.
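To make the idea of few-shot prompt learning for model completion concrete, the following is a minimal sketch, assuming an OpenAI-style chat API; the textual diagram notation, example pairs, prompt wording, and model name are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: few-shot prompting an LLM to suggest missing elements of a
# partial domain model. The lightweight textual notation used for the
# static (class) diagrams below is an illustrative assumption.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Few-shot examples: (partial model, completed model) pairs that show the
# LLM the expected completion behavior without any fine-tuning.
FEW_SHOT_EXAMPLES = [
    (
        "class Library { name }\nclass Book { title }",
        "class Library { name }\nclass Book { title, isbn }\n"
        "association Library 1 -- * Book",
    ),
    (
        "class Order { date }\nclass Customer { name }",
        "class Order { date, total }\nclass Customer { name, email }\n"
        "association Customer 1 -- * Order",
    ),
]

def suggest_completion(partial_model: str, model: str = "gpt-4o-mini") -> str:
    """Ask the LLM to complete a partial domain model, guided by few-shot examples."""
    messages = [{
        "role": "system",
        "content": "You complete partial domain models. "
                   "Return only the completed model in the same notation.",
    }]
    # Interleave the few-shot pairs as prior user/assistant turns.
    for partial, completed in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": partial})
        messages.append({"role": "assistant", "content": completed})
    messages.append({"role": "user", "content": partial_model})

    response = client.chat.completions.create(model=model, messages=messages)
    return response.choices[0].message.content

if __name__ == "__main__":
    print(suggest_completion("class Hotel { name }\nclass Room { number }"))
```

The same pattern could in principle be applied to dynamic diagrams by swapping in examples of partial and completed behavioral models.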