GPT-Jass: A Text-to-model Pipeline for ACT-R Models
The GPT family of large language models has garnered significant attention in the past year. Its ability to digest natural language has opened up natural-language problem domains that were previously intractable. We tasked GPT-3 with generating complex cognitive models from plain-text instructions. The quality of the generated models depends on the quality and quantity of fine-tuning samples, but is otherwise quite promising: the pipeline produced executable and correct models in four of six task areas.
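The pipeline boils down to prompting a fine-tuned completion model with task instructions and treating the completion as ACT-R model code. The sketch below is a minimal illustration of that step, not the authors' implementation: the fine-tuned model name, the prompt separator, and the stop sequence are placeholder assumptions, and the legacy OpenAI Python completions API (openai-python < 1.0) is assumed.

```python
# Minimal sketch: query a fine-tuned GPT-3 completion model to turn
# plain-text task instructions into ACT-R (Lisp) model code.
# Model name, prompt separator, and stop token are illustrative assumptions.
import openai  # expects OPENAI_API_KEY in the environment


def generate_actr_model(task_instructions: str,
                        model: str = "davinci:ft-your-org:actr") -> str:
    """Return the ACT-R model text completed by the fine-tuned model."""
    response = openai.Completion.create(
        model=model,
        prompt=task_instructions + "\n\n###\n\n",  # separator assumed to match fine-tuning format
        max_tokens=1024,
        temperature=0.0,           # deterministic output for easier evaluation
        stop=["END-OF-MODEL"],     # assumed stop token appended to training completions
    )
    return response["choices"][0]["text"]


if __name__ == "__main__":
    instructions = "Press the key that matches the letter shown on the screen."
    print(generate_actr_model(instructions))
```

The generated text can then be loaded into ACT-R and run, which is how executability and correctness of the models would be checked per task area.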