aitextgen is a Python package that leverages PyTorch, Hugging Face Transformers, and pytorch-lightning, with specific optimizations for text generation using GPT-2, plus many added features. It is the successor to textgenrnn and gpt-2-simple, taking the best of both packages. (GitHub repo)
I think this might be a way to train a GPT model from scratch that runs in reverse. That is, given a target outcome, it should be possible to generate a cluster of text options that would produce that string on a regular (forward) GPT. This could be verified by feeding each candidate into a forward GPT and evaluating the likelihood that the originating/target text gets generated.
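A minimal sketch of the data-prep step this idea would need, assuming whitespace tokenization as a stand-in for GPT-2's BPE tokenizer (names here are illustrative, not aitextgen's API): flip the token order of every training line so the model learns to predict text right-to-left; at inference time, generate from the reversed target and flip the output back into forward order before scoring it with a regular GPT.

```python
def reverse_tokens(text: str) -> str:
    """Reverse the token order of one line (whitespace tokens stand in for BPE)."""
    return " ".join(reversed(text.split()))

def make_backwards_corpus(lines):
    """Prepare training data for a reverse GPT: each line flipped token-wise."""
    return [reverse_tokens(line) for line in lines]

corpus = ["the cat sat on the mat", "dogs chase cats"]
backwards = make_backwards_corpus(corpus)

# Reversing is its own inverse, so text generated by the backwards model
# can be flipped into forward order and handed to a forward GPT for scoring.
assert [reverse_tokens(line) for line in backwards] == corpus
```

The symmetry in the final assertion is what makes the verification loop work: any continuation the reverse model produces for a reversed target is, after flipping, a candidate prefix whose likelihood of generating the target can be measured on a forward model.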