Generating Text Summaries Using GPT-2 on PyTorch | Paperspace Blog

In this article I describe an abstractive text summarization approach, first proposed in [1], and show how to fine-tune pre-trained Transformer decoder-based language models (GPT, GPT-2, and now GPT-3) on the CNN/Daily Mail text summarization dataset. Without adding any new parameters, we obtain a strong abstractive text summarizer after training for just 5 epochs on 3,000 examples from the training set.


This is a companion discussion topic for the original entry at https://blog.paperspace.com/generating-text-summaries-gpt-2
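
For readers who want a concrete picture of the approach described above, here is a minimal sketch (not the author's exact code, which lives in the linked repo). It assumes the Hugging Face transformers library (v4+) and PyTorch, uses a toy stand-in for the CNN/Daily Mail pairs, and illustrates the core idea: feed GPT-2 the article followed by a separator and the summary, and compute the language-modelling loss only over the summary tokens. The `SEP` string and `training_pairs` list are illustrative placeholders.

```python
# Hedged sketch: fine-tuning GPT-2 for abstractive summarization.
# Assumptions (not from the article's repo): Hugging Face `transformers` v4+,
# a " TL;DR: " separator, and a toy `training_pairs` list standing in for
# ~3,000 CNN/Daily Mail (article, summary) pairs.
import torch
from torch.optim import AdamW
from transformers import GPT2LMHeadModel, GPT2Tokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)
optimizer = AdamW(model.parameters(), lr=5e-5)

SEP = " TL;DR: "  # separator between article and summary

# Toy stand-in for the real (article, summary) pairs.
training_pairs = [
    ("The quick brown fox jumped over the lazy dog near the river bank.",
     "A fox jumped over a dog."),
]

def make_example(article, summary, max_len=1024):
    # Encode "article + separator + summary" as a single sequence.
    article_ids = tokenizer.encode(article + SEP)
    summary_ids = tokenizer.encode(summary + tokenizer.eos_token)
    input_ids = (article_ids + summary_ids)[:max_len]
    # -100 masks the article portion so the loss covers summary tokens only.
    labels = ([-100] * len(article_ids) + summary_ids)[:max_len]
    return torch.tensor([input_ids]), torch.tensor([labels])

model.train()
for epoch in range(5):  # 5 epochs, as in the article
    for article, summary in training_pairs:
        input_ids, labels = make_example(article, summary)
        loss = model(input_ids.to(device), labels=labels.to(device)).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

At inference time, the same model can be prompted with `article + SEP` and decoded greedily or with sampling to produce a summary.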

Can you please share a requirements file with the versions you used in your GitHub repo?

Added to the repo. Since I did this work a while back, I don't have the complete details of the environment, but I have added all the essential dependencies I could find.

Hi Rohit,

Great article. I'm new to abstractive text summarization, and I noticed that the code in your GitHub repo is a bit more extensive than the snippets in the article. Should I consult your GitHub repo instead of the code in the article? I'm looking to reproduce your experiments with another dataset. Let me know.

-Bryan


The article should be enough to get you going. However, you may find more detail in the GitHub repo.
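
If it helps anyone adapting this to another dataset: one way (a hedged sketch, not the repo's actual Dataset class) is to wrap your data in a `torch.utils.data.Dataset` that yields the same "article + separator + summary" token sequences, then reuse the fine-tuning loop above. The JSON-lines file format and the `article`/`summary` field names below are assumptions; adjust them to your data.

```python
# Hedged sketch of a custom dataset wrapper; the file format, field names, and
# " TL;DR: " separator are illustrative assumptions, not the repo's actual code.
import json
import torch
from torch.utils.data import Dataset
from transformers import GPT2Tokenizer

class SummarizationDataset(Dataset):
    def __init__(self, path, max_len=1024):
        self.tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
        self.max_len = max_len
        with open(path) as f:
            # One JSON object per line: {"article": "...", "summary": "..."}
            self.records = [json.loads(line) for line in f]

    def __len__(self):
        return len(self.records)

    def __getitem__(self, idx):
        rec = self.records[idx]
        article_ids = self.tokenizer.encode(rec["article"] + " TL;DR: ")
        summary_ids = self.tokenizer.encode(rec["summary"] + self.tokenizer.eos_token)
        input_ids = (article_ids + summary_ids)[: self.max_len]
        # Mask the article portion so the loss is computed on the summary only.
        labels = ([-100] * len(article_ids) + summary_ids)[: self.max_len]
        return torch.tensor(input_ids), torch.tensor(labels)
```

Because the sequences have different lengths, either use a `DataLoader` with `batch_size=1` or pad each batch to a common length before feeding it to the model.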