
Text2TextGeneration pipeline by Huggingface transformers


Huggingface released a pipeline called Text2TextGeneration under its NLP library transformers.

Text2TextGeneration is the pipeline for text-to-text generation using seq2seq models.

Text2TextGeneration is a single pipeline for many kinds of NLP tasks, such as question answering, sentiment classification, question generation, translation, paraphrasing, summarization, etc.

The Text2TextGenerationPipeline can currently be loaded from pipeline() using the task identifier "text2text-generation".

Let’s see how the Text2TextGeneration pipeline by Huggingface transformers can be used for these tasks.

1. Install the Transformers library in Colab.
!pip install transformers

or install it locally,

pip install transformers
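
The T5 checkpoints that the text2text-generation pipeline loads use a SentencePiece tokenizer; if that package is missing in your environment, installing it alongside transformers avoids a tokenizer error (this extra dependency is an assumption about your setup, not one of the original steps),

pip install sentencepiece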
2. Import the pipeline function from transformers,
from transformers import pipeline
3. Load the "text2text-generation" pipeline.
text2text = pipeline("text2text-generation") 
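
If no model name is given, the pipeline falls back to a default T5 checkpoint. You can also pin the checkpoint explicitly, which keeps results reproducible across library versions (a minimal sketch; t5-base is assumed here as the underlying model),

text2text = pipeline("text2text-generation", model="t5-base", tokenizer="t5-base")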
4. Task: Question Answering

Question answering is the task of extracting the answer to a question from a candidate context paragraph.

Huggingface transformers has a dedicated question-answering pipeline, but we are not going to use it here. Instead, we frame question answering as a text2text-generation task.

text2text("question: Which is capital city of India? context: New Delhi is India's capital")
[{'generated_text': 'New Delhi'}]
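
If you have many question/context pairs, a small helper that builds the T5-style prompt keeps the calls tidy (ask_question is a hypothetical helper written for this post, not part of the pipeline API),

def ask_question(question, context):
    # build the "question: ... context: ..." prompt and return only the answer string
    prompt = f"question: {question} context: {context}"
    return text2text(prompt)[0]["generated_text"]

ask_question("Which is capital city of India?", "New Delhi is India's capital")
# 'New Delhi'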
5. Task: Translation

Translation is the task of converting text from one language to another.

Let’s translate from English to French.

text2text("translate English to French: New Delhi is India's capital")

Let's see the translation,

[{'generated_text': "New Delhi est la capitale de l'Inde."}]

Translate from English to German,

text2text("translate English to German: New Delhi is India's capital")

The German translation,

[{'generated_text': 'Neu Delhi ist die Hauptstadt Indiens'}]
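
The pipeline also accepts a list of prompts, so several sentences can be translated in one call (a minimal sketch; the second sentence is made up for illustration),

sentences = ["New Delhi is India's capital", "I like natural language processing"]
prompts = ["translate English to German: " + s for s in sentences]
text2text(prompts)  # one generated translation comes back per input prompt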
6. Task: Summarization

Summarize a text document.

text2text("summarize: Natural language processing (NLP) is a subfield of linguistics, computer science, and artificial intelligence concerned with the interactions between computers and human language, in particular how to program computers to process and analyze large amounts of natural language data.")

Summarized text,

[{'generated_text': 'natural language processing (NLP) is a subfield of linguistics, computer science'}]
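
The summary above is cut short because the call relies on the model's default generation length. Generation arguments such as min_length and max_length are forwarded to generate(), so a longer summary can be requested (the values below are illustrative, not tuned),

nlp_text = ("Natural language processing (NLP) is a subfield of linguistics, computer science, "
            "and artificial intelligence concerned with the interactions between computers and "
            "human language, in particular how to program computers to process and analyze "
            "large amounts of natural language data.")
text2text("summarize: " + nlp_text, min_length=20, max_length=60)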
7. Task: Sentiment

Classify the sentiment of a text as positive or negative.

text2text("sst2 sentence: New Zealand is a beautiful country")

output:

[{'generated_text': 'positive'}]
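
To classify several sentences, prepend the same sst2 prefix to each one (a minimal sketch; the second sentence and its label are made up for illustration),

reviews = ["New Zealand is a beautiful country", "The service at that hotel was terrible"]
[text2text("sst2 sentence: " + r)[0]["generated_text"] for r in reviews]
# e.g. ['positive', 'negative']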
8. Task: Sentiment Span Extraction

Here we extract the phrase in the text that is responsible for its sentiment.

We will frame it as a question-answering task, with the sentiment label in place of the question.

text2text("question : positive context: New Zealand is a beautiful country.")

Here is the text span that is responsible for ‘positive’ sentiment,

[{'generated_text': 'a beautiful country'}]
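
The two steps can be chained: first predict the sentiment label with the sst2 prefix, then feed that label back in as the question to extract the supporting span (a sketch of the combined flow, not a separate pipeline feature),

sentence = "New Zealand is a beautiful country."
label = text2text("sst2 sentence: " + sentence)[0]["generated_text"]   # 'positive'
text2text(f"question: {label} context: {sentence}")                    # [{'generated_text': 'a beautiful country'}]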
9. Task: Question Generation

This task generates questions from a given context.

text2text = pipeline("text2text-generation", model = "valhalla/t5-base-e2e-qg")
text2text("generate questions : New Delhi is India's capital.", num_beams=4, max_length = 8)

Question generated,

[{'generated_text': "What city is India's capital"}]
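
The max_length of 8 in the call above also trims the question; raising it and forwarding num_return_sequences (which must not exceed num_beams) yields several candidate questions, since these generation arguments are passed through to generate() (the values are illustrative),

text2text("generate questions : New Delhi is India's capital.",
          num_beams=4, num_return_sequences=2, max_length=32)
# one {'generated_text': ...} dict is returned per candidate question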
10. Task: Paraphrasing

Given an input sentence, the goal of paraphrase generation is to produce an output sentence that is semantically identical to the input but varies in wording or syntax.

Here is an example,

text2text = pipeline('text2text-generation', model = "Vamsi/T5_Paraphrase_Paws")
text2text("paraphrase: This is something which I cannt understand at all.")

Output,

[{'generated_text': 'This is something that I cant understand at all.'}]
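
To get several alternative paraphrases, sampling can be enabled and multiple return sequences requested (the sampling values below are illustrative, not tuned),

text2text("paraphrase: The weather is too hot to go outside today.",
          do_sample=True, top_k=120, top_p=0.95, max_length=64, num_return_sequences=3)
# returns three candidate paraphrases, each as a {'generated_text': ...} dict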

Here is the Colab link

Let me know in the comments section if you are facing any issues.

My other articles about BERT,

Question answering using transformers and BERT

How to cluster text documents using BERT

How to do semantic document similarity using BERT

Zero-shot classification using Huggingface transformers

Summarize text document using transformers and BERT

Follow me on Twitter, Instagram, Pinterest, and Tumblr for new post notifications.
