Category: Huggingface
-
QA system using Haystack and InMemoryDocumentStore
Haystack is a scalable framework for building QA systems that search large collections of documents. See how to build a QA system using Haystack and the InMemoryDocumentStore.
-
Topic modeling using Roberta and transformers
See how to do topic modeling using Roberta and transformers. We will use a pre-trained Roberta model fine-tuned on the NLI dataset to get document embeddings, and then perform topic modeling on them.
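The embed-then-cluster idea can be sketched as follows; `roberta-base` stands in for the NLI-fine-tuned checkpoint used in the tutorial, and mean pooling over token states is one common way to get a document vector:

```python
import torch
from sklearn.cluster import KMeans
from transformers import AutoModel, AutoTokenizer

# roberta-base is a stand-in; the tutorial uses an NLI-fine-tuned Roberta.
tok = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")

docs = [
    "The stock market rallied after the rate decision.",
    "Investors cheered strong quarterly earnings.",
    "The team won the championship in overtime.",
    "A late goal sealed the victory for the home side.",
]

def embed(texts):
    # Mean-pool the last hidden state over non-padding tokens.
    enc = tok(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc).last_hidden_state
    mask = enc["attention_mask"].unsqueeze(-1)
    return ((out * mask).sum(1) / mask.sum(1)).numpy()

# Each K-means cluster of embeddings is treated as one "topic".
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embed(docs))
for topic in range(2):
    print(topic, [d for d, l in zip(docs, labels) if l == topic])
```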
-
Conversational response generation using DialoGPT
See how to do conversational response generation using DialoGPT, a state-of-the-art dialogue response generation model for multi-turn conversations.
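A short sketch of the multi-turn loop, assuming the `microsoft/DialoGPT-small` checkpoint; the key detail is that each turn ends with the EOS token and the generated history is fed back in:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

history = None
for user_input in ["Hello, how are you?", "What are you doing today?"]:
    # Each user turn is terminated by EOS; history carries the whole dialogue.
    new_ids = tok.encode(user_input + tok.eos_token, return_tensors="pt")
    input_ids = new_ids if history is None else torch.cat([history, new_ids], dim=-1)
    history = model.generate(input_ids, max_length=200, pad_token_id=tok.eos_token_id)
    # The reply is everything generated after the prompt tokens.
    reply = tok.decode(history[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("Bot:", reply)
```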
-
Faster transformer NLP pipeline using ONNX
See how ONNX can be used to speed up CPU inference of the Huggingface transformers NLP pipeline with only a few changes.
-
Text2TextGeneration pipeline by Huggingface transformers
Text2TextGeneration is a single pipeline for many kinds of NLP tasks, such as question answering, sentiment classification, question generation, translation, paraphrasing, and summarization. Let's see how the Text2TextGeneration pipeline from Huggingface transformers can be used for these tasks.
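A minimal sketch with `t5-small` (an illustrative model choice): the task is selected purely by the text prefix, so one pipeline handles translation, summarization, and more:

```python
from transformers import pipeline

# T5 treats every task as text-to-text, so one pipeline covers many tasks.
t2t = pipeline("text2text-generation", model="t5-small")

# Translation: the "translate English to French:" prefix selects the task.
fr = t2t("translate English to French: How are you?")[0]["generated_text"]
print(fr)

# Summarization: same pipeline, different prefix.
summary = t2t(
    "summarize: Huggingface transformers provide thousands of pretrained "
    "models for tasks such as translation, summarization, and question "
    "answering, all accessible through a simple pipeline API."
)[0]["generated_text"]
print(summary)
```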
-
Question answering using transformers and BERT
See how to do question answering using Huggingface transformers and BERT via the question-answering pipeline.
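A minimal sketch, assuming `deepset/bert-base-cased-squad2` as the BERT checkpoint (an illustrative choice of SQuAD-fine-tuned model):

```python
from transformers import pipeline

# A BERT model fine-tuned on SQuAD2 extracts the answer span from the context.
qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")

result = qa(
    question="What does the question-answering pipeline return?",
    context="The question-answering pipeline returns the answer span, "
            "its score, and its start and end character offsets.",
)
print(result["answer"], result["score"])
```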
-
How to cluster text documents using BERT
Cluster text documents using BERT embeddings and K-means. See how you can apply the K-means algorithm to the embeddings to cluster documents.
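A sketch of the idea with `bert-base-uncased`, using the `[CLS]` token state as the document vector (mean pooling is a common alternative):

```python
import torch
from sklearn.cluster import KMeans
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

docs = [
    "Interest rates were raised by the central bank.",
    "Bond yields climbed after the policy announcement.",
    "The striker scored twice in the final match.",
    "Fans celebrated the club's league title.",
]

# Use the [CLS] token embedding as a fixed-size document vector.
enc = tok(docs, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    emb = model(**enc).last_hidden_state[:, 0, :].numpy()

# K-means assigns each document vector to one of k clusters.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(emb)
print(list(labels))
```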
-
How to do semantic document similarity using BERT
To measure semantic similarity between documents, compute their embeddings using BERT and calculate the cosine similarity score between them.
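A minimal sketch with `bert-base-uncased` and mean-pooled token embeddings (one common pooling choice):

```python
import torch
from sklearn.metrics.pairwise import cosine_similarity
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text):
    # Mean-pool the token states into one document vector of shape (1, 768).
    enc = tok(text, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc).last_hidden_state
    return out.mean(dim=1).numpy()

a = embed("The cat sat on the mat.")
b = embed("A feline was resting on the rug.")
score = cosine_similarity(a, b)[0][0]
print(score)
```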
-
Zero-shot classification using Huggingface transformers
Learn how to do zero-shot classification of text using the Huggingface transformers pipeline. Also, see where it fails and how to resolve it.
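A minimal sketch using `facebook/bart-large-mnli` (the usual choice for this pipeline): the NLI model scores each candidate label as a hypothesis about the text, so no task-specific training is needed:

```python
from transformers import pipeline

# An NLI model scores each candidate label as an entailment hypothesis.
clf = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = clf(
    "The new GPU doubles training throughput.",
    candidate_labels=["technology", "sports", "cooking"],
)
# Labels come back sorted by score, highest first.
print(result["labels"][0], result["scores"][0])
```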
-
Summarize text document using transformers and BERT
Summarize text documents using Huggingface transformers and BERT. Try different transformer models for summarization and compare their performance.
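A minimal sketch with the summarization pipeline; `sshleifer/distilbart-cnn-12-6` is an illustrative model choice, and swapping in other checkpoints is how the tutorial compares performance:

```python
from transformers import pipeline

# Model is an illustrative choice; swap in others to compare summaries.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

text = (
    "Huggingface transformers make state-of-the-art NLP models available "
    "through a unified API. The summarization pipeline wraps sequence-to-"
    "sequence models that compress a long document into a short abstract, "
    "and different checkpoints trade off speed, length, and quality."
)
summary = summarizer(text, max_length=40, min_length=10, do_sample=False)[0]["summary_text"]
print(summary)
```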