Haystack is a scalable question answering (QA) framework for searching large collections of documents. See how to build a QA system using Haystack and the InMemoryDocumentStore.
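A minimal sketch of such a pipeline, assuming the Haystack 1.x API (an InMemoryDocumentStore with BM25 enabled, plus a FARMReader loading the deepset/roberta-base-squad2 checkpoint); the imports and components differ in Haystack 2.x, and the documents here are placeholders:

```python
# Extractive QA over an in-memory document store (Haystack 1.x API).
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

# Keep a few documents in RAM; use_bm25=True enables keyword retrieval.
document_store = InMemoryDocumentStore(use_bm25=True)
document_store.write_documents([
    {"content": "Haystack is an open-source framework for building search systems."},
    {"content": "The InMemoryDocumentStore keeps all documents in RAM, which is handy for prototyping."},
])

# The retriever narrows down candidate documents; the reader extracts the answer span.
retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")

pipeline = ExtractiveQAPipeline(reader=reader, retriever=retriever)
result = pipeline.run(
    query="What is the InMemoryDocumentStore useful for?",
    params={"Retriever": {"top_k": 2}, "Reader": {"top_k": 1}},
)
print(result["answers"][0].answer)
```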
See how to do topic modeling using RoBERTa and transformers. We will use a pre-trained RoBERTa model fine-tuned on the NLI dataset to get sentence embeddings and then cluster them into topics.
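A rough sketch of the idea, assuming a sentence-transformers RoBERTa checkpoint trained on NLI (the model name below is an assumption) and K-means for grouping, with the most frequent terms per cluster used as topic keywords:

```python
# Topic modeling: embed documents with an NLI-fine-tuned RoBERTa, then cluster.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "The stock market rallied after the central bank cut interest rates.",
    "Investors worry about inflation and rising bond yields.",
    "The new smartphone has a faster chip and a better camera.",
    "Chipmakers announced record sales of mobile processors.",
]

# Assumed checkpoint: a RoBERTa model fine-tuned on NLI data for sentence embeddings.
model = SentenceTransformer("sentence-transformers/nli-roberta-base-v2")
embeddings = model.encode(docs)

# Cluster the embeddings; each cluster is treated as one topic.
n_topics = 2
labels = KMeans(n_clusters=n_topics, random_state=0, n_init=10).fit_predict(embeddings)

# Describe each topic by the most frequent terms in its documents.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs).toarray()
terms = np.array(vectorizer.get_feature_names_out())
for topic in range(n_topics):
    topic_counts = counts[labels == topic].sum(axis=0)
    print(f"Topic {topic}:", terms[topic_counts.argsort()[::-1][:5]])
```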
See how to do conversational response generation using DialoGPT, a state-of-the-art dialogue response generation model for multi-turn conversations.
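A short sketch of multi-turn generation with DialoGPT, roughly following the pattern from the model card; the user turns below are just placeholders:

```python
# Multi-turn chat with DialoGPT: each turn is appended to the running history.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

chat_history_ids = None
for user_input in ["Hello, how are you?", "Do you like movies?"]:
    # Encode the new user turn, terminated by the end-of-sequence token.
    new_input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
    # Append it to the conversation so far.
    bot_input_ids = (
        torch.cat([chat_history_ids, new_input_ids], dim=-1)
        if chat_history_ids is not None
        else new_input_ids
    )
    # Generate the bot's reply conditioned on the whole history.
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
    )
    reply = tokenizer.decode(
        chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True
    )
    print("Bot:", reply)
```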
How do you do question answering with Hugging Face transformers and BERT? See how you can use the transformers pipeline for extractive question answering with a BERT model.
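A minimal example of the question-answering pipeline with a BERT checkpoint fine-tuned on SQuAD; the question and context are placeholders:

```python
# Extractive question answering with the transformers pipeline and a SQuAD-fine-tuned BERT.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT is a transformer-based language model introduced by Google in 2018. "
    "It is pre-trained on large text corpora and fine-tuned for downstream tasks "
    "such as question answering."
)
result = qa(question="Who introduced BERT?", context=context)
print(result["answer"], result["score"])
```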
Cluster text documents using BERT embeddings and K-means. See how you can apply the K-means algorithm to the embeddings to group similar documents.
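One way to sketch this, assuming mean-pooled bert-base-uncased token embeddings as document vectors and scikit-learn's K-means for the clustering; the sample documents are placeholders:

```python
# Cluster documents: mean-pool BERT token embeddings, then run K-means.
import torch
from sklearn.cluster import KMeans
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

docs = [
    "The football team won the championship last night.",
    "The striker scored twice in the final match.",
    "The company reported strong quarterly earnings.",
    "Shareholders approved the merger at the annual meeting.",
]

with torch.no_grad():
    encoded = tokenizer(docs, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**encoded).last_hidden_state               # (batch, seq_len, hidden)
    mask = encoded["attention_mask"].unsqueeze(-1)             # ignore padding tokens
    embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # mean pooling

labels = KMeans(n_clusters=2, random_state=0, n_init=10).fit_predict(embeddings.numpy())
for doc, label in zip(docs, labels):
    print(label, doc)
```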
Summarize text documents using Hugging Face transformers and BERT. Try different transformer models for summarization and compare their performance.
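A small sketch of comparing summarization models with the transformers pipeline; the BART and T5 checkpoints below are common examples chosen for illustration, not necessarily the models used in the tutorial, and the article text is a placeholder:

```python
# Compare summaries from different transformer checkpoints via the summarization pipeline.
from transformers import pipeline

article = (
    "Transformer models have become the dominant architecture in natural language "
    "processing. Pre-trained on large corpora, they can be fine-tuned for tasks such "
    "as translation, question answering and summarization, often with relatively "
    "little task-specific data."
)

# Example checkpoints to compare; swap in any summarization-capable model.
for model_name in ["facebook/bart-large-cnn", "t5-small"]:
    summarizer = pipeline("summarization", model=model_name)
    summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
    print(model_name, "->", summary[0]["summary_text"])
```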