Taking HuggingFace DistilBERT for a ride
Image from https://www.pexels.com/@kindelmedia/

I have never dealt with BERT (Bidirectional Encoder Representations from Transformers, a transformer-based machine learning technique for natural language processing), so I am trying it out. The Hugging Face ecosystem makes this simple experiment easy to get started with: there are already pre-trained models I can use, so I do not need to do any ML training myself.

Here is the code:

from transformers import DistilBertTokenizer, DistilBertForQuestionAnswering
import torch

tokenizer = DistilBertTokenizer.from_pretrained(
    "distilbert-base-uncased",
    return_token_type_ids=True
)
model = DistilBertForQuestionAnswering.from_pretrained(
    "distilbert-base-uncased-distilled-squad",
    return_dict=False
)

print('Enter your statement:')
context = input()
print()
print('Enter your question:')
question = input()

while question:
    # Encode the question and the context together as one input sequence
    encoding = tokenizer.encode_plus(question, context, return_tensors="pt")
    input_ids = encoding["input_ids"]
    attention_mask = encoding["attention_mask"]

    # With return_dict=False the model returns (start_logits, end_logits)
    with torch.no_grad():
        start_scores, end_scores = model(input_ids, attention_mask=attention_mask)

    # The answer is the span between the highest-scoring start and end tokens
    start = int(torch.argmax(start_scores))
    end = int(torch.argmax(end_scores))
    tokens = tokenizer.convert_ids_to_tokens(input_ids[0].tolist())
    answer = tokenizer.convert_tokens_to_string(tokens[start:end + 1])
    print(answer)

    print()
    print('Enter your question:')
    question = input()
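The loop above handles the start and end logits by hand. For comparison, here is a minimal sketch of the same extractive question answering done through the transformers pipeline API, which bundles tokenization, inference, and span decoding into a single call; the question and context strings are made up purely for illustration:

from transformers import pipeline

# Build a question-answering pipeline on the same SQuAD-distilled checkpoint
qa = pipeline(
    "question-answering",
    model="distilbert-base-uncased-distilled-squad"
)

# Hypothetical context and question, just for illustration
result = qa(
    question="Where is Hugging Face based?",
    context="Hugging Face is a company based in New York City."
)
print(result["answer"])  # the extracted answer span, e.g. "New York City"

The pipeline returns a dict holding the answer text, its character offsets into the context, and a confidence score, which is convenient when you only need the extracted span rather than the raw logits.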