English distil_bert_ft_qa_model_7up_pipeline DistilBertForQuestionAnswering pipeline from cadzchua

Description

Pretrained DistilBertForQuestionAnswering pipeline, adapted from Hugging Face and curated to provide scalability and production-readiness using Spark NLP. distil_bert_ft_qa_model_7up_pipeline is an English model originally trained by cadzchua.

How to use


from sparknlp.pretrained import PretrainedPipeline

pipeline = PretrainedPipeline("distil_bert_ft_qa_model_7up_pipeline", lang="en")
annotations = pipeline.transform(df)


import com.johnsnowlabs.nlp.pretrained.PretrainedPipeline

val pipeline = new PretrainedPipeline("distil_bert_ft_qa_model_7up_pipeline", lang = "en")
val annotations = pipeline.transform(df)
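
To run the pipeline end to end you also need a DataFrame to annotate. The sketch below builds a small question/context DataFrame first; the input column names ("question", "context") and the output column ("answer") are assumptions based on typical Spark NLP question-answering pipelines, not details confirmed on this page.

import sparknlp
from sparknlp.pretrained import PretrainedPipeline

# Start a Spark session with Spark NLP
spark = sparknlp.start()

# Sample input; the "question" and "context" column names are assumptions
df = spark.createDataFrame(
    [("What kind of drink is 7up?", "7up is a lemon-lime flavored soft drink.")],
    ["question", "context"],
)

pipeline = PretrainedPipeline("distil_bert_ft_qa_model_7up_pipeline", lang="en")
annotations = pipeline.transform(df)

# Inspect the predicted answer spans ("answer" output column is an assumption)
annotations.select("answer.result").show(truncate=False)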

Model Information

Model Name: distil_bert_ft_qa_model_7up_pipeline
Type: pipeline
Compatibility: Spark NLP 5.5.0+
License: Open Source
Edition: Official
Language: en
Size: 247.3 MB

References

https://huggingface.co/cadzchua/distil-bert-ft-qa-model-7up

Included Models

  • MultiDocumentAssembler
  • DistilBertForQuestionAnswering
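
If you prefer to assemble these two stages yourself rather than load the bundled pipeline, a manual build along the following lines should be equivalent. This is a minimal sketch: the standalone model name ("distil_bert_ft_qa_model_7up") is inferred from the pipeline name, and the column names are assumptions.

from pyspark.ml import Pipeline
from sparknlp.base import MultiDocumentAssembler
from sparknlp.annotator import DistilBertForQuestionAnswering

# Pair each question with its context as two document columns
document_assembler = (
    MultiDocumentAssembler()
    .setInputCols(["question", "context"])
    .setOutputCols(["document_question", "document_context"])
)

# Standalone model name is an assumption inferred from the pipeline name
span_classifier = (
    DistilBertForQuestionAnswering.pretrained("distil_bert_ft_qa_model_7up", "en")
    .setInputCols(["document_question", "document_context"])
    .setOutputCol("answer")
)

pipeline = Pipeline(stages=[document_assembler, span_classifier])
model = pipeline.fit(df)  # df must have "question" and "context" string columns
annotations = model.transform(df)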