Malay (macrolanguage) BertForQuestionAnswering pipeline indobert_for_eqa_finetuned_pipeline from primasr

Description

Pretrained BertForQuestionAnswering pipeline, adapted from Hugging Face and curated to provide scalability and production-readiness using Spark NLP. indobert_for_eqa_finetuned_pipeline is a Malay (macrolanguage) model originally trained by primasr.


How to use


from sparknlp.pretrained import PretrainedPipeline

pipeline = PretrainedPipeline("indobert_for_eqa_finetuned_pipeline", lang = "ms")
annotations = pipeline.transform(df)


import com.johnsnowlabs.nlp.pretrained.PretrainedPipeline

val pipeline = new PretrainedPipeline("indobert_for_eqa_finetuned_pipeline", lang = "ms")
val annotations = pipeline.transform(df)
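
Because the pipeline pairs a MultiDocumentAssembler with BertForQuestionAnswering, it expects a DataFrame holding question/context pairs. Below is a fuller Python sketch, assuming Spark NLP is installed; the input column names question and context and the answer output column are assumptions based on common Spark NLP question-answering conventions, not confirmed by this card.

import sparknlp
from sparknlp.pretrained import PretrainedPipeline

# Start a Spark session with Spark NLP (assumes the sparknlp package is installed)
spark = sparknlp.start()

# Hypothetical question/context pair in Malay; the assembler is assumed to read
# "question" and "context" columns
df = spark.createDataFrame(
    [("Siapakah yang melatih model ini?",
      "Model indobert_for_eqa_finetuned telah dilatih oleh primasr.")],
    ["question", "context"]
)

pipeline = PretrainedPipeline("indobert_for_eqa_finetuned_pipeline", lang="ms")
annotations = pipeline.transform(df)

# The output column name ("answer") is assumed from Spark NLP QA conventions
annotations.select("answer.result").show(truncate=False)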

Model Information

Model Name: indobert_for_eqa_finetuned_pipeline
Type: pipeline
Compatibility: Spark NLP 5.5.0+
License: Open Source
Edition: Official
Language: ms
Size: 411.7 MB

References

https://huggingface.co/primasr/indobert-for-eqa-finetuned

Included Models

  • MultiDocumentAssembler
  • BertForQuestionAnswering
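
For reference, an equivalent pipeline can be assembled from these two stages manually. The following is a minimal Python sketch; the standalone model name indobert_for_eqa_finetuned and the column names are assumptions inferred from the pipeline name and typical Spark NLP usage, not confirmed by this card.

from pyspark.ml import Pipeline
from sparknlp.base import MultiDocumentAssembler
from sparknlp.annotator import BertForQuestionAnswering

# Assemble question/context pairs into Spark NLP document annotations
document_assembler = MultiDocumentAssembler() \
    .setInputCols(["question", "context"]) \
    .setOutputCols(["document_question", "document_context"])

# Load the underlying QA model (name assumed from the pipeline name)
span_classifier = BertForQuestionAnswering.pretrained("indobert_for_eqa_finetuned", "ms") \
    .setInputCols(["document_question", "document_context"]) \
    .setOutputCol("answer")

qa_pipeline = Pipeline(stages=[document_assembler, span_classifier])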