## Description

Pretrained Named Entity Recognition model, uploaded to Hugging Face, adapted and imported into Spark NLP. `bert-large-uncased-finetuned-ner` is an English model originally trained by Jorgeutd.
## Predicted Entities

`LOC`, `PER`, `ORG`, `MISC`
## How to use

```python
import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import SentenceDetectorDLModel, Tokenizer, BertForTokenClassification
from pyspark.ml import Pipeline

documentAssembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("document")

sentenceDetector = SentenceDetectorDLModel.pretrained("sentence_detector_dl", "xx") \
    .setInputCols(["document"]) \
    .setOutputCol("sentence")

tokenizer = Tokenizer() \
    .setInputCols(["sentence"]) \
    .setOutputCol("token")

tokenClassifier = BertForTokenClassification.pretrained("bert_ner_bert_large_uncased_finetuned_ner", "en") \
    .setInputCols(["sentence", "token"]) \
    .setOutputCol("ner")

pipeline = Pipeline(stages=[documentAssembler, sentenceDetector, tokenizer, tokenClassifier])

data = spark.createDataFrame([["I love Spark NLP"]]).toDF("text")
result = pipeline.fit(data).transform(data)
```
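Once fitted, the per-token predictions land in the `ner` output column. A quick way to inspect them (illustrative only; the exact tags depend on the input text):

```python
# Show each token next to its predicted NER tag
result.select("token.result", "ner.result").show(truncate=False)
```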
```scala
import com.johnsnowlabs.nlp.base.DocumentAssembler
import com.johnsnowlabs.nlp.annotator._
import org.apache.spark.ml.Pipeline
import spark.implicits._

val documentAssembler = new DocumentAssembler()
  .setInputCol("text")
  .setOutputCol("document")

val sentenceDetector = SentenceDetectorDLModel.pretrained("sentence_detector_dl", "xx")
  .setInputCols(Array("document"))
  .setOutputCol("sentence")

val tokenizer = new Tokenizer()
  .setInputCols(Array("sentence"))
  .setOutputCol("token")

val tokenClassifier = BertForTokenClassification.pretrained("bert_ner_bert_large_uncased_finetuned_ner", "en")
  .setInputCols(Array("sentence", "token"))
  .setOutputCol("ner")

val pipeline = new Pipeline().setStages(Array(documentAssembler, sentenceDetector, tokenizer, tokenClassifier))

val data = Seq("I love Spark NLP").toDF("text")
val result = pipeline.fit(data).transform(data)
```
```python
import nlu
nlu.load("en.ner.bert.conll.uncased_large_finetuned").predict("""I love Spark NLP""")
```
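If you need whole entity chunks (e.g. multi-token names) rather than per-token tags, a `NerConverter` stage can be appended to the Spark pipeline above. A minimal sketch, reusing the column names from this card:

```python
from sparknlp.annotator import NerConverter

# Merges consecutive B-/I- token tags into single entity chunks
nerConverter = NerConverter() \
    .setInputCols(["sentence", "token", "ner"]) \
    .setOutputCol("ner_chunk")
```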
## Model Information

| Model Name: | bert_ner_bert_large_uncased_finetuned_ner |
|---|---|
| Compatibility: | Spark NLP 3.4.2+ |
| License: | Open Source |
| Edition: | Official |
| Input Labels: | [document, token] |
| Output Labels: | [ner] |
| Language: | en |
| Size: | 1.3 GB |
| Case sensitive: | true |
| Max sentence length: | 128 |
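Note the 128-token limit above: longer sentences are truncated at inference time. If that matters for your data, the limit is exposed as a parameter on the classifier (shown here with the table's default; verify the setter against your Spark NLP version):

```python
# Tokens beyond this per-sentence limit are truncated (value from the table above)
tokenClassifier.setMaxSentenceLength(128)
```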
## References

- https://huggingface.co/Jorgeutd/bert-large-uncased-finetuned-ner
- https://paperswithcode.com/sota?task=Token+Classification&dataset=conll2003