Hi, if you can convert your PyTorch model to TensorFlow (for example, if you are using Transformers you should be able to save it as a SavedModel), or go through ONNX and onnx-tensorflow, then you can use it easily in Spark NLP via your TF SavedModel. This sounds like a very good topic to discuss: sharing ideas on how to easily fine-tune BERT, ELMo, ALBERT, ELECTRA, and XLNet (which are available in Spark NLP) with libraries like Hugging Face (PyTorch, TensorFlow, or their native Trainer), and save the result as a TF SavedModel at the end.
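For reference, a minimal sketch (not part of the original reply) of one way to do the PyTorch-to-TensorFlow conversion with the Hugging Face Transformers library; the checkpoint directory and output path below are placeholders:

```python
# Minimal sketch: convert a PyTorch fine-tuned BERT checkpoint to a
# TensorFlow SavedModel using Hugging Face Transformers.
# "./my-finetuned-bert" is a hypothetical directory containing the
# PyTorch weights (pytorch_model.bin) and config.json.
from transformers import TFBertForSequenceClassification

# from_pt=True tells Transformers to load the PyTorch weights and
# convert them into the TensorFlow model class.
tf_model = TFBertForSequenceClassification.from_pretrained(
    "./my-finetuned-bert", from_pt=True
)

# saved_model=True additionally exports a TF SavedModel
# (under ./my-finetuned-bert-tf/saved_model/1) alongside the usual TF checkpoint.
tf_model.save_pretrained("./my-finetuned-bert-tf", saved_model=True)
```

The resulting SavedModel can then be loaded with plain TensorFlow, or imported into Spark NLP with its SavedModel import utilities where the corresponding annotator supports it (the exact importer depends on your Spark NLP version).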
-
Is your feature request related to a problem? Please describe.
I fine-tuned a BERT model for tweet classification and I need to load it to classify a stream of tweets.
Does Spark NLP support PyTorch BERT models, or only TensorFlow models? Can we load models that were fine-tuned outside Spark? Please advise whether this is supported by Spark NLP.
Describe the solution you'd like
Import a fine-tuned PyTorch BERT model and use it to classify tweets/text in real time.
Additional context
I am using Spark 2.4.7 and PySpark.