BERT (MLM) model integration with uagent to predict the masked words! #161
Closed
sangramsam
started this conversation in Integrations
Replies: 2 comments
-
Hi @sangramsam, are you ready with the integration? If you need any support, please feel free to reach out and we will help you develop the integration. If you are not working on this any further, we can close this discussion.
-
As there has been no response from your side, we are closing this discussion. If you have further queries or ever want to work on this again, we are happy to guide and support you.
-
Hello Team,
Happy to see my last PR merged. I have been working in parallel on another integration, bert-base-uncased, which is the second most downloaded model on Hugging Face. I am close to finishing and testing this integration and will try to raise a PR as soon as possible.
BERT is a transformer model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on raw texts only, with no humans labelling them in any way; an automatic process generates the inputs and labels from those texts.
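For anyone following along, the masked-word prediction at the core of this integration can be sketched with the Hugging Face `transformers` fill-mask pipeline. The model name comes from this discussion; the example sentence is just an illustration, and the exact wrapping inside the uAgent may differ in the final PR:

```python
from transformers import pipeline

# Load a fill-mask pipeline backed by bert-base-uncased,
# the model proposed in this discussion.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# [MASK] is BERT's mask token; the pipeline returns the top
# candidate tokens for the masked position with their scores.
predictions = fill_mask("The capital of France is [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict with the filled token (`token_str`), its probability (`score`), and the completed sentence (`sequence`); the agent would presumably forward these predictions back to the requesting party.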
I'd appreciate any guidance or feedback on this integration.