# MobileBERT

This model was released on 2020-04-06 and added to Hugging Face Transformers on 2020-11-16.
MobileBERT is a lightweight and efficient variant of BERT, specifically designed for resource-limited devices such as mobile phones. It retains BERT's architecture but significantly reduces model size and inference latency while maintaining strong performance on NLP tasks. MobileBERT achieves this through a bottleneck structure and carefully balanced self-attention and feedforward networks. The model is trained by knowledge transfer from a large BERT model with an inverted bottleneck structure.
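The bottleneck idea can be pictured as follows: each layer projects the wide hidden states down to a narrow intra-block width, runs self-attention and the feedforward network at that reduced width, then projects back up. The sketch below is purely illustrative, not the actual `MobileBertModel` implementation; the layer names and sizes are hypothetical, and the real model additionally uses stacked feedforward networks and other tricks such as a simplified normalization.

```python
import torch
from torch import nn

class BottleneckLayer(nn.Module):
    """Illustrative bottleneck transformer layer (not the real MobileBERT code):
    down-project, run attention + FFN at the narrow width, up-project back."""

    def __init__(self, hidden_size=512, bottleneck_size=128, num_heads=4):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)  # input bottleneck
        self.attention = nn.MultiheadAttention(
            bottleneck_size, num_heads, batch_first=True
        )
        self.ffn = nn.Sequential(
            nn.Linear(bottleneck_size, bottleneck_size * 4),
            nn.ReLU(),
            nn.Linear(bottleneck_size * 4, bottleneck_size),
        )
        self.up = nn.Linear(bottleneck_size, hidden_size)  # output bottleneck

    def forward(self, x):
        h = self.down(x)
        attn_out, _ = self.attention(h, h, h)
        h = h + attn_out
        h = h + self.ffn(h)
        return x + self.up(h)  # residual at the full hidden width

x = torch.randn(2, 16, 512)  # (batch, seq_len, hidden)
layer = BottleneckLayer()
print(layer(x).shape)
```

Because attention and the feedforward blocks operate at the narrow width, most parameters and compute live in the cheap down/up projections and the small inner layers, which is where the size and latency savings come from.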
You can find the original MobileBERT checkpoint under the [Google](https://huggingface.co/google) organization.
> [!TIP]
> Click on the MobileBERT models in the right sidebar for more examples of how to apply MobileBERT to different language tasks.
The example below demonstrates how to predict the `[MASK]` token with [`Pipeline`], [`AutoModel`], and from the command line.
```python
from transformers import pipeline

pipeline = pipeline(
    task="fill-mask",
    model="google/mobilebert-uncased",
    device=0
)
pipeline("The capital of France is [MASK].")
```
```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "google/mobilebert-uncased",
)
model = AutoModelForMaskedLM.from_pretrained(
    "google/mobilebert-uncased",
    device_map="auto",
)

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model(**inputs)
predictions = outputs.logits

masked_index = torch.where(inputs["input_ids"] == tokenizer.mask_token_id)[1]
predicted_token_id = predictions[0, masked_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_token_id)

print(f"The predicted token is: {predicted_token}")
```
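Beyond the single best token, you can rank alternatives at the masked position with `torch.topk` on the logits. The sketch below uses a tiny stand-in logits tensor (the vocabulary size and values are made up; real logits come from `outputs.logits` as in the example above):

```python
import torch

# stand-in logits for one masked position over a toy 8-token vocabulary
logits = torch.tensor([0.1, 2.5, -1.0, 0.3, 1.7, 0.0, -0.5, 0.9])

topk = torch.topk(logits, k=3)
for score, token_id in zip(topk.values, topk.indices):
    print(f"token_id={token_id.item()} score={score.item():.2f}")
```

With a real tokenizer, each `token_id` would be decoded with `tokenizer.decode` to inspect the top candidate fills.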
## MobileBertConfig

[[autodoc]] MobileBertConfig

## MobileBertTokenizer

[[autodoc]] MobileBertTokenizer

## MobileBertTokenizerFast

[[autodoc]] MobileBertTokenizerFast

## MobileBert specific outputs

[[autodoc]] models.mobilebert.modeling_mobilebert.MobileBertForPreTrainingOutput

## MobileBertModel

[[autodoc]] MobileBertModel
    - forward

## MobileBertForPreTraining

[[autodoc]] MobileBertForPreTraining
    - forward

## MobileBertForMaskedLM

[[autodoc]] MobileBertForMaskedLM
    - forward

## MobileBertForNextSentencePrediction

[[autodoc]] MobileBertForNextSentencePrediction
    - forward

## MobileBertForSequenceClassification

[[autodoc]] MobileBertForSequenceClassification
    - forward

## MobileBertForMultipleChoice

[[autodoc]] MobileBertForMultipleChoice
    - forward

## MobileBertForTokenClassification

[[autodoc]] MobileBertForTokenClassification
    - forward

## MobileBertForQuestionAnswering

[[autodoc]] MobileBertForQuestionAnswering
    - forward