

Python BertForSequenceClassification.from_pretrained Method Code Examples

This article collects typical usage examples of the Python method transformers.BertForSequenceClassification.from_pretrained. If you are unsure what this method does or how to call it, the curated examples below should help. You can also explore further usage examples from transformers.BertForSequenceClassification.


The following presents 5 code examples of BertForSequenceClassification.from_pretrained, ordered by popularity by default.
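Before the examples, here is a minimal, self-contained sketch of the method itself. The model name "bert-base-uncased" and num_labels=2 are placeholder choices, and accessing outputs.logits assumes a reasonably recent transformers version (older releases return plain tuples):

import torch
from transformers import BertForSequenceClassification, BertTokenizer

# Load a tokenizer and a BERT model with a sequence-classification head.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.eval()

# Tokenize one sentence and run a forward pass without gradients.
inputs = tokenizer("This movie was great!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.logits has shape (batch_size, num_labels).
predicted_class = outputs.logits.argmax(dim=-1).item()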

Example 1: set_classifier

# Required import: from transformers import BertForSequenceClassification
# Method used: BertForSequenceClassification.from_pretrained(...)
def set_classifier(self, classifier_type=CLASSIFIER_TYPE_RNN, classifier=None):
        """ Set the classifier from prepackaged option or provide custom classifier

        :param classifier_type: One of ['BERT', 'RNN', 'BERT_RNN']
        :type: str
        :param classifier: Custom provided classifier
        :type: Any
        """
        if classifier_type == CLASSIFIER_TYPE_RNN or classifier_type == CLASSIFIER_TYPE_BERT_RNN:
            self.classifier = ClassifierModule(self.model_config, self.preprocessor.word_vocab)
        elif classifier_type == CLASSIFIER_TYPE_BERT:
            self.classifier = BertForSequenceClassification.from_pretrained(
                "bert-base-uncased",
                num_labels=self.model_config.num_labels,
                output_hidden_states=False,
                output_attentions=False,
            )
        else:
            self.classifier = classifier 
Developer: interpretml, Project: interpret-text, Lines: 21, Source: explainer.py

Example 2: set_anti_classifier

# Required import: from transformers import BertForSequenceClassification
# Method used: BertForSequenceClassification.from_pretrained(...)
def set_anti_classifier(self, classifier_type=CLASSIFIER_TYPE_RNN, anti_classifier=None):
        """ Set anti classifier from prepackaged option or provide custom anti classifier

        :param classifier_type: One of ['BERT', 'RNN', 'BERT_RNN']
        :type str
        :param anti_classifier: Custom provided anti classifier
        :type Any
        """
        if classifier_type == CLASSIFIER_TYPE_RNN or classifier_type == CLASSIFIER_TYPE_BERT_RNN:
            self.anti_classifier = ClassifierModule(self.model_config, self.preprocessor.word_vocab)

        elif classifier_type == CLASSIFIER_TYPE_BERT:
            self.anti_classifier = BertForSequenceClassification.from_pretrained(
                "bert-base-uncased",
                num_labels=self.model_config.num_labels,
                output_hidden_states=False,
                output_attentions=False,
            )
        else:
            self.anti_classifier = anti_classifier 
Developer: interpretml, Project: interpret-text, Lines: 22, Source: explainer.py

Example 3: set_generator_classifier

# Required import: from transformers import BertForSequenceClassification
# Method used: BertForSequenceClassification.from_pretrained(...)
def set_generator_classifier(self, classifier_type=CLASSIFIER_TYPE_RNN, generator_classifier=None):
        """ Set classifier for the Generator

        :param classifier_type: One of ['BERT', 'RNN', 'BERT_RNN']
        :type classifier_type: str
        :param generator_classifier: Custom provided classifier for generator
        :type generator_classifier: Any
        :return: Any
        """
        if classifier_type == CLASSIFIER_TYPE_RNN:
            self.generator_classifier = ClassifierModule(self.model_config, self.preprocessor.word_vocab)
        elif classifier_type == CLASSIFIER_TYPE_BERT or classifier_type == CLASSIFIER_TYPE_BERT_RNN:
            self.generator_classifier = BertForSequenceClassification.from_pretrained(
                "bert-base-uncased",
                num_labels=self.model_config.num_labels,
                output_hidden_states=True,
                output_attentions=True,
            )
        else:
            self.generator_classifier = generator_classifier 
Developer: interpretml, Project: interpret-text, Lines: 22, Source: explainer.py
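Unlike the two classifiers above, the generator classifier is created with output_hidden_states=True and output_attentions=True, so its forward pass also exposes intermediate representations. Below is a minimal sketch of what those flags return; the model name and num_labels=2 are placeholders, and the attribute access assumes a recent transformers version:

import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,
    output_hidden_states=True,
    output_attentions=True,
)
model.eval()

inputs = tokenizer("An example sentence.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# hidden_states: tuple of (num_layers + 1) tensors, each (batch, seq_len, hidden_size)
# attentions:    tuple of num_layers tensors, each (batch, num_heads, seq_len, seq_len)
print(len(outputs.hidden_states), outputs.hidden_states[0].shape)
print(len(outputs.attentions), outputs.attentions[0].shape)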

Example 4: __init__

# Required import: from transformers import BertForSequenceClassification
# Method used: BertForSequenceClassification.from_pretrained(...)
def __init__(self, cache_dir=DEFAULT_CACHE_DIR, verbose=False):
        from transformers import BertTokenizer, BertForSequenceClassification

        # download the model or load the model path
        path_emotion = download_model('bert.emotion', cache_dir,
                                      process_func=_unzip_process_func,
                                      verbose=verbose)
        path_emotion = os.path.join(path_emotion, 'bert.emotion')
        path_reject = download_model('bert.noemotion', cache_dir,
                                     process_func=_unzip_process_func,
                                     verbose=verbose)
        path_reject = os.path.join(path_reject, 'bert.noemotion')
        # load the models
        self.tokenizer_reject = BertTokenizer.from_pretrained(path_reject)
        self.model_reject = BertForSequenceClassification.from_pretrained(path_reject)
        
        self.tokenizer = BertTokenizer.from_pretrained(path_emotion)
        self.model = BertForSequenceClassification.from_pretrained(path_emotion)
        
        # load the class names mapping
        self.categories = {5: 'Foragt/Modvilje', 2: 'Forventning/Interrese',
                           0: 'Glæde/Sindsro', 3: 'Overasket/Målløs',
                           1: 'Tillid/Accept',
                           4: 'Vrede/Irritation', 6: 'Sorg/trist',
                           7: 'Frygt/Bekymret'} 
Developer: alexandrainst, Project: danlp, Lines: 27, Source: bert_models.py
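The constructor above loads two models: a "noemotion" gate and an emotion classifier. Here is a hedged sketch of how they might be chained at inference time. classify_emotion is a hypothetical helper, not part of danlp's API, and which logit index of the gate model means "no emotion" is an assumption to verify against the model's label mapping:

import torch

def classify_emotion(detector, text):
    # Gate on the "noemotion" model first: index 1 meaning "no emotion"
    # is an assumption; check the model's config/label mapping.
    inputs = detector.tokenizer_reject(text, return_tensors="pt")
    with torch.no_grad():
        gate = detector.model_reject(**inputs).logits.argmax(dim=-1).item()
    if gate == 1:
        return "No emotion"

    # Otherwise map the emotion model's prediction to a category name.
    inputs = detector.tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        pred = detector.model(**inputs).logits.argmax(dim=-1).item()
    return detector.categories[pred]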

Example 5: __init__

# Required imports: import torch.nn as nn
#                   from transformers import BertModel, BertTokenizer
def __init__(self, pretrain_path, max_length, cat_entity_rep=False):
        nn.Module.__init__(self)
        # Backbone BERT encoder, loaded from a local path or a hub checkpoint.
        self.bert = BertModel.from_pretrained(pretrain_path)
        self.max_length = max_length
        self.tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
        # Whether to concatenate head/tail entity representations downstream.
        self.cat_entity_rep = cat_entity_rep
Developer: thunlp, Project: FewRel, Lines: 8, Source: sentence_encoder.py
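The snippet above only shows the constructor. Below is a self-contained sketch of the same pattern, not FewRel's actual forward pass: pooling the [CLS] token is a common default and an assumption here, and outputs.last_hidden_state assumes a recent transformers version:

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class SentenceEncoder(nn.Module):
    def __init__(self, pretrain_path, max_length=128):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrain_path)
        self.max_length = max_length
        self.tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

    def forward(self, text):
        inputs = self.tokenizer(
            text, max_length=self.max_length, truncation=True,
            padding="max_length", return_tensors="pt",
        )
        outputs = self.bert(**inputs)
        # Use the final hidden state of the [CLS] token as the sentence vector.
        return outputs.last_hidden_state[:, 0]

encoder = SentenceEncoder("bert-base-uncased")
vec = encoder("BERT encodes this sentence.")  # shape: (1, hidden_size)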


Note: The transformers.BertForSequenceClassification.from_pretrained examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets are drawn from community open-source projects; copyright remains with the original authors, and distribution or reuse should follow each project's license. Do not republish without permission.