

Python KaggleWord2VecUtility.sku_to_sentences Method Code Example

This article collects typical usage examples of the Python method KaggleWord2VecUtility.KaggleWord2VecUtility.sku_to_sentences. If you have been wondering what exactly KaggleWord2VecUtility.sku_to_sentences does and how to use it, the curated example below should help. You can also browse further usage examples of its containing class, KaggleWord2VecUtility.KaggleWord2VecUtility.


One code example of the KaggleWord2VecUtility.sku_to_sentences method is shown below; examples are sorted by popularity by default.

Example 1:

# Required import: from KaggleWord2VecUtility import KaggleWord2VecUtility [as alias]
# Or: from KaggleWord2VecUtility.KaggleWord2VecUtility import sku_to_sentences [as alias]
# (The excerpt also relies on import logging, import nltk.data, and a pandas
# DataFrame `train` loaded earlier in the source file.)
    print("Read %d labeled train SKUs" % train["product_title"].size)



    # Load the punkt sentence tokenizer (requires NLTK's "punkt" data;
    # fetch it once with nltk.download('punkt') if it is missing)
    tokenizer = nltk.data.load('tokenizers/punkt/english.pickle')



    # ****** Split the labeled training set into clean sentences
    #
    sentences = []  # Initialize an empty list of sentences

    print "Parsing sentences from training set"
    for sku in train["product_title"]:
        sentences += KaggleWord2VecUtility.sku_to_sentences(sku, tokenizer)

    # ****** Set parameters and train the word2vec model
    #
    # Import the built-in logging module and configure it so that Word2Vec
    # creates nice output messages
    logging.basicConfig(format='%(asctime)s : %(levelname)s : %(message)s',
                        level=logging.INFO)

    # Set values for various parameters
    num_features = 300    # Word vector dimensionality
    min_word_count = 40   # Minimum word count
    num_workers = 4       # Number of threads to run in parallel
    context = 10          # Context window size
    downsampling = 1e-3   # Downsample setting for frequent words
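
    # --- Not part of the original excerpt: the heading above promises a
    # training step; with the gensim API of that era, the continuation was
    # plausibly along these lines (see the runnable sketch after the example):
    #
    # model = word2vec.Word2Vec(sentences, workers=num_workers,
    #     size=num_features, min_count=min_word_count,
    #     window=context, sample=downsampling)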
Developer ID: aviralmathur, Project: Word2Vec, Lines of code: 32, Source file: Word2Vec_AverageVectors.py
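
The excerpt above stops right after setting the hyperparameters and never shows the body of sku_to_sentences itself. For orientation only, here is a minimal, self-contained sketch of what the helper and the promised training step might look like. It is modeled on the review_to_sentences / review_to_wordlist pattern from the Kaggle "Bag of Words Meets Bags of Popcorn" tutorial that KaggleWord2VecUtility-style classes derive from; the function bodies, the toy titles, and the training call are assumptions, not code from the aviralmathur/Word2Vec project.

import re
import logging

import nltk.data
from gensim.models import word2vec


def sku_to_wordlist(sku):
    # Keep letters only, lowercase, and split into tokens
    # (assumed cleaning step, mirroring the tutorial's review_to_wordlist).
    letters_only = re.sub("[^a-zA-Z]", " ", sku)
    return letters_only.lower().split()


def sku_to_sentences(sku, tokenizer):
    # Split a product title into sentences, then each non-empty sentence
    # into a cleaned word list. In the real project this is presumably a
    # static method on the KaggleWord2VecUtility class.
    sentences = []
    for raw_sentence in tokenizer.tokenize(sku.strip()):
        if len(raw_sentence) > 0:
            sentences.append(sku_to_wordlist(raw_sentence))
    return sentences


if __name__ == "__main__":
    logging.basicConfig(format='%(asctime)s : %(levelname)s : %(message)s',
                        level=logging.INFO)

    # Requires the punkt data: nltk.download('punkt')
    tokenizer = nltk.data.load('tokenizers/punkt/english.pickle')

    # Hypothetical stand-ins for train["product_title"].
    titles = ["Stainless Steel Water Bottle, 24 oz. Keeps water cold.",
              "Wireless Optical Mouse. Includes USB receiver."]
    sentences = []
    for sku in titles:
        sentences += sku_to_sentences(sku, tokenizer)

    # Same hyperparameters as the excerpt, except min_count is lowered to 1
    # so the toy corpus is not filtered away; gensim 4 renamed the old
    # ``size`` parameter to ``vector_size``.
    model = word2vec.Word2Vec(sentences, workers=4, vector_size=300,
                              min_count=1, window=10, sample=1e-3)
    print(model.wv.most_similar("water", topn=1))

The toy corpus yields meaningless vectors, but the pipeline shape (tokenizer, per-title sentence lists, one flat sentences list, then Word2Vec) matches the excerpt above.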


Note: the KaggleWord2VecUtility.KaggleWord2VecUtility.sku_to_sentences examples in this article were collected by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by the community; copyright remains with the original authors, and any redistribution or use must follow the corresponding project's license. Do not reproduce without permission.