This article collects typical usage examples of the Python method gensim.models.ldamodel.LdaModel.log_perplexity. If you have been wondering what LdaModel.log_perplexity does and how to use it, the curated code example below may help. You can also read more about the class it belongs to, gensim.models.ldamodel.LdaModel.
One code example of LdaModel.log_perplexity is shown below; examples are sorted by popularity by default. You can upvote the examples you like or find useful, and your feedback helps the system recommend better Python code examples.
Example 1: LdaModel
# Required module: from gensim.models.ldamodel import LdaModel [as alias]
# Or: from gensim.models.ldamodel.LdaModel import log_perplexity [as alias]
import numpy as np
from gensim.models.ldamodel import LdaModel

# Identifier for this run, built from the hyperparameters
run_id = "ldaU_K{K}_a{alpha_frac}-K_b{beta}_iter{iter}.gensim".format(
    K=num_topics, alpha_frac=alpha_frac, beta=beta, iter=num_iterations)
print(run_id)
output_file = output_file_template.format(run_id=run_id)

# Train and save
print('Training...')
model = LdaModel(corpus,
                 alpha=alpha, eta=beta,
                 id2word=dictionary, num_topics=num_topics,
                 iterations=num_iterations)
# model = LdaMulticore(corpus,
#                      alpha=alpha, eta=beta,
#                      id2word=dictionary, num_topics=num_topics,
#                      iterations=num_iterations, workers=2)
print('Done training.')
model.save(output_file)

# Print the top 10 words of each topic, if desired
if print_topics:
    topics = model.show_topics(num_topics=100, formatted=False)
    for topic in topics:
        for word, prob in topic[1]:
            print(word + ": " + str(prob))
        print('\n')

# Evaluate held-out perplexity: log_perplexity returns the per-word
# variational bound, and perplexity = 2^(-bound)
ll = model.log_perplexity(test_corpus)
print("LL: " + str(ll))
print("Perp: " + str(np.exp2(-ll)))