This article collects typical usage examples of the data_utils.read_names method in Python. If you have been wondering how data_utils.read_names is used in practice, the curated code samples below may help. You can also explore the data_utils module further for related usage.
Below are 2 code examples of data_utils.read_names, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Python code examples.
Example 1: train
# Required import: import data_utils
# Or: from data_utils import read_names
def train(data_dir, checkpoint_path, config):
    """Trains the model with the given data.

    Args:
        data_dir: path to the data for the model (see data_utils for the data
            format)
        checkpoint_path: the path to save the trained model checkpoints
        config: one of the above configs that specifies the model and how it
            should be run and trained

    Returns:
        None
    """
    # Prepare name data.
    print("Reading name data in %s" % data_dir)
    names, counts = data_utils.read_names(data_dir)

    with tf.Graph().as_default(), tf.Session() as session:
        initializer = tf.random_uniform_initializer(-config.init_scale,
                                                    config.init_scale)
        with tf.variable_scope("model", reuse=None, initializer=initializer):
            m = NamignizerModel(is_training=True, config=config)

        tf.global_variables_initializer().run()

        for i in range(config.max_max_epoch):
            lr_decay = config.lr_decay ** max(i - config.max_epoch, 0.0)
            m.assign_lr(session, config.learning_rate * lr_decay)

            print("Epoch: %d Learning rate: %.3f" % (i + 1, session.run(m.lr)))
            train_perplexity = run_epoch(session, m, names, counts,
                                         config.epoch_size, m.train_op,
                                         verbose=True)
            print("Epoch: %d Train Perplexity: %.3f" % (i + 1, train_perplexity))

            m.saver.save(session, checkpoint_path, global_step=i)
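The learning-rate schedule in the loop above holds the rate at its base value for the first config.max_epoch epochs, then decays it geometrically by config.lr_decay for each epoch after that. A self-contained sketch of that schedule (the config values here are illustrative, not taken from the repo's configs):

```python
# Geometric learning-rate decay, as used in the training loop above:
# flat for the first max_epoch epochs, then multiplied by lr_decay
# once per additional epoch.
def decayed_lr(base_lr, lr_decay, max_epoch, epoch):
    return base_lr * lr_decay ** max(epoch - max_epoch, 0.0)

# Illustrative values only:
base_lr, lr_decay, max_epoch = 1.0, 0.5, 4
schedule = [decayed_lr(base_lr, lr_decay, max_epoch, i) for i in range(7)]
print(schedule)  # [1.0, 1.0, 1.0, 1.0, 1.0, 0.5, 0.25]
```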
Example 2: train
# Required import: import data_utils
# Or: from data_utils import read_names
def train(data_dir, checkpoint_path, config):
    """Trains the model with the given data.

    Args:
        data_dir: path to the data for the model (see data_utils for the data
            format)
        checkpoint_path: the path to save the trained model checkpoints
        config: one of the above configs that specifies the model and how it
            should be run and trained

    Returns:
        None
    """
    # Prepare name data.
    print("Reading name data in %s" % data_dir)
    names, counts = data_utils.read_names(data_dir)

    with tf.Graph().as_default(), tf.Session() as session:
        initializer = tf.random_uniform_initializer(-config.init_scale,
                                                    config.init_scale)
        with tf.variable_scope("model", reuse=None, initializer=initializer):
            m = NamignizerModel(is_training=True, config=config)

        # tf.initialize_all_variables() is deprecated; use its replacement.
        tf.global_variables_initializer().run()

        for i in range(config.max_max_epoch):
            lr_decay = config.lr_decay ** max(i - config.max_epoch, 0.0)
            m.assign_lr(session, config.learning_rate * lr_decay)

            print("Epoch: %d Learning rate: %.3f" % (i + 1, session.run(m.lr)))
            train_perplexity = run_epoch(session, m, names, counts,
                                         config.epoch_size, m.train_op,
                                         verbose=True)
            print("Epoch: %d Train Perplexity: %.3f" % (i + 1, train_perplexity))

            m.saver.save(session, checkpoint_path, global_step=i)
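run_epoch itself is not shown in either example. For character-level language models such as this one, the reported train perplexity is conventionally the exponential of the mean cross-entropy loss; a hedged sketch of that relationship, with made-up loss values for illustration only:

```python
import math

def perplexity(losses):
    """Perplexity as exp of the mean cross-entropy loss (in nats)."""
    return math.exp(sum(losses) / len(losses))

# Made-up per-step losses, for illustration only:
losses = [2.3, 2.1, 1.9]
print("Train Perplexity: %.3f" % perplexity(losses))
```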