This article collects typical usage examples of the Python method tensorflow.enable_resource_variables. If you are wondering what tensorflow.enable_resource_variables does, how to call it, or where to find usage examples, the curated code samples below may help. You can also explore further examples from the module this method belongs to, tensorflow.
Shown below are the 2 code examples of tensorflow.enable_resource_variables collected here, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Python code examples.
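Before the examples, here is a minimal standalone sketch of the call, assuming TensorFlow 1.x (under TensorFlow 2.x the same function lives at tf.compat.v1.enable_resource_variables and resource variables are already the default). The variable name and shape are purely illustrative:

import tensorflow as tf

# Enabling resource variables is a process-wide switch and only affects
# variables created afterwards, so call it before building the graph.
tf.enable_resource_variables()

with tf.Graph().as_default():
  v = tf.get_variable("v", shape=[2, 2], initializer=tf.zeros_initializer())
  print(type(v).__name__)  # ResourceVariable rather than the legacy RefVariable
  with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(v))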
Example 1: __init__
# Required module: import tensorflow [as alias]
# Or: from tensorflow import enable_resource_variables [as alias]
def __init__(self, ncf_dataset, params, num_train_steps, num_eval_steps,
             use_while_loop):
  self._num_train_steps = num_train_steps
  self._num_eval_steps = num_eval_steps
  self._use_while_loop = use_while_loop
  with tf.Graph().as_default() as self._graph:
    if params["use_xla_for_gpu"]:
      # The XLA functions we use require resource variables, so enable them
      # before any variables are created.
      tf.enable_resource_variables()
    self._ncf_dataset = ncf_dataset
    self._global_step = tf.train.create_global_step()
    self._train_model_properties = self._build_model(params, num_train_steps,
                                                     is_training=True)
    self._eval_model_properties = self._build_model(params, num_eval_steps,
                                                    is_training=False)
    initializer = tf.global_variables_initializer()
  # Finalize the graph so nothing can be added after construction, then
  # create the session and run variable initialization.
  self._graph.finalize()
  self._session = tf.Session(graph=self._graph)
  self._session.run(initializer)
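As a usage note for Example 1: tf.enable_resource_variables() is placed inside the tf.Graph().as_default() block but before _build_model is called, so every variable the model creates becomes a resource variable, which the XLA-on-GPU code path requires. If you want to assert that the switch is active before building anything, a small guard could look like the sketch below (assuming the tf.compat.v1 namespace, where the resource_variables_enabled query is exposed):

import tensorflow.compat.v1 as tf

# Turn the switch on only if it is not already active, then verify it.
if not tf.resource_variables_enabled():
  tf.enable_resource_variables()
assert tf.resource_variables_enabled()
# Safe to build the model now; variables created from here on are resource variables.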
Example 2: main
# Required module: import tensorflow [as alias]
# Or: from tensorflow import enable_resource_variables [as alias]
def main(argv):
  del argv  # Unused.

  # If using update_damping_immediately, resource variables must be enabled
  # (although they probably will be by default on TPUs).
  if FLAGS.update_damping_immediately:
    tf.enable_resource_variables()

  tf.set_random_seed(FLAGS.seed)

  # Invert using Cholesky decomposition + triangular solve. This is the only
  # code path for matrix inversion supported on TPU right now.
  kfac.utils.set_global_constants(posdef_inv_method='cholesky')
  kfac.fisher_factors.set_global_constants(
      eigenvalue_decomposition_threshold=10000)

  if not FLAGS.use_sua_approx:
    if FLAGS.use_custom_patches_op:
      kfac.fisher_factors.set_global_constants(
          use_patches_second_moment_op=True)
    else:
      # Temporary measure to save memory with giant batches:
      kfac.fisher_factors.set_global_constants(
          sub_sample_inputs=True,
          inputs_to_extract_patches_factor=0.1)

  config = make_tpu_run_config(
      FLAGS.master, FLAGS.seed, FLAGS.model_dir, FLAGS.iterations_per_loop,
      FLAGS.save_checkpoints_steps)

  estimator = contrib_tpu.TPUEstimator(
      use_tpu=True,
      model_fn=_model_fn,
      config=config,
      train_batch_size=FLAGS.batch_size,
      eval_batch_size=1024)

  estimator.train(
      input_fn=mnist_input_fn,
      max_steps=FLAGS.train_steps,
      hooks=[])
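Both examples target the TensorFlow 1.x API. Under TensorFlow 2.x, resource variables are on by default and tf.enable_resource_variables is only reachable through the compatibility module, so graph-mode code like the above is typically adapted along these lines (a minimal sketch, assuming the tf.compat.v1 shim):

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()    # keep the graph/session workflow used above
tf.enable_resource_variables()  # idempotent; resource variables may already be on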