

Python backprop.implicit_val_and_grad Function Code Examples

This article collects typical usage examples of the Python function tensorflow.python.eager.backprop.implicit_val_and_grad. If you are wondering what implicit_val_and_grad does, how to call it, or what working code looks like, the selected examples below may help.


The following shows 3 code examples of the implicit_val_and_grad function, ordered by popularity.
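Before the examples, it may help to see the call pattern: implicit_val_and_grad(f) wraps f and returns a function that yields a (loss, grads_and_vars) pair, where grads_and_vars lists the gradient for each trainable variable f touched. The following TensorFlow-free sketch mimics that shape with a hand-written gradient; the helper names (FakeVariable, implicit_val_and_grad_sketch) are hypothetical illustrations, not TensorFlow API:

```python
class FakeVariable:
    """Stand-in for a trainable variable holding a scalar value."""

    def __init__(self, value):
        self.value = value

    def assign_sub(self, delta):
        self.value -= delta


def implicit_val_and_grad_sketch(loss_fn, grad_fn):
    """Pair a loss with a hand-written gradient, mimicking the
    (value, [(grad, var), ...]) return shape of implicit_val_and_grad."""
    def wrapped(var):
        return loss_fn(var), [(grad_fn(var), var)]
    return wrapped


# The loss 2*x has gradient 2 with respect to x, matching Example 2 below.
x = FakeVariable(2.0)
loss_grads_fn = implicit_val_and_grad_sketch(
    lambda v: 2.0 * v.value, lambda v: 2.0)

lr = 0.25
losses = []
for _ in range(5):
    loss, grads_and_vars = loss_grads_fn(x)
    losses.append(loss)
    for grad, var in grads_and_vars:
        var.assign_sub(lr * grad)  # plain gradient-descent update

print(losses)  # [4.0, 3.0, 2.0, 1.0, 0.0]
```

The training loop here is the same shape as the real one in Example 2, which uses an actual ResourceVariable and TensorFlow's autodiff.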

Example 1: testGraphModeEagerGradError

  def testGraphModeEagerGradError(self):
    with context.graph_mode():

      def f():
        x = variable_scope.get_variable(
            'v', initializer=constant_op.constant(1.0))
        return x * constant_op.constant(2.0)

      with self.assertRaisesRegexp(ValueError,
                                   'No trainable variables were accessed'):
        backprop.implicit_val_and_grad(f)()
Developer ID: dananjayamahesh, Project: tensorflow, Lines: 9, Source: function_test.py

Example 2: testCustomGradient

  def testCustomGradient(self):

    @custom_gradient.custom_gradient
    def my_mul(x, y):
      result = x * y

      def grad(dr):
        # Gradients of x*y with respect to x and y, scaled by the
        # incoming gradient dr.
        return [dr * y, dr * x]

      return result, grad

    lr = 0.25
    x = resource_variable_ops.ResourceVariable(2., name='x')

    def loss(x):
      return my_mul(2., x.read_value())

    loss_grads_fn = backprop.implicit_val_and_grad(loss)

    losses = []
    for _ in range(5):
      # Each call returns the loss value plus (gradient, variable) pairs
      # for every trainable variable accessed while computing the loss.
      loss, grads_and_vars = loss_grads_fn(x)
      losses.append(loss.numpy())
      for (grad, var) in grads_and_vars:
        var.assign_sub(lr * grad)
    self.assertAllEqual(losses, [4.0, 3., 2., 1., 0.])
Developer ID: Wajih-O, Project: tensorflow, Lines: 25, Source: backprop_test.py

Example 3: testDifferentiatingFunctionThatReturnsNone

  def testDifferentiatingFunctionThatReturnsNone(self):

    def fn(x, y):
      result = x*y  # pylint: disable=unused-variable

    x = constant_op.constant(1)
    y = constant_op.constant(2)

    loss_grads_fn = backprop.implicit_val_and_grad(fn)
    with self.assertRaisesRegexp(
        ValueError, 'Cannot differentiate a function that returns None; '
        'did you forget to return a value from fn?'):
      loss_grads_fn(x, y)

    val_and_grads_fn = backprop.val_and_grad_function(fn)
    with self.assertRaisesRegexp(
        ValueError, 'Cannot differentiate a function that returns None; '
        'did you forget to return a value from fn?'):
      val_and_grads_fn(x, y)
Developer ID: Wajih-O, Project: tensorflow, Lines: 19, Source: backprop_test.py
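Example 3 exercises both APIs: implicit_val_and_grad differentiates with respect to trainable variables accessed inside the function, while val_and_grad_function differentiates with respect to the function's arguments. The following TensorFlow-free sketch illustrates the explicit variant's shape using central finite differences; val_and_grad_sketch is a hypothetical name, and the numerical differencing is only an analogy for TensorFlow's real autodiff:

```python
def val_and_grad_sketch(fn, eps=1e-6):
    """Mimic the shape of val_and_grad_function: the wrapped function
    returns (value, gradients-with-respect-to-each-argument)."""
    def wrapped(*args):
        value = fn(*args)
        if value is None:
            # Mirror the error raised in Example 3 above.
            raise ValueError(
                'Cannot differentiate a function that returns None; '
                'did you forget to return a value from fn?')
        grads = []
        for i in range(len(args)):
            up, down = list(args), list(args)
            up[i] += eps
            down[i] -= eps
            # Central difference approximates the partial derivative.
            grads.append((fn(*up) - fn(*down)) / (2 * eps))
        return value, grads
    return wrapped


# For f(x, y) = x*y at (1, 2): value 2, d/dx = y = 2, d/dy = x = 1.
val, (dx, dy) = val_and_grad_sketch(lambda x, y: x * y)(1.0, 2.0)
print(val, dx, dy)
```

Like the real API, the sketch rejects a function that returns None, since there is no value to differentiate.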


Note: The tensorflow.python.eager.backprop.implicit_val_and_grad examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their authors; copyright remains with the original authors, and you should consult each project's License before distributing or using the code. Do not reproduce this article without permission.