

Python gen_nn_ops._softmax_cross_entropy_with_logits Function Code Examples

This article collects typical usage examples of the Python function tensorflow.python.ops.gen_nn_ops._softmax_cross_entropy_with_logits. If you are wondering what _softmax_cross_entropy_with_logits does, how to call it, or how it is used in practice, the curated examples below should help.


Five code examples of the _softmax_cross_entropy_with_logits function are shown below, sorted by popularity by default.

Example 1: softmax_cross_entropy_with_logits

from tensorflow.python.ops import gen_nn_ops


def softmax_cross_entropy_with_logits(logits, labels, name=None):
  """Computes softmax cross entropy between `logits` and `labels`.

  Measures the probability error in discrete classification tasks in which the
  classes are mutually exclusive (each entry is in exactly one class).  For
  example, each CIFAR-10 image is labeled with one and only one label: an image
  can be a dog or a truck, but not both.

  **NOTE:**  While the classes are mutually exclusive, their probabilities
  need not be.  All that is required is that each row of `labels` is
  a valid probability distribution.  If using exclusive `labels`
  (wherein one and only one class is true at a time), see
  `sparse_softmax_cross_entropy_with_logits`.

  **WARNING:** This op expects unscaled logits, since it performs a `softmax`
  on `logits` internally for efficiency.  Do not call this op with the
  output of `softmax`, as it will produce incorrect results.

  `logits` and `labels` must have the same shape `[batch_size, num_classes]`
  and the same dtype (either `float32` or `float64`).

  Args:
    logits: Unscaled log probabilities.
    labels: Each row `labels[i]` must be a valid probability distribution.
    name: A name for the operation (optional).

  Returns:
    A 1-D `Tensor` of length `batch_size` of the same type as `logits` with the
    softmax cross entropy loss.
  """
  # The second output tensor contains the gradients.  We use it in
  # _CrossEntropyGrad() in nn_grad but not here.
  cost, unused_backprop = gen_nn_ops._softmax_cross_entropy_with_logits(
      logits, labels, name=name)
  return cost
Developer: surround-io, Project: tensorflow, Lines: 34, Source: nn_ops.py
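To make the docstring's semantics concrete, here is a minimal NumPy sketch of the per-row loss this op computes; it is not part of nn_ops.py, and the input values are purely illustrative. It mirrors the textbook formula loss[i] = -sum_j(labels[i, j] * log(softmax(logits)[i, j])):

import numpy as np

logits = np.array([[2.0, 1.0, 0.1]])   # unscaled log probabilities
labels = np.array([[1.0, 0.0, 0.0]])   # each row is a valid probability distribution

# Numerically stable softmax over the class dimension.
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)

# Per-example cross entropy: one loss value per batch row.
loss = -np.sum(labels * np.log(probs), axis=1)
print(loss)  # ~[0.417]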

Example 2: _testXent

def _testXent(self, np_features, np_labels, use_gpu=False):
  # Compare the op's loss and gradient against a NumPy reference
  # implementation (self._npXent) on the same inputs.
  np_loss, np_backprop = self._npXent(np_features, np_labels)
  with self.test_session(use_gpu=use_gpu) as sess:
    loss, backprop = gen_nn_ops._softmax_cross_entropy_with_logits(
        np_features, np_labels)
    tf_loss, tf_backprop = sess.run([loss, backprop])
  self.assertAllCloseAccordingToType(np_loss, tf_loss)
  self.assertAllCloseAccordingToType(np_backprop, tf_backprop)
Developer: pronobis, Project: tensorflow, Lines: 7, Source: xent_op_test.py
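The NumPy reference `_npXent` is not shown in this excerpt. As a rough sketch of what such a helper could look like (the actual implementation in xent_op_test.py may differ), it would compute both outputs of the op: the per-row loss and the gradient with respect to the logits:

import numpy as np

def _npXent(self, features, labels):
  # Numerically stable softmax: subtract the per-row max before exp.
  shifted = features - np.amax(features, axis=1, keepdims=True)
  e = np.exp(shifted)
  probs = e / np.sum(e, axis=1, keepdims=True)
  # Cross entropy per row, and its gradient softmax(logits) - labels.
  loss = -np.sum(labels * np.log(probs + 1e-20), axis=1)
  backprop = probs - labels
  return loss, backprop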

Example 3: _testSingleClass

def _testSingleClass(self, use_gpu=False):
  # Degenerate case: num_classes == 1, so the loss is always zero
  # (see the derivation after this example).
  for dtype in np.float16, np.float32:
    with self.test_session(use_gpu=use_gpu) as sess:
      loss, backprop = gen_nn_ops._softmax_cross_entropy_with_logits(
          np.array([[1.], [-1.], [0.]]).astype(dtype),
          np.array([[-1.], [0.], [1.]]).astype(dtype))
      tf_loss, tf_backprop = sess.run([loss, backprop])
    self.assertAllClose([0.0, 0.0, 0.0], tf_loss)
    self.assertAllClose([[2.0], [1.0], [0.0]], tf_backprop)
Developer: 1000sprites, Project: tensorflow, Lines: 9, Source: xent_op_test.py
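The expected values follow directly from the math. With a single class, softmax(logits) is identically 1 for every row, so the loss -sum(labels * log(1)) is 0, and the gradient softmax(logits) - labels reduces to 1 - labels. A quick standalone NumPy check (not part of the test file):

import numpy as np

labels = np.array([[-1.], [0.], [1.]])
# gradient = softmax(logits) - labels = 1 - labels
print(1.0 - labels)  # [[2.], [1.], [0.]] -- the expected tf_backprop above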

Example 4: testNotMatrix

def testNotMatrix(self):
  # Rank-1 inputs are rejected: the op requires rank-2
  # [batch_size, num_classes] tensors (see the sketch after Example 5).
  with self.test_session():
    with self.assertRaises(ValueError):
      gen_nn_ops._softmax_cross_entropy_with_logits([0., 1., 2., 3.],
                                                    [0., 1., 0., 1.])
Developer: 1000sprites, Project: tensorflow, Lines: 5, Source: xent_op_test.py

Example 5: testShapeMismatch

def testShapeMismatch(self):
  # logits is [2, 2] but labels is [2, 3]; mismatched shapes raise
  # a ValueError at graph construction time.
  with self.test_session():
    with self.assertRaises(ValueError):
      gen_nn_ops._softmax_cross_entropy_with_logits(
          [[0., 1.], [2., 3.]], [[0., 1., 0.], [1., 0., 0.]])
Developer: 1000sprites, Project: tensorflow, Lines: 5, Source: xent_op_test.py
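Examples 4 and 5 both enforce the `[batch_size, num_classes]` contract described in the Example 1 docstring. For contrast, a minimal sketch of a correctly shaped call (the input values are illustrative, not from the test file):

import numpy as np
from tensorflow.python.ops import gen_nn_ops

# Both inputs must be rank-2 with the same [batch_size, num_classes]
# shape and a matching float dtype.
logits = np.array([[0., 1., 2.], [3., 4., 5.]], dtype=np.float32)  # [2, 3]
labels = np.array([[0., 1., 0.], [1., 0., 0.]], dtype=np.float32)  # [2, 3]
loss, backprop = gen_nn_ops._softmax_cross_entropy_with_logits(logits, labels)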

