This page collects typical usage examples of the Java class org.nd4j.linalg.api.ops.impl.transforms.Sign. If you are wondering what the Sign class does, how to use it, or where to find working examples, the curated snippets below should help.
The Sign class belongs to the org.nd4j.linalg.api.ops.impl.transforms package. Four code examples are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the site recommend better Java code examples.
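Before the examples, here is a minimal standalone sketch of how the Sign op is typically invoked, assuming an older ND4J version (e.g. 0.9.x) in which the class still lives in this package; the class name and array values are hypothetical:

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class SignDemo {
    public static void main(String[] args) {
        INDArray in = Nd4j.create(new double[] {-2.5, 0.0, 3.1});
        // Sign maps each element to -1, 0, or 1 according to its sign.
        INDArray out = Nd4j.getExecutioner().execAndReturn(new Sign(in.dup()));
        System.out.println(out); // [-1.00, 0.00, 1.00]
    }
}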
Example 1: computeGradient
import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class

@Override
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
    if (labels.size(1) != preOutput.size(1)) {
        throw new IllegalArgumentException(
                "Labels array numColumns (size(1) = " + labels.size(1) + ") does not match output layer"
                        + " number of outputs (nOut = " + preOutput.size(1) + ")");
    }
    INDArray output = activationFn.getActivation(preOutput.dup(), true);
    INDArray outSubLabels = output.sub(labels);
    INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(outSubLabels));

    if (weights != null) {
        dLda.muliRowVector(weights);
    }

    if (mask != null && LossUtil.isPerOutputMasking(dLda, mask)) {
        //For *most* activation functions we don't actually need to mask dL/da in addition to masking dL/dz later,
        //but some, like softmax, require both (due to dL/dz_i being a function of dL/da_j, for i != j).
        //We could add a special case for softmax (activationFn instanceof ActivationSoftmax), but that would be
        //error-prone and only buy us a tiny bit of performance.
        LossUtil.applyMask(dLda, mask);
    }

    //dL/dz
    INDArray gradients = activationFn.backprop(preOutput, dLda).getFirst(); //TODO activation function param gradients
    if (mask != null) {
        LossUtil.applyMask(gradients, mask);
    }
    return gradients;
}
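For context, the core of Example 1 is the elementwise gradient of an L1-style loss: for L = sum_i |a_i - y_i|, we have dL/da_i = sign(a_i - y_i). A minimal standalone sketch of just that step (values hypothetical; the weights field and the masking utilities belong to the surrounding loss class and are omitted here):

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class L1GradientSketch {
    public static void main(String[] args) {
        INDArray labels = Nd4j.create(new double[] {1.0, 0.0, -1.0});
        INDArray output = Nd4j.create(new double[] {0.5, 0.2, -2.0});
        // dL/da of the L1 loss |output - labels| is sign(output - labels), elementwise.
        INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(output.sub(labels)));
        System.out.println(dLda); // [-1.00, 1.00, -1.00]
    }
}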
Example 2: computeGradient
import org.nd4j.linalg.api.ops.impl.transforms.Abs;
import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class

@Override
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
    if (labels.size(1) != preOutput.size(1)) {
        throw new IllegalArgumentException(
                "Labels array numColumns (size(1) = " + labels.size(1) + ") does not match output layer"
                        + " number of outputs (nOut = " + preOutput.size(1) + ")");
    }
    INDArray output = activationFn.getActivation(preOutput.dup(), true);
    INDArray actSubPredicted = labels.sub(output);
    INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(actSubPredicted));
    INDArray absLabels = Nd4j.getExecutioner().execAndReturn(new Abs(labels.dup()));
    dLda.divi(absLabels).muli(-100.0 / labels.size(1));

    //Weighted loss function
    if (weights != null) {
        dLda.muliRowVector(weights);
    }

    if (mask != null && LossUtil.isPerOutputMasking(dLda, mask)) {
        //For *most* activation functions we don't actually need to mask dL/da in addition to masking dL/dz later,
        //but some, like softmax, require both (due to dL/dz_i being a function of dL/da_j, for i != j).
        //We could add a special case for softmax (activationFn instanceof ActivationSoftmax), but that would be
        //error-prone and only buy us a tiny bit of performance.
        LossUtil.applyMask(dLda, mask);
    }

    INDArray gradient = activationFn.backprop(preOutput, dLda).getFirst(); //TODO activation functions with params
    if (mask != null) {
        LossUtil.applyMask(gradient, mask);
    }
    return gradient;
}
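Example 2 matches the gradient of a MAPE-style (mean absolute percentage error) loss: for L = (100/n) * sum_i |(y_i - a_i) / y_i|, we have dL/da_i = -(100/n) * sign(y_i - a_i) / |y_i|. A minimal standalone sketch of the elementwise core, with hypothetical values:

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Abs;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class MapeGradientSketch {
    public static void main(String[] args) {
        INDArray labels = Nd4j.create(new double[] {2.0, -4.0});
        INDArray output = Nd4j.create(new double[] {1.0, -1.0});
        // dL/da = -100/n * sign(labels - output) / |labels|, elementwise.
        INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(labels.sub(output)));
        INDArray absLabels = Nd4j.getExecutioner().execAndReturn(new Abs(labels.dup()));
        dLda.divi(absLabels).muli(-100.0 / labels.size(1));
        System.out.println(dLda); // [-25.00, 12.50]
    }
}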
Example 3: hash
import java.util.Arrays;

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
import org.nd4j.linalg.exception.ND4JIllegalStateException;

/**
 * Returns hash values for a particular query
 *
 * @param data a query vector
 * @return its hashed value
 */
public INDArray hash(INDArray data) {
    if (data.shape()[1] != inDimension) {
        throw new ND4JIllegalStateException(
                String.format("Invalid shape: Requested INDArray shape %s, this table expects dimension %d",
                        Arrays.toString(data.shape()), inDimension));
    }
    INDArray projected = data.mmul(randomProjection);
    INDArray res = Nd4j.getExecutioner().execAndReturn(new Sign(projected));
    return res;
}
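Example 3 is the classic sign-random-projection locality-sensitive hashing scheme: project the query onto a set of random hyperplanes and keep only the sign of each projection as one hash bit. A minimal standalone sketch of the same idea (the randomProjection matrix and both dimensions are hypothetical stand-ins for the table's fields):

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class SignHashSketch {
    public static void main(String[] args) {
        int inDimension = 8, hashBits = 4;
        // One random hyperplane (column) per hash bit.
        INDArray randomProjection = Nd4j.randn(inDimension, hashBits);
        INDArray query = Nd4j.randn(1, inDimension);
        // Each bit is the sign of the query's projection onto one hyperplane.
        INDArray hash = Nd4j.getExecutioner().execAndReturn(new Sign(query.mmul(randomProjection)));
        System.out.println(hash); // e.g. [1.00, -1.00, -1.00, 1.00]
    }
}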
Example 4: getFunctionValues
import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
import org.nd4j.linalg.api.ops.impl.transforms.Sin;

@Override
public INDArray getFunctionValues(final INDArray x) {
    final INDArray sin = Nd4j.getExecutioner().execAndReturn(new Sin(x.dup()));
    return Nd4j.getExecutioner().execAndReturn(new Sign(sin));
}
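Composing Sign with Sin this way produces a square wave: sign(sin(x)) is +1 wherever sin(x) is positive, -1 wherever it is negative, and 0 at the zero crossings. A quick hypothetical call (class name and sample points are illustrative):

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.api.ops.impl.transforms.Sin;
import org.nd4j.linalg.factory.Nd4j;

public class SquareWaveSketch {
    public static void main(String[] args) {
        // Sample sign(sin(x)) at x = pi/2 and x = 3*pi/2.
        INDArray x = Nd4j.create(new double[] {0.5 * Math.PI, 1.5 * Math.PI});
        INDArray sin = Nd4j.getExecutioner().execAndReturn(new Sin(x.dup()));
        INDArray wave = Nd4j.getExecutioner().execAndReturn(new Sign(sin));
        System.out.println(wave); // [1.00, -1.00]
    }
}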