

Java Sign Class Code Examples

This article collects typical usage examples of the Java class org.nd4j.linalg.api.ops.impl.transforms.Sign. If you are wondering what the Sign class is for, how to use it, or what real code that uses it looks like, the curated examples below should help.


The Sign class belongs to the org.nd4j.linalg.api.ops.impl.transforms package. Four code examples of the Sign class are shown below, ordered by popularity.
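
Before diving into the examples, here is a minimal, self-contained sketch of the basic pattern they all share: applying the element-wise Sign op through the op executioner. The array values are made up purely for illustration.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class SignQuickStart {
    public static void main(String[] args) {
        // A small vector containing negative, zero and positive entries (illustrative values).
        INDArray x = Nd4j.create(new double[] {-3.5, 0.0, 2.0, -0.1});

        // Execute the element-wise Sign transform via the op executioner,
        // the same pattern used by all four examples below.
        INDArray signs = Nd4j.getExecutioner().execAndReturn(new Sign(x));

        System.out.println(signs); // expected roughly [-1.0, 0.0, 1.0, -1.0]
    }
}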

Example 1: computeGradient

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
@Override
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
    if (labels.size(1) != preOutput.size(1)) {
        throw new IllegalArgumentException(
                        "Labels array numColumns (size(1) = " + labels.size(1) + ") does not match output layer"
                                        + " number of outputs (nOut = " + preOutput.size(1) + ") ");

    }
    INDArray output = activationFn.getActivation(preOutput.dup(), true);

    INDArray outSubLabels = output.sub(labels);
    INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(outSubLabels));

    if (weights != null) {
        dLda.muliRowVector(weights);
    }

    if (mask != null && LossUtil.isPerOutputMasking(dLda, mask)) {
        //For *most* activation functions: we don't actually need to mask dL/da in addition to masking dL/dz later
        //but: some, like softmax, require both (due to dL/dz_i being a function of dL/da_j, for i != j)
        //We could add a special case for softmax (activationFn instanceof ActivationSoftmax), but that would be
        // error prone and would only buy us a tiny bit of performance
        LossUtil.applyMask(dLda, mask);
    }

    //dL/dz
    INDArray gradients = activationFn.backprop(preOutput, dLda).getFirst(); //TODO activation function param gradients

    if (mask != null) {
        LossUtil.applyMask(gradients, mask);
    }

    return gradients;
}
 
Developer: deeplearning4j, Project: nd4j, Lines: 35, Source: LossL1.java
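
The key step in this example is that the L1 loss gradient with respect to the activations is simply dL/da = sign(output - labels). Below is a minimal standalone sketch of that step, with made-up output and label values; it does not use the LossL1 class itself, and weighting and masking are omitted.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class L1GradientSketch {
    public static void main(String[] args) {
        // Hypothetical activations and labels for a single example with four outputs.
        INDArray output = Nd4j.create(new double[] {0.2, 0.9, 0.4, 0.1});
        INDArray labels = Nd4j.create(new double[] {0.0, 1.0, 1.0, 0.0});

        // dL/da for the L1 loss: the element-wise sign of (output - labels),
        // exactly as in computeGradient above.
        INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(output.sub(labels)));

        System.out.println(dLda); // expected roughly [1.0, -1.0, -1.0, 1.0]
    }
}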

Example 2: computeGradient

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
@Override
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
    if (labels.size(1) != preOutput.size(1)) {
        throw new IllegalArgumentException(
                        "Labels array numColumns (size(1) = " + labels.size(1) + ") does not match output layer"
                                        + " number of outputs (nOut = " + preOutput.size(1) + ") ");

    }
    INDArray output = activationFn.getActivation(preOutput.dup(), true);

    INDArray actSubPredicted = labels.sub(output);
    INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(actSubPredicted));
    INDArray absLabels = Nd4j.getExecutioner().execAndReturn(new Abs(labels.dup()));
    dLda.divi(absLabels).muli(-100.0 / labels.size(1));

    //Weighted loss function
    if (weights != null) {
        dLda.muliRowVector(weights);
    }

    if (mask != null && LossUtil.isPerOutputMasking(dLda, mask)) {
        //For *most* activation functions: we don't actually need to mask dL/da in addition to masking dL/dz later
        //but: some, like softmax, require both (due to dL/dz_i being a function of dL/da_j, for i != j)
        //We could add a special case for softmax (activationFn instanceof ActivationSoftmax), but that would be
        // error prone and would only buy us a tiny bit of performance
        LossUtil.applyMask(dLda, mask);
    }

    INDArray gradient = activationFn.backprop(preOutput, dLda).getFirst(); //TODO activation functions with params

    if (mask != null) {
        LossUtil.applyMask(gradient, mask);
    }

    return gradient;
}
 
Developer: deeplearning4j, Project: nd4j, Lines: 37, Source: LossMAPE.java
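
Here Sign implements the MAPE gradient, which per output works out to dL/dyhat = -(100/n) * sign(y - yhat) / |y|. A minimal standalone sketch of that element-wise formula follows; the label and prediction values are made up, and the weighting and masking of LossMAPE are omitted.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Abs;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class MapeGradientSketch {
    public static void main(String[] args) {
        // Hypothetical true values (labels) and predictions for four outputs.
        INDArray labels = Nd4j.create(new double[] {2.0, -4.0, 1.0, 5.0});
        INDArray output = Nd4j.create(new double[] {1.5, -3.0, 2.0, 5.5});

        // sign(y - yhat), divided by |y|, scaled by -100/n, mirroring computeGradient above.
        INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(labels.sub(output)));
        INDArray absLabels = Nd4j.getExecutioner().execAndReturn(new Abs(labels.dup()));
        dLda.divi(absLabels).muli(-100.0 / labels.length()); // n = number of outputs

        System.out.println(dLda);
    }
}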

Example 3: hash

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
/**
 * Returns hash values for a particular query
 * @param data a query vector
 * @return its hashed value
 */
public INDArray hash(INDArray data) {
    if (data.shape()[1] != inDimension){
        throw new ND4JIllegalStateException(
                String.format("Invalid shape: Requested INDArray shape %s, this table expects dimension %d",
                        Arrays.toString(data.shape()), inDimension));
    }
    INDArray projected = data.mmul(randomProjection);
    INDArray res = Nd4j.getExecutioner().execAndReturn(new Sign(projected));
    return res;
}
 
Developer: deeplearning4j, Project: deeplearning4j, Lines: 16, Source: RandomProjectionLSH.java
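
The hash method above is a sign random projection: multiply the query by a random matrix and keep only the signs, so similar vectors tend to agree on most hash bits. The sketch below shows the same two steps in isolation; the dimensions and the Gaussian projection matrix are arbitrary choices for illustration, and the RandomProjectionLSH class itself is not used.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.factory.Nd4j;

public class SignHashSketch {
    public static void main(String[] args) {
        int inDimension = 8; // dimensionality of the query vectors (illustrative)
        int hashLength = 4;  // number of hash bits (illustrative)

        // Random Gaussian projection matrix with one column per hash bit.
        INDArray randomProjection = Nd4j.randn(inDimension, hashLength);

        // A single query vector of shape [1, inDimension].
        INDArray data = Nd4j.randn(1, inDimension);

        // Project, then take element-wise signs: each hash entry becomes -1 or +1.
        INDArray projected = data.mmul(randomProjection);
        INDArray hash = Nd4j.getExecutioner().execAndReturn(new Sign(projected));

        System.out.println(hash);
    }
}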

Example 4: getFunctionValues

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
@Override
public INDArray getFunctionValues(final INDArray x) {
    final INDArray sin = Nd4j.getExecutioner().execAndReturn(new Sin(x.dup()));
    return Nd4j.getExecutioner().execAndReturn(new Sign(sin));
}
 
Developer: IsaacChanghau, Project: NeuralNetworksLite, Lines: 6, Source: SquareWaveMathFunction.java
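
Taking the sign of sin(x) yields a square wave that alternates between -1 and +1. The short sketch below evaluates the same composition over an arbitrary range of x values; it does not reference SquareWaveMathFunction itself.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.transforms.Sign;
import org.nd4j.linalg.api.ops.impl.transforms.Sin;
import org.nd4j.linalg.factory.Nd4j;

public class SquareWaveSketch {
    public static void main(String[] args) {
        // 25 evenly spaced points from 0 to 12, covering roughly two periods of sin(x).
        INDArray x = Nd4j.linspace(0, 12, 25);

        // sign(sin(x)): +1 where sin is positive, -1 where it is negative.
        INDArray sin = Nd4j.getExecutioner().execAndReturn(new Sin(x.dup()));
        INDArray squareWave = Nd4j.getExecutioner().execAndReturn(new Sign(sin));

        System.out.println(squareWave);
    }
}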


Note: the org.nd4j.linalg.api.ops.impl.transforms.Sign class examples in this article were compiled by 纯净天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers; copyright of the source code remains with the original authors. Please follow the corresponding projects' licenses when redistributing or using the code, and do not reproduce this article without permission.