

Java Sign Class Code Examples

This article collects typical usage examples of the Java class org.nd4j.linalg.api.ops.impl.transforms.Sign. If you are wondering what the Sign class does, how to use it, or are looking for concrete examples, the curated class code examples below may help.


The Sign class belongs to the org.nd4j.linalg.api.ops.impl.transforms package. Four code examples of the Sign class are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
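As background for the examples below: the Sign op applies the elementwise sign function (-1 for negative entries, 0 for zero, +1 for positive) to an INDArray. A minimal plain-Java sketch of the same semantics, using Math.signum and no nd4j dependency (class and method names here are illustrative only):

```java
public class SignDemo {
    // Elementwise sign: each entry becomes -1.0, 0.0, or 1.0, matching Math.signum
    static double[] sign(double[] in) {
        double[] out = new double[in.length];
        for (int i = 0; i < in.length; i++) {
            out[i] = Math.signum(in[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {-3.5, 0.0, 2.1};
        System.out.println(java.util.Arrays.toString(sign(x))); // prints [-1.0, 0.0, 1.0]
    }
}
```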

Example 1: computeGradient

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
@Override
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
    if (labels.size(1) != preOutput.size(1)) {
        throw new IllegalArgumentException(
                        "Labels array numColumns (size(1) = " + labels.size(1) + ") does not match output layer"
                                        + " number of outputs (nOut = " + preOutput.size(1) + ") ");

    }
    INDArray output = activationFn.getActivation(preOutput.dup(), true);

    INDArray outSubLabels = output.sub(labels);
    INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(outSubLabels));

    if (weights != null) {
        dLda.muliRowVector(weights);
    }

    if (mask != null && LossUtil.isPerOutputMasking(dLda, mask)) {
        //For *most* activation functions: we don't actually need to mask dL/da in addition to masking dL/dz later
        //but: some, like softmax, require both (due to dL/dz_i being a function of dL/da_j, for i != j)
        //We could add a special case for softmax (activationFn instanceof ActivationSoftmax) but that would be
        // error prone - but buy us a tiny bit of performance
        LossUtil.applyMask(dLda, mask);
    }

    //dL/dz
    INDArray gradients = activationFn.backprop(preOutput, dLda).getFirst(); //TODO activation function param gradients

    if (mask != null) {
        LossUtil.applyMask(gradients, mask);
    }

    return gradients;
}
 
Developer ID: deeplearning4j, Project: nd4j, Line count: 35, Source: LossL1.java
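The core of this example is the line dLda = sign(output - labels): for L1 loss, L = sum |output - labels|, so the gradient of each entry with respect to the activation is just the sign of the residual. A plain-array sketch of that step, assuming unweighted, unmasked inputs (names are illustrative):

```java
public class L1GradientSketch {
    // dL/da for L1 loss: sign(output - labels), computed elementwise
    static double[] l1Gradient(double[] output, double[] labels) {
        double[] grad = new double[output.length];
        for (int i = 0; i < output.length; i++) {
            grad[i] = Math.signum(output[i] - labels[i]);
        }
        return grad;
    }
}
```

In the real method this per-element sign is then optionally scaled by per-output weights, masked, and backpropagated through the activation function to obtain dL/dz.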

Example 2: computeGradient

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
@Override
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask) {
    if (labels.size(1) != preOutput.size(1)) {
        throw new IllegalArgumentException(
                        "Labels array numColumns (size(1) = " + labels.size(1) + ") does not match output layer"
                                        + " number of outputs (nOut = " + preOutput.size(1) + ") ");

    }
    INDArray output = activationFn.getActivation(preOutput.dup(), true);

    INDArray actSubPredicted = labels.sub(output);
    INDArray dLda = Nd4j.getExecutioner().execAndReturn(new Sign(actSubPredicted));
    INDArray absLabels = Nd4j.getExecutioner().execAndReturn(new Abs(labels.dup()));
    dLda.divi(absLabels).muli(-100.0 / labels.size(1));

    //Weighted loss function
    if (weights != null) {
        dLda.muliRowVector(weights);
    }

    if (mask != null && LossUtil.isPerOutputMasking(dLda, mask)) {
        //For *most* activation functions: we don't actually need to mask dL/da in addition to masking dL/dz later
        //but: some, like softmax, require both (due to dL/dz_i being a function of dL/da_j, for i != j)
        //We could add a special case for softmax (activationFn instanceof ActivationSoftmax) but that would be
        // error prone - but buy us a tiny bit of performance
        LossUtil.applyMask(dLda, mask);
    }

    INDArray gradient = activationFn.backprop(preOutput, dLda).getFirst(); //TODO activation functions with params

    if (mask != null) {
        LossUtil.applyMask(gradient, mask);
    }

    return gradient;
}
 
Developer ID: deeplearning4j, Project: nd4j, Line count: 37, Source: LossMAPE.java
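Here Sign implements the MAPE (mean absolute percentage error) gradient: the lines building dLda compute -(100/n) * sign(labels - output) / |labels| per element. A plain-array sketch of that arithmetic, assuming unweighted, unmasked inputs and nonzero labels (names are illustrative):

```java
public class MapeGradientSketch {
    // dL/da for MAPE: -(100/n) * sign(labels - output) / |labels|, per element
    static double[] mapeGradient(double[] labels, double[] output) {
        int n = labels.length;
        double[] grad = new double[n];
        for (int i = 0; i < n; i++) {
            grad[i] = Math.signum(labels[i] - output[i]) / Math.abs(labels[i]) * (-100.0 / n);
        }
        return grad;
    }
}
```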

Example 3: hash

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
/**
 * Returns hash values for a particular query
 * @param data a query vector
 * @return its hashed value
 */
public INDArray hash(INDArray data) {
    if (data.shape()[1] != inDimension){
        throw new ND4JIllegalStateException(
                String.format("Invalid shape: Requested INDArray shape %s, this table expects dimension %d",
                        Arrays.toString(data.shape()), inDimension));
    }
    INDArray projected = data.mmul(randomProjection);
    INDArray res = Nd4j.getExecutioner().execAndReturn(new Sign(projected));
    return res;
}
 
Developer ID: deeplearning4j, Project: deeplearning4j, Line count: 16, Source: RandomProjectionLSH.java
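This example is sign-based locality-sensitive hashing: the query is projected onto a set of random hyperplanes (the mmul with randomProjection), and only the signs of the projections are kept as the hash code. A plain-array sketch of the two steps, assuming a dense projection matrix of shape inDimension x outDimension (names are illustrative):

```java
public class SignHashSketch {
    // Sign-based LSH: project a query onto random hyperplanes, keep only the signs
    static double[] hash(double[] query, double[][] randomProjection) {
        int outDim = randomProjection[0].length;
        double[] code = new double[outDim];
        for (int j = 0; j < outDim; j++) {
            double dot = 0.0;                         // query . column_j of the projection
            for (int i = 0; i < query.length; i++) {
                dot += query[i] * randomProjection[i][j];
            }
            code[j] = Math.signum(dot);               // which side of hyperplane j the query falls on
        }
        return code;
    }
}
```

Two queries that land on the same side of every hyperplane receive the same code, which is what makes the hash locality-sensitive for cosine similarity.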

Example 4: getFunctionValues

import org.nd4j.linalg.api.ops.impl.transforms.Sign; // import the required package/class
@Override
public INDArray getFunctionValues(final INDArray x) {
    final INDArray sin = Nd4j.getExecutioner().execAndReturn(new Sin(x.dup()));
    return Nd4j.getExecutioner().execAndReturn(new Sign(sin));
}
 
Developer ID: IsaacChanghau, Project: NeuralNetworksLite, Line count: 6, Source: SquareWaveMathFunction.java
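This last example composes two ops to build a square wave: sign(sin(x)) is +1 while sin(x) is positive and -1 while it is negative, flipping every pi. The scalar version in plain Java (method name is illustrative):

```java
public class SquareWaveSketch {
    // Unit square wave: sign(sin(x)) alternates between +1 and -1 with period 2*pi
    static double squareWave(double x) {
        return Math.signum(Math.sin(x));
    }
}
```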


Note: The org.nd4j.linalg.api.ops.impl.transforms.Sign class examples in this article were compiled by 純淨天空 from open-source code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by various developers, and copyright remains with the original authors. Please consult each project's license before distributing or using the code, and do not republish without permission.