Class/Object

org.nd4j.linalg.lossfunctions.impl

LossL1L2


class LossL1L2 extends LossComm with ILossFunction

Annotations
@EqualsAndHashCode()
Linear Supertypes
LossComm, ILossFunction, Serializable, AnyRef, Any

Instance Constructors

  1. new LossL1L2()
  2. new LossL1L2(weights: INDArray)
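
Since the second constructor takes per-output weights, the weighted loss can be sketched on plain Scala arrays. The object and method names below are hypothetical; this illustrates the L1 + L2 formula this class is built around, not the ND4J implementation:

```scala
object WeightedL1L2Sketch {
  // Per-output weighted L1 + L2 loss on plain Scala arrays.
  // With d = y - yHat, each output contributes w * (d^2 + |d|).
  def loss(labels: Array[Double], output: Array[Double], weights: Array[Double]): Double = {
    require(labels.length == output.length && labels.length == weights.length)
    labels.indices.map { i =>
      val d = labels(i) - output(i)
      weights(i) * (d * d + math.abs(d))
    }.sum
  }
}
```

With weights of all ones this reduces to the unweighted loss of the no-arg constructor.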

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def computeGradient(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Compute the gradient of the loss function with respect to the inputs: dL/dOutput

    labels

    Label/expected output

    preOutput

    Output of the model (neural network), before the activation function is applied

    activationFn

    Activation function that should be applied to preOutput

    mask

    Mask array; may be null

    returns

    Gradient dL/dPreOut

    Definition Classes
    LossComm → ILossFunction
    Note

    fixed, no need to change
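
The chain rule this method applies, dL/dpreOut = dL/dyHat * Activation'(preOut), can be sketched on scalars. Sigmoid is used here only as an example activation; all names are hypothetical, not part of the ND4J API:

```scala
object GradientChainSketch {
  def sigmoid(x: Double): Double = 1.0 / (1.0 + math.exp(-x))
  // Derivative of the sigmoid activation: s * (1 - s)
  def sigmoidPrime(x: Double): Double = { val s = sigmoid(x); s * (1.0 - s) }
  // dL/dyHat for L = (y - yHat)^2 + |y - yHat|
  // (sign(0) taken as +1, matching the convention in computedLdYHat's docs)
  def dLdYHat(y: Double, yHat: Double): Double = {
    val sign = if (y - yHat >= 0) 1.0 else -1.0
    -2.0 * (y - yHat) - sign
  }
  // Chain rule: dL/dpreOut = dL/dyHat * activation'(preOut)
  def dLdPreOut(y: Double, preOut: Double): Double =
    dLdYHat(y, sigmoid(preOut)) * sigmoidPrime(preOut)
}
```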

  7. def computeGradientAndScore(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray, average: Boolean): Pair[Double, INDArray]

    Compute both the score (loss function value) and gradient. This is equivalent to calling computeScore and computeGradient individually.

    labels

    Label/expected output

    preOutput

    Output of the model (neural network)

    activationFn

    Activation function that should be applied to preOutput

    mask

    Mask array; may be null

    average

    Whether the score should be averaged (divided by number of rows in labels/output) or not

    returns

    The score (loss function value) and gradient

    Definition Classes
    LossComm → ILossFunction
    Note

    fixed, no need to change
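
The "compute both at once" contract can be sketched in plain Scala as a function returning a (score, gradient) pair, with the average flag dividing the score by the number of examples. Names are hypothetical; this is an illustration, not the INDArray implementation:

```scala
object ScoreAndGradientSketch {
  // Per-element L1 + L2 loss and its gradient with respect to yHat
  def lossElem(y: Double, yHat: Double): Double = { val d = y - yHat; d * d + math.abs(d) }
  def gradElem(y: Double, yHat: Double): Double = {
    val sign = if (y - yHat >= 0) 1.0 else -1.0
    -2.0 * (y - yHat) - sign
  }
  // Returns (score, gradient); the score is summed over elements and,
  // if average is true, divided by the number of examples.
  def scoreAndGradient(labels: Array[Double], output: Array[Double],
                       average: Boolean): (Double, Array[Double]) = {
    val score = labels.indices.map(i => lossElem(labels(i), output(i))).sum
    val grad  = labels.indices.map(i => gradElem(labels(i), output(i))).toArray
    (if (average) score / labels.length else score, grad)
  }
}
```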

  8. def computeScore(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray, average: Boolean): Double

    Compute the score (loss function value) for the given inputs.

    labels

    Label/expected output

    preOutput

    Output of the model (neural network)

    activationFn

    Activation function that should be applied to preOutput

    mask

    Mask array; may be null

    average

    Whether the score should be averaged (divided by number of rows in labels/preOutput) or not

    returns

    Loss function value

    Definition Classes
    LossComm → ILossFunction
    Note

    fixed, no need to change
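
A minimal sketch of the scoring contract, with the nullable mask modeled as an Option and the average flag applied at the end. Hypothetical names, plain Scala arrays rather than INDArrays:

```scala
object MaskedScoreSketch {
  // mask "may be null" in the Java API; modeled here as Option.
  // Masked-out entries (mask value 0) contribute nothing to the score.
  def score(labels: Array[Double], output: Array[Double],
            mask: Option[Array[Double]], average: Boolean): Double = {
    val total = labels.indices.map { i =>
      val d = labels(i) - output(i)
      val m = mask.map(_(i)).getOrElse(1.0)
      m * (d * d + math.abs(d))
    }.sum
    if (average) total / labels.length else total
  }
}
```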

  9. def computeScoreArray(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Compute the score (loss function value) for each example individually. For input [numExamples,nOut] returns scores as a column vector: [numExamples,1]

    labels

    Labels/expected output

    preOutput

    Output of the model (neural network)

    activationFn

    Activation function that should be applied to preOutput

    mask

    Mask array; may be null

    returns

    Loss function value for each example; column vector

    Definition Classes
    LossComm → ILossFunction
    Note

    fixed, no need to change
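
The per-example behaviour can be sketched with 2D Scala arrays: each row of a [numExamples, nOut] input is reduced to a single score, mirroring the [numExamples, 1] column vector described above. Names are hypothetical:

```scala
object ScoreArraySketch {
  // One score per example: sum the per-element L1 + L2 losses along each row.
  def scorePerExample(labels: Array[Array[Double]], output: Array[Array[Double]]): Array[Double] =
    labels.zip(output).map { case (lRow, oRow) =>
      lRow.zip(oRow).map { case (y, yHat) =>
        val d = y - yHat
        d * d + math.abs(d)
      }.sum
    }
}
```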

  10. def computedLdYHat(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Compute the gradient with respect to the pre-output (the input to the final layer of the neural net), using the chain rule. In this case:

    L = (y - yHat)^2 + |y - yHat|
    dL/dyHat = -2*(y - yHat) - sign(y - yHat), where sign(y - yHat) = +1 if y - yHat >= 0, else -1
    dyHat/dpreout = d(Activation(preout))/dpreout = Activation'(preout)
    dL/dpreout = dL/dyHat * dyHat/dpreout

    labels

    Label/expected output

    preOutput

    Output of the model (neural network), before the activation function is applied

    activationFn

    Activation function that should be applied to preOutput

    mask

    Mask array; may be null

    returns

    Gradient dL/dYHat

    Definition Classes
    LossL1L2 → LossComm
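
The derivation above can be checked numerically: the analytic dL/dyHat should agree with a central finite difference of L away from the non-differentiable point y = yHat. Hypothetical names; sign(0) is taken as +1 per the convention above:

```scala
object DLdYHatSketch {
  // L = (y - yHat)^2 + |y - yHat|
  def loss(y: Double, yHat: Double): Double = { val d = y - yHat; d * d + math.abs(d) }
  // Analytic gradient from the chain-rule derivation
  def dLdYHat(y: Double, yHat: Double): Double = {
    val sign = if (y - yHat >= 0) 1.0 else -1.0
    -2.0 * (y - yHat) - sign
  }
  // Central finite difference for comparison (valid away from y == yHat)
  def numeric(y: Double, yHat: Double, h: Double = 1e-6): Double =
    (loss(y, yHat + h) - loss(y, yHat - h)) / (2.0 * h)
}
```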
  11. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  12. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  13. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  14. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  15. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  16. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  17. def name(): String

    Definition Classes
    LossL1L2 → LossComm → ILossFunction
  18. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. final def notify(): Unit

    Definition Classes
    AnyRef
  20. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  21. def scoreArray(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Calculates the loss for a single data point (i.e., a batch size of one).

    labels

    Labels/expected output

    preOutput

    Output of the model (neural network)

    activationFn

    Activation function that should be applied to preOutput

    mask

    Mask associated with the labels

    returns

    An array the shape and size of the output of the neural net.

    Definition Classes
    LossL1L2 → LossComm
    Note

    needs modification based on the actual loss function.
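
The element-wise result described here (an array the shape and size of the output) can be sketched as one loss term per element, with the mask applied element-wise. Hypothetical names, assuming the same L1 + L2 element loss as above:

```scala
object ElementLossSketch {
  // Element-wise loss: same shape as the output, one
  // (y - yHat)^2 + |y - yHat| term per element, mask applied per element.
  def scoreArray(labels: Array[Double], output: Array[Double],
                 mask: Option[Array[Double]]): Array[Double] =
    labels.indices.map { i =>
      val d = labels(i) - output(i)
      mask.map(_(i)).getOrElse(1.0) * (d * d + math.abs(d))
    }.toArray
}
```

Summing this array along each row is what yields the per-example scores of computeScoreArray.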

  22. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  23. def toString(): String

    Definition Classes
    AnyRef → Any
  24. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from LossComm

Inherited from ILossFunction

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
