abstract class LossComm extends ILossFunction

Package: org.nd4j.linalg.lossfunctions.impl

Common utilities for loss-function (ILossFunction) implementations. A concrete subclass only needs to supply three parts: 1) scoreArray, 2) computedLdYHat, and 3) name.
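The three-part template can be sketched in plain Scala. This is an illustrative analogue, not the actual class: Array[Double] stands in for INDArray, the activation and mask are omitted, and the names LossCommSketch/SquaredLossSketch are hypothetical.

```scala
// Minimal sketch of the LossComm template pattern (assumption:
// Array[Double] in place of INDArray, identity activation, no mask).
abstract class LossCommSketch {
  // The three parts a subclass provides:
  def scoreArray(labels: Array[Double], yHat: Array[Double]): Array[Double]
  def computedLdYHat(labels: Array[Double], yHat: Array[Double]): Array[Double]
  def name: String

  // A fixed utility built on top of them, mirroring computeScore:
  def computeScore(labels: Array[Double], yHat: Array[Double], average: Boolean): Double = {
    val total = scoreArray(labels, yHat).sum
    if (average) total / labels.length else total
  }
}

// Example subclass: squared error, L(y, yHat) = (yHat - y)^2
class SquaredLossSketch extends LossCommSketch {
  def scoreArray(labels: Array[Double], yHat: Array[Double]): Array[Double] =
    labels.zip(yHat).map { case (y, p) => (p - y) * (p - y) }
  def computedLdYHat(labels: Array[Double], yHat: Array[Double]): Array[Double] =
    labels.zip(yHat).map { case (y, p) => 2.0 * (p - y) }
  def name: String = "SquaredLossSketch"
}
```

Only the two abstract methods and the name vary between loss functions; the score and gradient plumbing stays fixed in the base class.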

Annotations
@EqualsAndHashCode()
Linear Supertypes
ILossFunction, Serializable, AnyRef, Any

Instance Constructors

  1. new LossComm()
  2. new LossComm(weights: INDArray)

Abstract Value Members

  1. abstract def computedLdYHat(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Computes the gradient of the loss function with respect to the prediction: dL/dYHat.

    labels        Labels/expected output
    preOutput     Output of the model (neural network), before the activation function is applied
    activationFn  Activation function that should be applied to preOutput
    mask          Mask array; may be null
    returns       Gradient dL/dYHat

    Note: must be implemented by each concrete loss function.
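As an illustration of what an implementation of computedLdYHat computes, here is the elementwise gradient for absolute error, L(y, yHat) = |yHat - y|. This is a hypothetical sketch using plain Scala arrays in place of INDArray:

```scala
// Sketch: dL/dYHat for absolute error |yHat - y|, computed elementwise.
// (Array[Double] stands in for INDArray; no activation or mask handling.)
def absErrorDLdYHat(labels: Array[Double], yHat: Array[Double]): Array[Double] =
  labels.zip(yHat).map { case (y, p) => math.signum(p - y) }
```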

  2. abstract def scoreArray(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Calculates the loss for each data point individually; in other words, for a batch size of one.

    labels        Labels/expected output
    preOutput     Output of the model (neural network), before the activation function is applied
    activationFn  Activation function that should be applied to preOutput
    mask          Mask associated with the labels; may be null
    returns       An array with the shape and size of the output of the neural network

    Note: must be implemented per the actual loss function.
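A scoreArray implementation can be sketched in the same plain-Scala style (the function name and the use of Array[Double] instead of INDArray are assumptions for illustration), including the masking step that zeroes out excluded entries:

```scala
// Sketch: per-element scoreArray for absolute error with an optional mask.
// A null mask means "no mask"; otherwise masked-out entries score zero.
def absErrorScoreArray(labels: Array[Double], yHat: Array[Double],
                       mask: Array[Double]): Array[Double] = {
  val raw = labels.zip(yHat).map { case (y, p) => math.abs(p - y) }
  if (mask == null) raw else raw.zip(mask).map { case (s, m) => s * m }
}
```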

Concrete Value Members

  1. def computeGradient(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Computes the gradient of the loss function with respect to the pre-activation output: dL/dPreOut.

    labels        Labels/expected output
    preOutput     Output of the model (neural network), before the activation function is applied
    activationFn  Activation function that should be applied to preOutput
    mask          Mask array; may be null
    returns       Gradient dL/dPreOut

    Definition Classes: LossComm → ILossFunction
    Note: fixed implementation; subclasses do not need to override it.
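The fixed computeGradient applies the chain rule: for an elementwise activation f, dL/dPreOut = dL/dYHat * f'(preOut). A simplified plain-Scala sketch (the real code delegates to IActivation's backpropagation rather than taking f' directly, and operates on INDArray):

```scala
// Simplified chain-rule sketch for an elementwise activation f:
// dL/dPreOut = dL/dYHat * f'(preOut), applied per element.
def computeGradientSketch(dLdYHat: Array[Double], preOut: Array[Double],
                          fPrime: Double => Double): Array[Double] =
  dLdYHat.zip(preOut).map { case (g, z) => g * fPrime(z) }

// Example activation: sigmoid, with f'(z) = f(z) * (1 - f(z)).
def sigmoid(z: Double): Double = 1.0 / (1.0 + math.exp(-z))
def sigmoidPrime(z: Double): Double = sigmoid(z) * (1.0 - sigmoid(z))
```

This is why subclasses only need computedLdYHat: the activation backpropagation step is the same for every loss.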

  2. def computeGradientAndScore(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray, average: Boolean): Pair[Double, INDArray]

    Computes both the score (loss function value) and the gradient. This is equivalent to calling computeScore and computeGradient individually.

    labels        Labels/expected output
    preOutput     Output of the model (neural network), before the activation function is applied
    activationFn  Activation function that should be applied to preOutput
    mask          Mask array; may be null
    average       Whether the score should be averaged (divided by the number of rows in labels/preOutput) or not
    returns       The score (loss function value) and the gradient

    Definition Classes: LossComm → ILossFunction
    Note: fixed implementation; subclasses do not need to override it.

  3. def computeScore(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray, average: Boolean): Double

    Computes the score (loss function value) for the given inputs.

    labels        Labels/expected output
    preOutput     Output of the model (neural network), before the activation function is applied
    activationFn  Activation function that should be applied to preOutput
    mask          Mask array; may be null
    average       Whether the score should be averaged (divided by the number of rows in labels/preOutput) or not
    returns       Loss function value

    Definition Classes: LossComm → ILossFunction
    Note: fixed implementation; subclasses do not need to override it.

  4. def computeScoreArray(labels: INDArray, preOutput: INDArray, activationFn: IActivation, mask: INDArray): INDArray

    Computes the score (loss function value) for each example individually. For input of shape [numExamples, nOut], returns the scores as a column vector of shape [numExamples, 1].

    labels        Labels/expected output
    preOutput     Output of the model (neural network), before the activation function is applied
    activationFn  Activation function that should be applied to preOutput
    mask          Mask array; may be null
    returns       Loss function value for each example, as a column vector

    Definition Classes: LossComm → ILossFunction
    Note: fixed implementation; subclasses do not need to override it.
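The reduction from a [numExamples, nOut] score array to a [numExamples, 1] column vector can be sketched as a row-wise sum. This is a hypothetical illustration using Array[Array[Double]] in place of a 2-D INDArray:

```scala
// Sketch: collapse per-output losses of shape [numExamples, nOut] into
// one score per example (a column vector [numExamples, 1]) by summing
// along each row.
def computeScoreArraySketch(scoreArr: Array[Array[Double]]): Array[Double] =
  scoreArr.map(_.sum)
```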

  5. def name(): String

    Returns the name of the loss function (one of the three parts a subclass supplies).

    Definition Classes: LossComm → ILossFunction
