TimeSeriesClassificationActivationFunctions
Library   "TimeSeriesClassificationActivationFunctions"
Provides some activation functions useful in time series classification.
___
reference:
github.com
 method scale(dist, weights) 
  Activates values by rescaling them to a normalized range.
  Namespace types: map
  Parameters:
     dist (map) : Source distribution map.
     weights (map) : Weights distribution map.
  Returns: Normalized distribution map.
 method softmax(dist, weights) 
  Activates values with the softmax algorithm.
  Namespace types: map
  Parameters:
     dist (map) : Source distribution map.
     weights (map) : Weights distribution map.
  Returns: Normalized distribution map.
 method argmax(dist, weights) 
  Activates values with an argmax algorithm.
  Namespace types: map
  Parameters:
     dist (map) : Source distribution map.
     weights (map) : Weights distribution map.
  Returns: the key of the first maximum value in the transformed distribution.
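A minimal usage sketch in Pine Script v5 that exercises all three methods; the import path, class labels, and weight values below are hypothetical placeholders, not part of the published library.
//@version=5
indicator("TimeSeriesClassificationActivationFunctions demo")
// Hypothetical import path: replace the publisher and version with the real ones.
import Publisher/TimeSeriesClassificationActivationFunctions/1

// Toy score distribution over three class labels, with uniform weights.
var dist    = map.new<string, float>()
var weights = map.new<string, float>()
if barstate.isfirst
    dist.put("up", 1.2)
    dist.put("down", 0.4)
    dist.put("flat", 0.1)
    weights.put("up", 1.0)
    weights.put("down", 1.0)
    weights.put("flat", 1.0)

// Rescale, squash into a normalized distribution, then pick the winning label.
scaled = dist.scale(weights)
probs  = dist.softmax(weights)
best   = dist.argmax(weights)
plot(scaled.get("up"), "scaled(up)")
plot(probs.get("up"), "P(up)")
bgcolor(best == "up" ? color.new(color.green, 85) : na)
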
MLActivationFunctions
Library   "MLActivationFunctions"
Activation functions for neural networks.
 binary_step(value)  Basic threshold output classifier to activate/deactivate a neuron.
  Parameters:
     value : float, value to process.
  Returns: float
 linear(value)  Identity function: the output is the same as the input.
  Parameters:
     value : float, value to process.
  Returns: float
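For reference, a standalone Pine v5 sketch of the textbook formulas these two helpers describe (an illustration, not the library's source):
//@version=5
indicator("binary_step / linear sketch")
// Threshold step: 1 when the input is non-negative, otherwise 0.
binary_step(float value) =>
    value >= 0 ? 1.0 : 0.0
// Identity: the output is the input, unchanged.
linear(float value) =>
    value
plot(binary_step(close - open), "step")
plot(linear(close - open), "linear")
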
 sigmoid(value)  Sigmoid or logistic function.
  Parameters:
     value : float, value to process.
  Returns: float
 sigmoid_derivative(value)  Derivative of the sigmoid function.
  Parameters:
     value : float, value to process.
  Returns: float
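A minimal sketch of the standard logistic formulas, sigmoid(x) = 1 / (1 + e^-x) and its derivative sigmoid(x) * (1 - sigmoid(x)):
//@version=5
indicator("sigmoid sketch")
// Logistic function: maps any real input into the (0, 1) interval.
sigmoid(float value) =>
    1.0 / (1.0 + math.exp(-value))
// Derivative expressed through the function itself: s * (1 - s).
sigmoid_derivative(float value) =>
    s = sigmoid(value)
    s * (1.0 - s)
plot(sigmoid(close - open), "sigmoid")
plot(sigmoid_derivative(close - open), "sigmoid'")
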
 tanh(value)  Hyperbolic tangent function.
  Parameters:
     value : float, value to process.
  Returns: float
 tanh_derivative(value)  Hyperbolic tangent function derivative.
  Parameters:
     value : float, value to process.
  Returns: float
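A minimal sketch of tanh(x) = (e^x - e^-x) / (e^x + e^-x) and its derivative 1 - tanh(x)^2:
//@version=5
indicator("tanh sketch")
// Hyperbolic tangent: maps any real input into the (-1, 1) interval.
tanh(float value) =>
    ep = math.exp(value)
    en = math.exp(-value)
    (ep - en) / (ep + en)
// Derivative expressed through the function itself: 1 - tanh(x)^2.
tanh_derivative(float value) =>
    t = tanh(value)
    1.0 - t * t
plot(tanh(close - open), "tanh")
plot(tanh_derivative(close - open), "tanh'")
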
 relu(value)  Rectified linear unit (RELU) function.
  Parameters:
     value : float, value to process.
  Returns: float
 relu_derivative(value)  RELU function derivative.
  Parameters:
     value : float, value to process.
  Returns: float
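A minimal sketch of the standard ReLU formulas, max(0, x) and its piecewise derivative:
//@version=5
indicator("relu sketch")
// ReLU: passes positive inputs through and clamps negatives to 0.
relu(float value) =>
    math.max(0.0, value)
// Derivative: 1 on the positive side, 0 elsewhere (the value at 0 is a convention).
relu_derivative(float value) =>
    value > 0 ? 1.0 : 0.0
plot(relu(close - open), "relu")
plot(relu_derivative(close - open), "relu'")
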
 leaky_relu(value)  Leaky RELU function.
  Parameters:
     value : float, value to process.
  Returns: float
 leaky_relu_derivative(value)  Leaky RELU function derivative.
  Parameters:
     value : float, value to process.
  Returns: float
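A sketch of the conventional leaky ReLU with a 0.01 negative slope; the library may use a different leak constant.
//@version=5
indicator("leaky_relu sketch")
// Leaky ReLU: like ReLU, but negative inputs keep a small slope (0.01 here, an assumption).
leaky_relu(float value) =>
    value > 0 ? value : 0.01 * value
// Derivative: 1 on the positive side, the leak slope on the negative side.
leaky_relu_derivative(float value) =>
    value > 0 ? 1.0 : 0.01
plot(leaky_relu(close - open), "leaky")
plot(leaky_relu_derivative(close - open), "leaky'")
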
 relu6(value)  RELU-6 function.
  Parameters:
     value : float, value to process.
  Returns: float
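ReLU-6 is simply ReLU capped at 6; a minimal sketch:
//@version=5
indicator("relu6 sketch")
// ReLU-6: min(max(0, x), 6), i.e. ReLU with an upper bound of 6.
relu6(float value) =>
    math.min(math.max(0.0, value), 6.0)
plot(relu6(close - open), "relu6")
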
 softmax(value)  Softmax function.
  Parameters:
     value : float array, values to process.
  Returns: float array
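Note the array input: softmax exponentiates every element and divides by the sum, so the outputs are positive and sum to 1. A minimal sketch (a subtract-the-max step is often added for numerical stability):
//@version=5
indicator("softmax sketch")
// Softmax over a float array: exp each element, then normalize by the total.
softmax(float[] values) =>
    exps = array.new_float()
    for v in values
        exps.push(math.exp(v))
    total = exps.sum()
    out = array.new_float()
    for e in exps
        out.push(e / total)
    out
probs = softmax(array.from(close - open, high - low, 0.0))
plot(probs.get(0), "p0")
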
 softplus(value)  Softplus function.
  Parameters:
     value : float, value to process.
  Returns: float
 softsign(value)  Softsign function.
  Parameters:
     value : float, value to process.
  Returns: float
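Minimal sketches of the standard formulas: softplus(x) = ln(1 + e^x), a smooth ReLU, and softsign(x) = x / (1 + |x|), a cheap tanh-like squashing:
//@version=5
indicator("softplus / softsign sketch")
// Softplus: smooth approximation of ReLU.
softplus(float value) =>
    math.log(1.0 + math.exp(value))
// Softsign: squashes any real input into (-1, 1).
softsign(float value) =>
    value / (1.0 + math.abs(value))
plot(softplus(close - open), "softplus")
plot(softsign(close - open), "softsign")
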
 elu(value, alpha)  Exponential Linear Unit (ELU) function.
  Parameters:
     value : float, value to process.
     alpha : float, default=1.0, predefined constant; controls the value to which an ELU saturates for negative net inputs.
  Returns: float
 selu(value, alpha, scale)  Scaled Exponential Linear Unit (SELU) function.
  Parameters:
     value : float, value to process.
     alpha : float, default=1.67326324, predefined constant; controls the value to which an SELU saturates for negative net inputs.
     scale : float, default=1.05070098, predefined constant.
  Returns: float
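A sketch of the standard ELU/SELU formulas using the default constants documented above (an illustration, not the library's source):
//@version=5
indicator("elu / selu sketch")
// ELU: identity for positive inputs, alpha * (e^x - 1) for negative inputs.
elu(float value, float alpha) =>
    value > 0 ? value : alpha * (math.exp(value) - 1.0)
// SELU: the ELU shape rescaled by a fixed scale constant.
selu(float value, float alpha, float scale) =>
    scale * (value > 0 ? value : alpha * (math.exp(value) - 1.0))
plot(elu(close - open, 1.0), "elu")
plot(selu(close - open, 1.67326324, 1.05070098), "selu")
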
 exponential(value)  Wrapper around the math.exp() function.
  Parameters:
     value : float, value to process.
  Returns: float
 function(name, value, alpha, scale)  Applies the activation function selected by name.
  Parameters:
     name : string, name of activation function.
     value : float, value to process.
     alpha : float, default=na, optional; used only by activations that require it.
     scale : float, default=na, optional; used only by activations that require it.
  Returns: float
 derivative(name, value, alpha, scale)  Applies the derivative of the activation function selected by name.
  Parameters:
     name : string, name of activation function.
     value : float, value to process.
     alpha : float, default=na, optional; used only by activations that require it.
     scale : float, default=na, optional; used only by activations that require it.
  Returns: float
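A hypothetical end-to-end usage sketch of the string-dispatched entry points; the import path is a placeholder, and the activation names are assumed to match the helper names listed above.
//@version=5
indicator("MLActivationFunctions demo")
// Hypothetical import path: replace the publisher and version with the real ones.
import Publisher/MLActivationFunctions/1 as act

act_name = input.string("sigmoid", "Activation", options = ["sigmoid", "tanh", "relu", "softsign"])
x = close - open
// Dispatch by name; alpha/scale default to na and are only read when required.
y  = act.function(act_name, x)
dy = act.derivative(act_name, x)
plot(y, "activated")
plot(dy, "derivative")
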

