PINE LIBRARY
Updated MLActivationFunctions

Library "MLActivationFunctions"
Activation functions for neural networks.
binary_step(value) Basic threshold output classifier to activate/deactivate a neuron.
Parameters:
value: float, value to process.
Returns: float
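As a rough illustration, a threshold step of this kind is typically defined as in the Pine v5 sketch below; the library's exact cutoff and output values may differ.

//@version=5
indicator("binary_step sketch")
// Assumed form: emit 1 when the input is non-negative, else 0.
f_binary_step(float value) =>
    value >= 0.0 ? 1.0 : 0.0
plot(f_binary_step(close - open))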
linear(value) Identity function; the output equals the input.
Parameters:
value: float, value to process.
Returns: float
sigmoid(value) Sigmoid or logistic function.
Parameters:
value: float, value to process.
Returns: float
sigmoid_derivative(value) Derivative of the sigmoid function.
Parameters:
value: float, value to process.
Returns: float
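For reference, these two entries correspond to the standard logistic definitions sigmoid(x) = 1 / (1 + e^-x) and sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)). A minimal Pine v5 sketch under those definitions, not the library's exact code:

//@version=5
indicator("sigmoid sketch")
f_sigmoid(float value) =>
    1.0 / (1.0 + math.exp(-value))
// The derivative reuses the sigmoid output: s * (1 - s).
f_sigmoid_derivative(float value) =>
    float s = 1.0 / (1.0 + math.exp(-value))
    s * (1.0 - s)
plot(f_sigmoid(close - open))
plot(f_sigmoid_derivative(close - open))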
tanh(value) Hyperbolic tangent function.
Parameters:
value: float, value to process.
Returns: float
tanh_derivative(value) Hyperbolic tangent function derivative.
Parameters:
value: float, value to process.
Returns: float
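Pine has no built-in hyperbolic tangent, so an implementation along these lines is likely: tanh(x) = (e^(2x) - 1) / (e^(2x) + 1), with derivative tanh'(x) = 1 - tanh(x)^2. A sketch under those standard identities:

//@version=5
indicator("tanh sketch")
f_tanh(float value) =>
    float e2x = math.exp(2.0 * value)
    (e2x - 1.0) / (e2x + 1.0)
// Standard identity: d/dx tanh(x) = 1 - tanh(x)^2.
f_tanh_derivative(float value) =>
    float t = f_tanh(value)
    1.0 - t * t
plot(f_tanh(close - open))
plot(f_tanh_derivative(close - open))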
relu(value) Rectified Linear Unit (ReLU) function.
Parameters:
value: float, value to process.
Returns: float
relu_derivative(value) ReLU function derivative.
Parameters:
value: float, value to process.
Returns: float
leaky_relu(value) Leaky ReLU function.
Parameters:
value: float, value to process.
Returns: float
leaky_relu_derivative(value) Leaky ReLU function derivative.
Parameters:
value: float, value to process.
Returns: float
relu6(value) ReLU-6 function.
Parameters:
value: float, value to process.
Returns: float
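The ReLU family above follows well-known closed forms. The sketch below assumes the conventional 0.01 slope for the leaky variant and the 6.0 cap for ReLU-6; the library's constants may differ.

//@version=5
indicator("ReLU family sketch")
f_relu(float value) =>
    math.max(0.0, value)
// Derivative is 1 for positive inputs, else 0 (undefined at 0; 0 is a common choice).
f_relu_derivative(float value) =>
    value > 0.0 ? 1.0 : 0.0
// Assumed leak coefficient of 0.01.
f_leaky_relu(float value) =>
    value > 0.0 ? value : 0.01 * value
f_leaky_relu_derivative(float value) =>
    value > 0.0 ? 1.0 : 0.01
// ReLU clipped to the range [0, 6].
f_relu6(float value) =>
    math.min(math.max(0.0, value), 6.0)
plot(f_leaky_relu(close - open))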
softmax(value) Softmax function.
Parameters:
value: float array, values to process.
Returns: float array
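Softmax maps the input vector to a vector of probabilities that sum to 1. A sketch over a float array; the max-subtraction step is a common numerical-stability measure that the library may or may not apply:

//@version=5
indicator("softmax sketch")
f_softmax(float[] values) =>
    float maxv = array.max(values)
    float[] out = array.new_float()
    float total = 0.0
    for v in values
        float e = math.exp(v - maxv)
        array.push(out, e)
        total += e
    for i = 0 to array.size(out) - 1
        array.set(out, i, array.get(out, i) / total)
    out
float[] demo = array.from(1.0, 2.0, 3.0)
plot(array.get(f_softmax(demo), 2))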
softplus(value) Softplus function.
Parameters:
value: float, value to process.
Returns: float
softsign(value) Softsign function.
Parameters:
value: float, value to process.
Returns: float
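Both of these have simple closed forms: softplus(x) = ln(1 + e^x) and softsign(x) = x / (1 + |x|). A sketch:

//@version=5
indicator("softplus/softsign sketch")
f_softplus(float value) =>
    math.log(1.0 + math.exp(value))
f_softsign(float value) =>
    value / (1.0 + math.abs(value))
plot(f_softplus(close - open))
plot(f_softsign(close - open))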
elu(value, alpha) Exponential Linear Unit (ELU) function.
Parameters:
value: float, value to process.
alpha: float, default=1.0, predefined constant; controls the value to which an ELU saturates for negative net inputs.
Returns: float
selu(value, alpha, scale) Scaled Exponential Linear Unit (SELU) function.
Parameters:
value: float, value to process.
alpha: float, default=1.67326324, predefined constant; controls the value to which a SELU saturates for negative net inputs.
scale: float, default=1.05070098, predefined constant.
Returns: float
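ELU and SELU share the same negative-branch shape; SELU is ELU scaled by a fixed constant, i.e. selu(x) = scale * elu(x, alpha), with the self-normalizing defaults listed above. A sketch with those defaults:

//@version=5
indicator("ELU/SELU sketch")
f_elu(float value, float alpha = 1.0) =>
    value > 0.0 ? value : alpha * (math.exp(value) - 1.0)
// SELU = scale * ELU with the fixed self-normalizing constants.
f_selu(float value, float alpha = 1.67326324, float scale = 1.05070098) =>
    scale * (value > 0.0 ? value : alpha * (math.exp(value) - 1.0))
plot(f_selu(close - open))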
exponential(value) Wrapper for the math.exp() function.
Parameters:
value: float, value to process.
Returns: float
function(name, value, alpha, scale) Applies the activation function selected by name.
Parameters:
name: string, name of the activation function.
value: float, value to process.
alpha: float, default=na, used only by functions that require it.
scale: float, default=na, used only by functions that require it.
Returns: float
derivative(name, value, alpha, scale) Applies the derivative of the activation function selected by name.
Parameters:
name: string, name of the activation function.
value: float, value to process.
alpha: float, default=na, used only by functions that require it.
scale: float, default=na, used only by functions that require it.
Returns: float
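A typical way to consume these two dispatchers from another script is shown below. The import path is illustrative (substitute the publisher's actual username and version), and the accepted name strings are assumed to match the function names listed above:

//@version=5
indicator("MLActivationFunctions usage sketch")
// Hypothetical import path; replace AuthorName and the version number.
import AuthorName/MLActivationFunctions/2 as act
float x = ta.roc(close, 1)
plot(act.function("sigmoid", x))
plot(act.derivative("sigmoid", x))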
Release Notes
v2
Added:
softmax_derivative(value) Softmax derivative function.
Parameters:
value: float array, values to process.
Returns: float array
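The full softmax Jacobian is a matrix; a common library simplification, assumed here, returns only its diagonal, s_i * (1 - s_i), computed from the softmax outputs:

//@version=5
indicator("softmax_derivative sketch")
// Diagonal-of-Jacobian simplification: d s_i / d x_i = s_i * (1 - s_i).
// Assumes the input array already holds softmax outputs.
f_softmax_derivative(float[] softmax_values) =>
    float[] out = array.new_float()
    for s in softmax_values
        array.push(out, s * (1.0 - s))
    out
float[] probs = array.from(0.09, 0.24, 0.67)
plot(array.get(f_softmax_derivative(probs), 0))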
Pine library
In true TradingView spirit, the author has published this Pine code as an open-source library so that other Pine programmers from our community can reuse it. Cheers to the author! You may use this library privately or in other open-source publications, but reuse of this code in publications is governed by House Rules.
Disclaimer
The information and publications are not meant to be, and do not constitute, financial, investment, trading, or other types of advice or recommendations supplied or endorsed by TradingView. Read more in the Terms of Use.