coremltools.models.neural_network.update_optimizer_utils
Neural Network optimizer utilities.
Classes

AdamParams([lr, batch, beta1, beta2, eps])
    Adam - A Method for Stochastic Optimization.

Batch(value[, allowed_set])
    Batch-size parameter, optionally restricted to a set of allowed values.

RangeParam(value[, min, max])
    Scalar parameter, optionally constrained to a [min, max] range.

SgdParams([lr, batch, momentum])
    SGD - Stochastic Gradient Descent optimizer.
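Batch and RangeParam mirror the (value, allowed_set) and (value, min, max) shapes used by the optimizer setters documented below. A minimal sketch of constructing them directly, with illustrative values:

    from coremltools.models.neural_network.update_optimizer_utils import Batch, RangeParam

    # A batch size of 8 that may be changed to any value in the allowed set.
    batch = Batch(8, [1, 2, 8, 16])

    # A learning rate of 0.01, adjustable within [1e-05, 0.1].
    lr = RangeParam(0.01, 1e-05, 0.1)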
class coremltools.models.neural_network.update_optimizer_utils.AdamParams(lr=0.01, batch=10, beta1=0.9, beta2=0.999, eps=1e-08)

Adam - A Method for Stochastic Optimization.
Attributes:

lr: float
    The learning rate, which controls the size of each learning step. Adjustable during training; default: 0.01.
batch: int
    The mini-batch size, the number of examples used to compute a single gradient step; default: 10.
beta1: float
    The exponential decay rate for the first-moment estimates; default: 0.9.
beta2: float
    The exponential decay rate for the second-moment estimates; default: 0.999.
eps: float
    Epsilon, a small constant that prevents division by zero in the implementation; default: 1e-8.
Methods

set_lr(value, min, max)
    Set the value for the learning rate.
set_batch(value, allow_set)
    Set the value for the batch size.
set_beta1(value, min, max)
    Set the value for beta1.
set_beta2(value, min, max)
    Set the value for beta2.
set_eps(value, min, max)
    Set the value for epsilon.
__init__(self, lr=0.01, batch=10, beta1=0.9, beta2=0.999, eps=1e-08)
    x.__init__(…) initializes x; see help(type(x)) for signature.
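A minimal sketch of configuring AdamParams using the constructor and setter signatures listed above; the concrete values, the list passed as allow_set, and the builder call named in the final comment are illustrative assumptions:

    from coremltools.models.neural_network.update_optimizer_utils import AdamParams

    # Start from the defaults, overriding the learning rate and batch size.
    params = AdamParams(lr=0.001, batch=8)

    # Let the learning rate vary within [1e-05, 0.1] during on-device updates.
    params.set_lr(0.001, 1e-05, 0.1)

    # Restrict the mini-batch size to an explicit allowed set (assumed to be a list).
    params.set_batch(8, [1, 2, 8, 16])

    # The configured parameters are then typically attached to an updatable model,
    # e.g. via NeuralNetworkBuilder.set_adam_optimizer(params).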
class coremltools.models.neural_network.update_optimizer_utils.SgdParams(lr=0.01, batch=10, momentum=0)

SGD - Stochastic Gradient Descent optimizer.
Attributes:

lr: float
    The learning rate, which controls the size of each learning step. Adjustable during training; default: 0.01.
batch: int
    The mini-batch size, the number of examples used to compute a single gradient step; default: 10.
momentum: float
    The momentum factor, which helps accelerate gradient updates in the relevant direction; default: 0.
Methods

set_lr(value, min, max)
    Set the value for the learning rate.
set_batch(value, allow_set)
    Set the value for the batch size.
set_momentum(value, min, max)
    Set the value for momentum.
__init__(self, lr=0.01, batch=10, momentum=0)
    x.__init__(…) initializes x; see help(type(x)) for signature.
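As with AdamParams, a short sketch of configuring SgdParams; the concrete values and the builder call in the final comment are illustrative assumptions:

    from coremltools.models.neural_network.update_optimizer_utils import SgdParams

    # Plain SGD with a momentum term.
    params = SgdParams(lr=0.01, batch=10, momentum=0.9)

    # Allow the learning rate and momentum to be tuned within explicit ranges.
    params.set_lr(0.01, 0.001, 0.1)
    params.set_momentum(0.9, 0.0, 1.0)

    # Attach to an updatable model, e.g. via NeuralNetworkBuilder.set_sgd_optimizer(params).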