python - What is the purpose of adding expcost in stochastic gradient descent?

Tags: python neural-network gradient-descent

I am trying to implement SGD based on the scaffold Stanford provides in their first CS224n assignment. The implementation is in Python. The scaffold is as follows:

def load_saved_params():
    '''A helper function that loads previously saved parameters and resets
    iteration start.'''
    return st, params, state  # st = starting iteration

def save_params(iter, params):
    '''saves the parameters'''

Now the main function (I have marked the statements of interest with runs of hash symbols):

def sgd(f, x0, step, iterations, postprocessing=None, useSaved=False,
        PRINT_EVERY=10):
    """ Stochastic Gradient Descent

    Implement the stochastic gradient descent method in this function.

    Arguments:
    f -- the function to optimize, it should take a single
         argument and yield two outputs, a cost and the gradient
         with respect to the arguments
    x0 -- the initial point to start SGD from
    step -- the step size for SGD
    iterations -- total iterations to run SGD for
    postprocessing -- postprocessing function for the parameters
                      if necessary. In the case of word2vec we will need to
                      normalize the word vectors to have unit length.
    PRINT_EVERY -- specifies how many iterations to output loss

    Return:
    x -- the parameter value after SGD finishes
    """

    # Anneal learning rate every several iterations
    ANNEAL_EVERY = 20000

    if useSaved:
        start_iter, oldx, state = load_saved_params()
        if start_iter > 0:
            x0 = oldx
            step *= 0.5 ** (start_iter / ANNEAL_EVERY)

        if state:
            random.setstate(state)
    else:
        start_iter = 0

    x = x0

    if not postprocessing:
        postprocessing = lambda x: x

    expcost = None ######################################################

    for iter in xrange(start_iter + 1, iterations + 1):
        # Don't forget to apply the postprocessing after every iteration!
        # You might want to print the progress every few iterations.

        cost = None

        ### END YOUR CODE

        if iter % PRINT_EVERY == 0:
            if not expcost:
                expcost = cost
            else:
                expcost = .95 * expcost + .05 * cost ########################
            print "iter %d: %f" % (iter, expcost)

        if iter % SAVE_PARAMS_EVERY == 0 and useSaved:
            save_params(iter, x)

        if iter % ANNEAL_EVERY == 0:
            step *= 0.5

    return x
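One subtlety worth noting in the scaffold above is how it restores the learning rate when resuming from saved parameters: multiplying the initial step by `0.5 ** (start_iter / ANNEAL_EVERY)` (integer division, since this is Python 2) reproduces in one shot every halving that would have happened during the first `start_iter` iterations. A small Python 3 sketch checking this equivalence (using `//` to match Python 2's integer division; `step_after` is a hypothetical helper, not part of the assignment):

```python
ANNEAL_EVERY = 20000

def step_after(iterations, step0):
    """Step size after `iterations` iterations, halving every ANNEAL_EVERY,
    computed the slow way: by replaying the loop's annealing branch."""
    step = step0
    for it in range(1, iterations + 1):
        if it % ANNEAL_EVERY == 0:
            step *= 0.5
    return step

def resumed_step(start_iter, step0):
    """The scaffold's one-shot recovery of the step size when resuming."""
    return step0 * 0.5 ** (start_iter // ANNEAL_EVERY)

# Both paths agree, including just before and just after an anneal boundary.
for start in (0, 19999, 20000, 50000):
    assert step_after(start, 0.3) == resumed_step(start, 0.3)
print("ok")
```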

For my own purposes I did not use expcost. But what is the purpose of expcost in this code? In what situations would it be used? And why is the cost computed by the cost function modified this way?

Best answer

If you look closely, expcost is only used for printing the cost. It is just a way of smoothing the cost for display, since the per-batch cost can jump around significantly between batches even while the model is improving.
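Concretely, the line `expcost = .95 * expcost + .05 * cost` is an exponential moving average of the per-batch cost, seeded with the first cost seen. A minimal, self-contained sketch of the same smoothing (the synthetic cost curve and the `smooth_costs` helper are illustrative, not from the assignment):

```python
import random

def smooth_costs(costs, beta=0.95):
    """Exponentially smooth a sequence of per-batch costs, mirroring
    the scaffold's update: expcost = beta * expcost + (1 - beta) * cost,
    seeded with the first observed cost."""
    expcost = None
    smoothed = []
    for cost in costs:
        if expcost is None:
            expcost = cost
        else:
            expcost = beta * expcost + (1 - beta) * cost
        smoothed.append(expcost)
    return smoothed

# A noisy but steadily decreasing cost curve: a trend plus per-batch noise.
random.seed(0)
raw = [1.0 / (1 + 0.01 * i) + random.uniform(-0.2, 0.2) for i in range(200)]
smoothed = smooth_costs(raw)

# The smoothed curve moves far less step-to-step than the raw one,
# so printed progress reflects the trend rather than batch noise.
raw_jumps = sum(abs(a - b) for a, b in zip(raw, raw[1:]))
smooth_jumps = sum(abs(a - b) for a, b in zip(smoothed, smoothed[1:]))
print(smooth_jumps < raw_jumps)
```

With beta = 0.95, each printed value is dominated by recent history (roughly the last 20 batches), which is why a single unlucky batch barely moves it.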

Regarding "python - What is the purpose of adding expcost in stochastic gradient descent", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/45948279/
