pytorch - How to view Adam's adapted learning rate in PyTorch?

Tags: pytorch

There are many different optimizers with adaptive learning rate methods. Is it possible to see the value to which Adam has adapted the initial learning rate?

Here is a similar question about Adadelta, where the answer is to look up the ["acc_delta"] key, but Adam does not have that key.
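
For reference, a quick check (my own snippet, not from the linked answer) of which state entries Adam does keep: only the step count and the two moment estimates, so there is no ready-made learning-rate entry to read off the way Adadelta's 'acc_delta' can be.

import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(1, 1)
optimizer = optim.Adam(model.parameters(), lr=1e-3)
model(torch.randn(4, 1)).sum().backward()
optimizer.step()  # state is populated only after the first step
for p in model.parameters():
    print(list(optimizer.state[p].keys()))  # e.g. ['step', 'exp_avg', 'exp_avg_sq']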

Best Answer

AFAIK there is no super simple way to do this. However, you can recompute the current learning rate of a given parameter using the Adam implementation in PyTorch: https://pytorch.org/docs/stable/_modules/torch/optim/adam.html
I came up with this minimal working example:

import torch
import torch.nn as nn
import torch.optim as optim

def get_current_lr(optimizer, group_idx, parameter_idx):
    # Adam adapts its step size per parameter, so pick the parameter
    # group and the parameter first.
    group = optimizer.param_groups[group_idx]
    p = group['params'][parameter_idx]

    beta1, _ = group['betas']
    state = optimizer.state[p]

    step = state['step']
    if torch.is_tensor(step):  # newer PyTorch versions store the step count as a tensor
        step = step.item()

    # This matches the `step_size` variable inside PyTorch's Adam implementation.
    bias_correction1 = 1 - beta1 ** step
    current_lr = group['lr'] / bias_correction1
    return current_lr

x = torch.randn(100, 1)  # just create a random tensor as input
model = nn.Linear(1, 1)
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=1e-3)
niter = 20
for _ in range(niter):
    out = model(x)

    optimizer.zero_grad()
    loss = criterion(out, x)  # here we learn the identity mapping
    loss.backward()
    optimizer.step()
    group_idx, param_idx = 0, 0
    current_lr = get_current_lr(optimizer, group_idx, param_idx)
    print('Current learning rate (g:%d, p:%d): %.4f | Loss: %.4f' % (group_idx, param_idx, current_lr, loss.item()))
It should output something like this:
Current learning rate (g:0, p:0): 0.0100 | Loss: 0.5181
Current learning rate (g:0, p:0): 0.0053 | Loss: 0.5161
Current learning rate (g:0, p:0): 0.0037 | Loss: 0.5141
Current learning rate (g:0, p:0): 0.0029 | Loss: 0.5121
Current learning rate (g:0, p:0): 0.0024 | Loss: 0.5102
Current learning rate (g:0, p:0): 0.0021 | Loss: 0.5082
Current learning rate (g:0, p:0): 0.0019 | Loss: 0.5062
Current learning rate (g:0, p:0): 0.0018 | Loss: 0.5042
Current learning rate (g:0, p:0): 0.0016 | Loss: 0.5023
Current learning rate (g:0, p:0): 0.0015 | Loss: 0.5003
Current learning rate (g:0, p:0): 0.0015 | Loss: 0.4984
Current learning rate (g:0, p:0): 0.0014 | Loss: 0.4964
Current learning rate (g:0, p:0): 0.0013 | Loss: 0.4945
Current learning rate (g:0, p:0): 0.0013 | Loss: 0.4925
Current learning rate (g:0, p:0): 0.0013 | Loss: 0.4906
Current learning rate (g:0, p:0): 0.0012 | Loss: 0.4887
Current learning rate (g:0, p:0): 0.0012 | Loss: 0.4868
Current learning rate (g:0, p:0): 0.0012 | Loss: 0.4848
Current learning rate (g:0, p:0): 0.0012 | Loss: 0.4829
Current learning rate (g:0, p:0): 0.0011 | Loss: 0.4810
Note that monitoring the learning rate of every single parameter is probably neither feasible nor helpful for larger models.
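
If you do want the genuinely per-element view, the effective step size also involves the second-moment estimate: Adam (without amsgrad or weight decay) updates each element by lr / bias_correction1 * exp_avg / (sqrt(exp_avg_sq / bias_correction2) + eps). Here is a minimal sketch of a helper that reconstructs this from the optimizer state (my own extension, not part of the original answer):

import torch

def effective_step_sizes(optimizer, group_idx=0, parameter_idx=0):
    # Reconstruct Adam's full per-element step size (no amsgrad, no weight decay):
    # lr / bias_correction1 / (sqrt(exp_avg_sq / bias_correction2) + eps).
    # Returns a tensor with the same shape as the parameter.
    group = optimizer.param_groups[group_idx]
    p = group['params'][parameter_idx]
    state = optimizer.state[p]

    beta1, beta2 = group['betas']
    step = state['step']
    if torch.is_tensor(step):  # newer PyTorch versions store the step count as a tensor
        step = step.item()

    bias_correction1 = 1 - beta1 ** step
    bias_correction2 = 1 - beta2 ** step
    denom = (state['exp_avg_sq'] / bias_correction2).sqrt() + group['eps']
    return group['lr'] / bias_correction1 / denom

In the training loop above you could, for example, log effective_step_sizes(optimizer).mean() and .max() per iteration to get a summary rather than one value per element.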

The original question can be found on Stack Overflow: https://stackoverflow.com/questions/61773139/
