main.py
```python
model = DepthModel()
params = model.parameters()
optimizer = optim.AdamW(params, weight_decay=args.wd, lr=args.lr)
```
model.py, the first draft: I was really just trying to build the model myself, copying the shape of other examples... QVQ
```python
class DepthModel(nn.Module):

    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x
```
Running it immediately fails with `ValueError: optimizer got an empty parameter list`, so time to hunt down the cause. The PyTorch docs show that an optimizer must be constructed with a non-empty iterable of learnable tensors, e.g.:

```python
optimizer = optim.Adam([var1, var2], lr=0.0001)
```
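To see why the first-draft model triggers the error, here is a minimal reproduction (a sketch; `EmptyModel` is a stand-in for the draft `DepthModel` above): a module that registers no sub-modules and no `nn.Parameter`s yields an empty `parameters()` iterator, which the optimizer rejects.

```python
import torch
import torch.nn as nn
import torch.optim as optim

class EmptyModel(nn.Module):
    """A module with no layers: parameters() yields nothing."""
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x

model = EmptyModel()
print(len(list(model.parameters())))  # 0 -> nothing for the optimizer to update
# Passing this empty iterator to Adam raises the ValueError from above:
# optim.Adam(model.parameters(), lr=1e-4)
```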
Hands-on fix: give the model a learnable parameter, and it works!
```python
class DepthModel(nn.Module):
    # constructor: registering nn.Linear gives the model learnable parameters
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(1, 10)

    def forward(self, x):
        return self.fc(x)
```
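A quick sanity check on the fixed model (a self-contained sketch of the class above): `nn.Linear(1, 10)` registers two parameter tensors, `fc.weight` and `fc.bias`, so `parameters()` is no longer empty and the optimizer constructs without error.

```python
import torch
import torch.nn as nn
import torch.optim as optim

class DepthModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(1, 10)  # registers fc.weight and fc.bias

    def forward(self, x):
        return self.fc(x)

model = DepthModel()
params = list(model.parameters())
print(len(params))  # 2: fc.weight and fc.bias
optimizer = optim.Adam(model.parameters(), lr=1e-4)  # no ValueError now
```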
Summary: when this error appears, check three things. Does the model actually have learnable parameters? Which parameters are learnable? And were they successfully passed to the optimizer?
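The three checks above can be scripted. The model below is a hypothetical stand-in, but the inspection calls (`parameters()`, `named_parameters()`, `optimizer.param_groups`) are standard PyTorch API:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 10), nn.ReLU(), nn.Linear(10, 1))

# 1) Does the model have learnable parameters at all?
assert any(True for _ in model.parameters())

# 2) Which parameters are learnable?
for name, p in model.named_parameters():
    print(name, tuple(p.shape), p.requires_grad)

# 3) Did the parameters actually reach the optimizer?
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
n_in_opt = sum(len(g["params"]) for g in optimizer.param_groups)
assert n_in_opt == len(list(model.parameters()))
```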