PyTorch ValueError: optimizer got an empty parameter list

Posted 2020-02-14 06:00

When trying to create a neural network and optimize it using PyTorch, I am getting

ValueError: optimizer got an empty parameter list

Here is the code.

import torch.nn as nn
import torch.nn.functional as F
from os.path import dirname
from os import getcwd
from os.path import realpath
from sys import argv

class NetActor(nn.Module):

    def __init__(self, args, state_vector_size, action_vector_size, hidden_layer_size_list):
        super(NetActor, self).__init__()
        self.args = args

        self.state_vector_size = state_vector_size
        self.action_vector_size = action_vector_size
        self.layer_sizes = hidden_layer_size_list
        self.layer_sizes.append(action_vector_size)

        self.nn_layers = []
        self._create_net()

    def _create_net(self):
        prev_layer_size = self.state_vector_size
        for next_layer_size in self.layer_sizes:
            next_layer = nn.Linear(prev_layer_size, next_layer_size)
            prev_layer_size = next_layer_size
            self.nn_layers.append(next_layer)

    def forward(self, torch_state):
        activations = torch_state
        for i,layer in enumerate(self.nn_layers):
            if i != len(self.nn_layers)-1:
                activations = F.relu(layer(activations))
            else:
                activations = layer(activations)

        probs = F.softmax(activations, dim=-1)
        return probs

and then the call

        self.actor_nn = NetActor(self.args, 4, 2, [128])
        self.actor_optimizer = optim.Adam(self.actor_nn.parameters(), lr=args.learning_rate)

gives the very informative error

ValueError: optimizer got an empty parameter list

I find it hard to understand what exactly in the network's definition makes the network have parameters.

I am following and expanding the example I found in PyTorch's tutorial code.

I can't really tell the difference between my code and theirs that makes mine think it has no parameters to optimize.

How to make my network have parameters like the linked example?

1 Answer
兄弟一词,经得起流年.
Answered 2020-02-14 06:31

Your NetActor does not directly store any nn.Parameter. Moreover, all the layers it eventually uses in forward are stored in a plain Python list, self.nn_layers. Plain lists are invisible to nn.Module: only attributes that are themselves modules (or parameters, or module containers) are registered, so self.actor_nn.parameters() finds nothing to return.
If you want parameters() to see the items stored in self.nn_layers, you should use one of PyTorch's containers. Specifically, making self.nn_layers an nn.ModuleList instead of a plain list should solve your problem:

self.nn_layers = nn.ModuleList()
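For reference, here is a minimal sketch of the fixed class (the unused args and os/sys imports are dropped for brevity; assumes a standard PyTorch install). With nn.ModuleList, each appended nn.Linear is registered as a submodule, so its weights and biases show up in parameters():

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetActor(nn.Module):
    def __init__(self, state_vector_size, action_vector_size, hidden_layer_size_list):
        super().__init__()
        layer_sizes = list(hidden_layer_size_list) + [action_vector_size]
        # nn.ModuleList registers every layer appended to it as a submodule,
        # so their weights and biases are returned by self.parameters().
        self.nn_layers = nn.ModuleList()
        prev_size = state_vector_size
        for next_size in layer_sizes:
            self.nn_layers.append(nn.Linear(prev_size, next_size))
            prev_size = next_size

    def forward(self, torch_state):
        activations = torch_state
        for i, layer in enumerate(self.nn_layers):
            if i != len(self.nn_layers) - 1:
                activations = F.relu(layer(activations))
            else:
                activations = layer(activations)
        return F.softmax(activations, dim=-1)

actor = NetActor(4, 2, [128])
params = list(actor.parameters())
print(len(params))  # 4 tensors: weight + bias for each of the two Linear layers
```

If you do not need the per-layer control in forward, an nn.Sequential built from the same layers would register the parameters equally well.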