
Input type and weight type should be the same #599

Open
@mahmoodn

Description


For the super_resolve run, I get this error:

$ python super_resolve.py --input_image dataset/BSDS300/images/test/16077.jpg --model model_epoch_30.pth --output_filename out.png
Namespace(cuda=False, input_image='dataset/BSDS300/images/test/16077.jpg', model='model_epoch_30.pth', output_filename='out.png')
Traceback (most recent call last):
  File "super_resolve.py", line 29, in <module>
    out = model(input)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 491, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/mahmood/cactus/pt/pytorch/examples/super_resolution/model.py", line 20, in forward
    x = self.relu(self.conv1(x))
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/module.py", line 491, in __call__
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/conv.py", line 339, in forward
    self.padding, self.dilation, self.groups)
RuntimeError: Input type (torch.FloatTensor) and weight type (torch.cuda.FloatTensor) should be the same

Any idea?

P.S.: Although the readme says model_epoch_500.pth, with --nEpochs 30 I see model_epoch_30.pth.

Activity

romanoss commented on Sep 21, 2019

Not an expert, but you can resolve it by passing the --cuda arg in the inference command.
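
For example, assuming the model was trained with --cuda (so its weights are torch.cuda.FloatTensor), the inference command would be:

$ python super_resolve.py --input_image dataset/BSDS300/images/test/16077.jpg --model model_epoch_30.pth --output_filename out.png --cuda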

msaroufim commented on Mar 10, 2022

@romanoss is correct, although it may be worth making a simple change so this example also works on CPU.
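
The error occurs because the checkpoint holds torch.cuda.FloatTensor weights while the input tensor stays on the CPU. A minimal sketch of such a change in super_resolve.py, assuming the script's existing opt namespace and input tensor (names taken from the --cuda flag and the traceback above), might look like:

import torch

# Pick a device: fall back to CPU when --cuda is not given or no GPU is available
device = torch.device("cuda" if opt.cuda and torch.cuda.is_available() else "cpu")

# map_location remaps GPU-saved weights onto the chosen device
model = torch.load(opt.model, map_location=device)

# Keep the input tensor on the same device as the weights
input = input.to(device)
out = model(input)

With this, the example would run on CPU even for a checkpoint produced with --cuda.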

