min() arg empty sequence

Published 2019-09-05 21:43

I am learning object-oriented concepts in Python. Below, I have created a class with a method in it. I am trying to find the minimum of a list both by calling the min() function directly and by calling the class method findMin().

The below code gives me an error:

min() arg is an empty sequence

Please tell me what I am missing here.

class Solution:
    nums = list()
    def findMin(self, nums):
        self.nums.sort()
        out = min(self.nums)
        return out

x = [4,5,6,7,0,1,2]
y = Solution()

print(min(x))
print(y.findMin(x))
print(len(x))
print(type(y))
print(dir(y))

1 Answer
Ridiculous · 2019-09-05 22:39

You are confusing nums with self.nums.

When you write:

    nums = list()

You are setting a variable on the class itself, i.e. a class attribute shared by all instances.
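A minimal sketch of what that class-level assignment means (the class name `Demo` is hypothetical, chosen just for illustration):

```python
class Demo:
    nums = list()  # class attribute: one list shared by every instance

a = Demo()
b = Demo()
a.nums.append(1)            # mutates the shared class-level list
print(b.nums)               # [1] -- b sees the very same list
print(Demo.nums is a.nums)  # True
```

Because the list lives on the class, every instance reads (and mutates) the same object.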

When you write:

    def findMin(self, nums):

You are receiving the parameter in a local variable, nums.

When you then write self.nums on the next two lines, Python looks for an attribute named nums on the instance; since the instance has none, the lookup falls back to the class attribute nums, which is still the empty list defined at class level.

As such, you are essentially sorting an empty list and then trying to find its minimum. This isn't going to work, since there's no value in an empty list to find the minimum of.

Hence, the error that you see:

ValueError: min() arg is an empty sequence
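You can reproduce that error in isolation, with no class involved at all:

```python
try:
    min([])
except ValueError as e:
    # min() refuses an empty sequence unless you pass a default
    print(e)  # min() arg is an empty sequence
```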

To solve this, use nums rather than self.nums inside findMin, so that you reference the parameter rather than the (empty) class attribute.
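One way the fixed method might look (note that the sort is unnecessary anyway, since min() scans the whole list regardless of order):

```python
class Solution:
    def findMin(self, nums):
        # operate on the parameter, not a class attribute
        return min(nums)

x = [4, 5, 6, 7, 0, 1, 2]
print(Solution().findMin(x))  # 0
```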
