Python - Set class property to depend on values of other attributes

Posted 2019-07-10 18:49

Question:

Sorry if this already exists somewhere in the question archives, but I'm not sure how to ask it and searching didn't lead to any great revelations.

In Python (2.6.x) I have created a class

class timetuple(object):
    def __init__(self):
        self.weekday = 6
        self.month   = 1
        self.day     = 1
        self.year    = 2011
        self.hour    = 0
        self.min     = 0
        self.sec     = 0
    def jd(self):
        self.jd = julian_date(self)

def julian_date(obj):
    # (code to calculate a Julian Date snipped)
    pass

start = timetuple()
start.day   = 23
start.month = 2
start.year  = 2011
start.hour  = 13
start.min   = 30
start.sec   = 0

print start.__dict__
start.jd()
print start.__dict__
print start.jd

Which returns

{'hour': 13, 'min': 30, 'month': 2, 'sec': 0, 'weekday': 6, 'year': 2011, 'date': 23, 'day': 1}
{'hour': 13, 'min': 30, 'month': 14, 'jd': 2455594.0625, 'sec': 0, 'weekday': 6, 'year': 2010, 'date': 23, 'day': 1}
2455594.0625

So the .jd attribute (or do I call this a function, or a method? I'm unsure of the lingo here, honestly) doesn't exist before the start.jd() call. Is there a way I can rewrite this so that .jd always exists, based on the current values in the timetuple instance, or updates itself whenever .jd is accessed?

I know I can do it the long way by setting a .jd attribute in the __init__(self) method and then doing something like

start = timetuple()
start.jd = julian_date(start)

but I'd like to know how to set up my classes better honestly :)

Answer 1:

You want to define a property, as opposed to a plain instance attribute:

class A(object):

    def __init__(self):
        self.a = 1
        self.b = 1

    @property
    def a_plus_b(self):
        return self.a + self.b

foo = A()
print foo.a_plus_b # prints "2"
foo.a = 3
print foo.a_plus_b # prints "4"
foo.b = 4
print foo.a_plus_b # prints "7"