Custom Objective Function Keras

Asked 2019-05-30 10:39

I need to define my own loss function. I am using a GAN model, and my loss will combine an adversarial loss with an L1 loss between the true and generated images.

I tried to write such a function, but I get the following error:

ValueError: ('Could not interpret loss function identifier:', Elemwise{add,no_inplace}.0)

My loss function is:

import math
import numpy as np
from PIL import Image, ImageChops
from keras import backend as K

def loss_function(y_true, y_pred, y_true1, y_pred1):
    # Adversarial term: average of -log(prediction) over the batch of 64
    # (b / y_true1 is read but not used in this term)
    bce = 0
    for i in range(64):
        a = y_pred1[i]
        b = y_true1[i]
        x = K.log(a)
        bce = bce - x
    bce /= 64
    print('bce = ', bce)

    # Image term: RMS pixel difference between each generated/true pair
    loss = 0
    for img, img1 in zip(y_pred, y_true):
        # rescale the generated image from [-1, 1] back to [0, 255]
        image = img[0, :, :] * 127.5 + 127.5
        imgfinal = Image.fromarray(image.astype(np.uint8))

        # rescale the true image the same way
        image1 = img1[0, :, :] * 127.5 + 127.5
        imgfinal1 = Image.fromarray(image1.astype(np.uint8))

        # RMS difference computed from the histogram of the per-pixel difference
        diff = ImageChops.difference(imgfinal, imgfinal1)
        h = diff.histogram()
        sq = (value * ((idx % 256) ** 2) for idx, value in enumerate(h))
        sum_of_squares = sum(sq)
        lossr = math.sqrt(sum_of_squares / float(imgfinal.size[0] * imgfinal.size[1]))
        loss = loss + lossr

    loss /= (64 * 127)
    print('loss = ', loss)

    return bce + loss

1 Answer

三岁会撩人 · 2019-05-30 11:13

From your comment you say you are passing your custom function to the compile operation like this:

discriminator_on_generator.compile(loss = loss_function(y_true ,y_pred ,y_true1 ,y_pred1), optimizer=g_optim)

However, according to the docs, you should pass only the function object itself:

discriminator_on_generator.compile(loss = loss_function, optimizer=g_optim)

You can take a look at this GitHub discussion, where they also show how to use custom loss functions.
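
For reference, a Keras custom loss only needs the (y_true, y_pred) signature and must return a tensor built from backend ops; NumPy and PIL calls cannot run on the symbolic tensors Keras passes in. A minimal sketch of the L1 part of your loss (the name l1_loss is just for illustration):

from keras import backend as K

# mean absolute (L1) difference between true and generated images
def l1_loss(y_true, y_pred):
    return K.mean(K.abs(y_pred - y_true))

discriminator_on_generator.compile(loss=l1_loss, optimizer=g_optim)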

Note: since your function takes 4 parameters while Keras expects a loss function with exactly 2, you can do what is suggested in this GitHub issue, which involves defining a wrapper function that captures the extra parameters, something like:

def loss_function(y_true1, y_pred1):
    def my_nested_function(y_true, y_pred):
        # now you can work with all 4 variables here;
        # compute and return the loss tensor
        ...
    return my_nested_function  # return the 2-argument function Keras expects
and pass it as a parameter when compiling:

discriminator_on_generator.compile(loss=loss_function(y_true1, y_pred1), optimizer=g_optim)
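
As a concrete sketch of this pattern (assuming, for illustration, that y_pred1 holds discriminator probabilities available as a tensor at compile time, and using the -log(p) adversarial term plus the L1 image term from your question):

from keras import backend as K

def loss_function(y_true1, y_pred1):
    # adversarial term computed once from the captured tensors;
    # K.epsilon() guards against log(0)
    adversarial = -K.mean(K.log(y_pred1 + K.epsilon()))

    def my_nested_function(y_true, y_pred):
        # L1 term between true and generated images, plus the captured term
        return adversarial + K.mean(K.abs(y_true - y_pred))

    return my_nested_function

Note that the extra arguments are baked in when compile is called, so they must already exist as tensors (or constants) at that point.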

Alternatively, you could merge your 4 parameters into 2 (y_true, y_pred) and then, inside your single function, split them back into your 4 variables (y_true, y_pred, y_true1, y_pred1), as they also discuss in that issue.
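
A sketch of that merging idea, under the assumption (mine, for illustration) that you can pack the image and the auxiliary target into one tensor along the last axis when preparing the data:

from keras import backend as K

def combined_loss(y_true, y_pred):
    # assumed packing: channel 0 carries the image,
    # channel 1 carries the auxiliary target/prediction
    img_true, aux_true = y_true[..., 0], y_true[..., 1]
    img_pred, aux_pred = y_pred[..., 0], y_pred[..., 1]
    adversarial = K.mean(K.binary_crossentropy(aux_true, aux_pred))
    l1 = K.mean(K.abs(img_true - img_pred))
    return adversarial + l1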
