What is the difference between tf.keras.layers and tf.layers?

Posted 2020-07-07 11:04

Question:

What is the difference between tf.keras.layers and tf.layers?
E.g. both of them have Conv2D; do they produce different outputs?
Are there any benefits to mixing them (something like a tf.keras.layers.Conv2D in one hidden layer and tf.layers.max_pooling2d in the next)?

Answer 1:

Since TensorFlow 1.12, tf.layers are merely wrappers around tf.keras.layers.

A few examples:

The convolutional tf.layers just inherit from the convolutional tf.keras.layers; see the source code:

@tf_export('layers.Conv2D')
class Conv2D(keras_layers.Conv2D, base.Layer):

The same is true for all core tf.layers, e.g.:

@tf_export('layers.Dense')
class Dense(keras_layers.Dense, base.Layer):

With the integration of Keras into TensorFlow, it would make little sense to maintain several different layer implementations. tf.keras is becoming the de facto high-level API for TensorFlow; therefore, tf.layers are now just wrappers around tf.keras.layers.
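
As a rough illustration (a minimal sketch, assuming TensorFlow 1.12+ in graph mode), the subclass relationship can be checked directly, and building the same convolution through either API produces equivalent layers:

import tensorflow as tf

# In TF 1.12+ the tf.layers class is a subclass of the tf.keras one
print(issubclass(tf.layers.Conv2D, tf.keras.layers.Conv2D))  # True

inputs = tf.placeholder(tf.float32, shape=(None, 28, 28, 1))

# Functional tf.layers wrapper ...
x1 = tf.layers.conv2d(inputs, filters=32, kernel_size=3, activation=tf.nn.relu)

# ... and the underlying tf.keras layer applied to the same tensor
x2 = tf.keras.layers.Conv2D(filters=32, kernel_size=3, activation=tf.nn.relu)(inputs)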



Answer 2:

tf.keras.layers.Conv2D is a tf.keras layer, while tf.layers.max_pooling2d is a TensorFlow "native" layer.

You cannot use a native layer directly within a Keras model, as it will be missing certain attributes required by the Keras API.

However, it is possible to use a native layer if it is wrapped in a tf.keras Lambda layer. A link to the documentation is below.

https://www.tensorflow.org/api_docs/python/tf/keras/layers/Lambda
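
For concreteness, here is a minimal sketch (assuming TensorFlow 1.x) of the Lambda approach: the functional tf.layers.max_pooling2d call is wrapped so it can sit inside a tf.keras model next to regular Keras layers. In practice tf.keras.layers.MaxPooling2D would be the usual choice; this only shows that the wrapping works.

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation='relu', input_shape=(28, 28, 1)),
    # Native tf.layers call wrapped in a Keras Lambda layer
    tf.keras.layers.Lambda(lambda x: tf.layers.max_pooling2d(x, pool_size=2, strides=2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.summary()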



Answer 3:

The tf.layers module is TensorFlow's attempt at creating a Keras-like API, whereas tf.keras.layers is a compatibility wrapper. In fact, most of the implementation refers back to tf.layers; for example, tf.keras.layers.Dense inherits the core implementation:

@tf_export('keras.layers.Dense')
class Dense(tf_core_layers.Dense, Layer):
  # ...

Because the tf.keras compatibility module is checked into the TensorFlow repo separately, it might lag behind what Keras actually offers. I would use Keras directly or tf.layers, but not mix them.
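
To make the "don't mix them" advice concrete, a rough sketch (assuming TensorFlow 1.x) of the same small network written once purely with tf.layers and once purely with tf.keras:

import tensorflow as tf

# Pure tf.layers (functional, graph-mode style)
inputs = tf.placeholder(tf.float32, shape=(None, 784))
hidden = tf.layers.dense(inputs, 128, activation=tf.nn.relu)
logits = tf.layers.dense(hidden, 10)

# Pure tf.keras (layer objects assembled into a model)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu', input_shape=(784,)),
    tf.keras.layers.Dense(10),
])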