Save or export weights and biases in TensorFlow for use in a non-Python environment

Posted 2019-06-23 15:53

I've built a neural network that performs reasonably well, and I'd like to replicate my model in a non-Python environment. I set up my network as follows:

sess = tf.InteractiveSession()
# placeholders: 23 input features, 2-class one-hot labels
x = tf.placeholder(tf.float32, shape=[None, 23])
y_ = tf.placeholder(tf.float32, shape=[None, 2])
# weights and bias of a single softmax layer
W = tf.Variable(tf.zeros([23, 2]))
b = tf.Variable(tf.zeros([2]))
sess.run(tf.global_variables_initializer())
y = tf.nn.softmax(tf.matmul(x, W) + b)

How can I obtain a decipherable .csv or .txt of my weights and biases?

EDIT: Below is my full script:

import csv
import numpy
import tensorflow as tf

# load the CSV, convert every entry to float, and shuffle the rows
data = list(csv.reader(open("/Users/sjayaram/developer/TestApp/out/production/TestApp/data.csv")))
data = [[float(j) for j in i] for i in data]
numpy.random.shuffle(data)
results = data

# inputs: drop the two label columns (23 and 24) from the data
data = numpy.delete(data, [23, 24], 1)
# labels: drop the 23 feature columns from the results
results = numpy.delete(results, range(23), 1)

sess = tf.InteractiveSession()
x = tf.placeholder(tf.float32, shape=[None, 23])
y_ = tf.placeholder(tf.float32, shape=[None, 2])
W = tf.Variable(tf.zeros([23,2]))
b = tf.Variable(tf.zeros([2]))
sess.run(tf.global_variables_initializer())
y = tf.nn.softmax(tf.matmul(x,W) + b)
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

#train the model, holding out the last 80 entries for testing
#batch size: 40
for i in range(0, 3680, 40):
  train_step.run(feed_dict={x: data[i:i+40], y_: results[i:i+40]})

correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
print(accuracy.eval(feed_dict={x: data[3680:], y_: results[3680:]}))

1 Answer

SAY GOODBYE
Answered 2019-06-23 16:21

You can fetch the variables as NumPy arrays, and use numpy.savetxt() to write out the contents as text or CSV:

import numpy as np

# evaluate the variables to get their current values as NumPy arrays
W_val, b_val = sess.run([W, b])

# write each array to its own comma-separated file
np.savetxt("W.csv", W_val, delimiter=",")
np.savetxt("b.csv", b_val, delimiter=",")

Note that this approach is unlikely to perform as well as TensorFlow's native replication mechanisms in the distributed runtime.
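
If you want to reload the parameters in TensorFlow itself rather than export them as text, the usual route is a binary checkpoint via tf.train.Saver; a minimal sketch, with an example checkpoint path:

saver = tf.train.Saver()
# write a checkpoint containing every variable in the graph, including W and b
save_path = saver.save(sess, "/tmp/model.ckpt")

# later, in a session that has rebuilt the same graph:
saver.restore(sess, "/tmp/model.ckpt")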
