This article shows how to use sharing variables in TensorFlow. But I still had a question: do shared variables hold the same value? To answer this question, I wrote the code below:

```
import tensorflow as tf

#initial = tf.constant(0.1, shape=[1])
initial = tf.truncated_normal(shape=[3], stddev=1, mean=1)
a = [None] * 3
b = [None] * 3
c = [None] * 3
with tf.variable_scope(tf.get_variable_scope()):
    for i in range(3):
        with tf.name_scope("my_%d" % i):
            a[i] = tf.Variable(initial, [3])
            b[i] = tf.Variable(initial, [3])
            c[i] = a[i] + b[i]
        tf.get_variable_scope().reuse_variables()
sess = tf.Session()
sess.run(tf.global_variables_initializer())
curr_a, curr_b, curr_a1, curr_b1, curr_a2, curr_b2, curr_c = sess.run(
    [a[0], b[0], a[1], b[1], a[2], b[2], c[0]],
    feed_dict={a[0]: [0.1, 0.1, 0.1], b[0]: [0.2, 0.2, 0.2]})
print(curr_a, curr_b, curr_a1, curr_b1, curr_a2, curr_b2, curr_c)
```
The result of running this Python code is:

```
(array([ 0.1,  0.1,  0.1], dtype=float32),
 array([ 0.2,  0.2,  0.2], dtype=float32),
 array([ 0.90568691,  1.30992699,  1.49500561], dtype=float32),
 array([ 0.90568691,  1.30992699,  1.49500561], dtype=float32),
 array([ 0.90568691,  1.30992699,  1.49500561], dtype=float32),
 array([ 0.90568691,  1.30992699,  1.49500561], dtype=float32),
 array([ 0.30000001,  0.30000001,  0.30000001], dtype=float32))
```
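The `reuse_variables()` call in the snippet above has no effect, because `tf.Variable` always creates a fresh variable; reuse only applies to `tf.get_variable`. For comparison, here is what actual sharing with `tf.get_variable` looks like. This is a minimal sketch written against the TF 1.x graph API through `tf.compat.v1` so it also runs under TensorFlow 2; under TensorFlow 1.x, plain `tf` works the same way:

```
import tensorflow as tf

tf1 = tf.compat.v1            # TF 1.x graph API
tf1.disable_eager_execution()

# First call creates the variable "shared/w".
with tf1.variable_scope("shared"):
    w1 = tf1.get_variable("w", shape=[3], initializer=tf1.zeros_initializer())

# With reuse=True, get_variable returns the existing variable, not a new one.
with tf1.variable_scope("shared", reuse=True):
    w2 = tf1.get_variable("w")

print(w1 is w2)   # True: both names refer to one and the same variable
```

Because `w1` and `w2` are literally the same object, any update to one is visible through the other, with no copying needed.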

Here `a[0]` and `b[0]` print the values we fed in, because `feed_dict` overrides a variable's value for that run, so `c[0] = a[0] + b[0] = [0.3, 0.3, 0.3]`. Meanwhile `a[1]`, `b[1]`, `a[2]`, and `b[2]` all print identical random values: every variable was built from the same `initial` tensor, and the `truncated_normal` op is evaluated only once per `session.run`, so all initializers received one identical sample. Therefore, the "sharing variables" mechanism is mainly a convenience for writing concise code that builds multiple copies of a model. To make two distinct variables hold the same value, we still need an 'assign' operation.
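Copying one variable's value into another with an explicit assign op can be sketched as follows. As above, `tf.compat.v1` is an assumption made so the snippet also runs under TensorFlow 2; under TF 1.x, plain `tf` works the same way:

```
import tensorflow as tf

tf1 = tf.compat.v1            # TF 1.x graph API
tf1.disable_eager_execution()

# Two independent variables with different initial values.
v1 = tf1.Variable([1.0, 2.0, 3.0])
v2 = tf1.Variable([0.0, 0.0, 0.0])

# The assign op copies v1's current value into v2 when it is run.
copy_v1_to_v2 = tf1.assign(v2, v1)

sess = tf1.Session()
sess.run(tf1.global_variables_initializer())
sess.run(copy_v1_to_v2)
print(sess.run(v2))   # -> [1. 2. 3.]
```

Note that this copies a snapshot: if `v1` changes later, `v2` keeps its old value until the assign op is run again.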
