Why do we name variables in Tensorflow?

Reference: Stack Overflow

The name parameter is optional (you can create variables and constants with or without it), and the Python identifier you use in your program does not depend on it. Names are helpful in a couple of places:

When you want to save or restore your variables (you can save them to a binary file after the computation). From the docs:

By default, it uses the value of the Variable.name property for each variable

import tensorflow as tf

matrix_1 = tf.Variable([[1, 2], [2, 3]], name="v1")
matrix_2 = tf.Variable([[3, 4], [5, 6]], name="v2")
init = tf.global_variables_initializer()  # tf.initialize_all_variables() is deprecated

saver = tf.train.Saver()

sess = tf.Session()
sess.run(init)
save_path = saver.save(sess, "/tmp/model.ckpt")  # save to a writable directory
sess.close()

Even though your Python variables are named matrix_1 and matrix_2, they are saved as v1 and v2 in the checkpoint file.
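Those stored names are also what tf.train.Saver matches against when restoring. As a minimal sketch (assuming the checkpoint written above exists at /tmp/model.ckpt, and that shapes and dtypes match), the Python identifiers can differ freely as long as the name= arguments line up:

import tensorflow as tf

# Different Python identifiers than before, but name= must match
# the names stored in the checkpoint (v1 and v2).
restored_1 = tf.Variable([[0, 0], [0, 0]], name="v1")
restored_2 = tf.Variable([[0, 0], [0, 0]], name="v2")

saver = tf.train.Saver()

sess = tf.Session()
saver.restore(sess, "/tmp/model.ckpt")  # restore assigns values; no initializer needed
print(sess.run(restored_1))  # => [[1 2] [2 3]]
sess.close()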

Names also appear in TensorBoard, which uses them to label nodes in the graph visualization. You can even group related ops by putting them under the same name scope:

import tensorflow as tf

with tf.name_scope('hidden') as scope:
  a = tf.constant(5, name='alpha')
  W = tf.Variable(tf.random_uniform([1, 2], -1.0, 1.0), name='weights')
  b = tf.Variable(tf.zeros([1]), name='biases')
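The scope name is prefixed onto the name of every op created inside it, which is how TensorBoard knows to group them. Continuing the snippet above, you can verify the resulting names (note the :0 suffix, explained below):

print(a.name)  # => hidden/alpha:0
print(W.name)  # => hidden/weights:0
print(b.name)  # => hidden/biases:0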


How does TensorFlow name tensors?

In TensorFlow, what's the meaning of “:0” in a Variable's name?

It has to do with the representation of tensors in the underlying API. A tensor is a value associated with the output of some op. In the case of variables, there is a Variable op with a single output. An op can have more than one output, so those tensors are referenced as <op>:0, <op>:1, etc. For instance, tf.nn.top_k produces two values (the top values and their indices), so you may see TopKV2:0 and TopKV2:1:

a, b = tf.nn.top_k([1], 1)
print(a.name)  # => 'TopKV2:0' (the top values)
print(b.name)  # => 'TopKV2:1' (their indices)
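The same convention explains a variable's name: the underlying Variable op has exactly one output, so the tensor's name is always the op name followed by :0. A minimal check:

v = tf.Variable(3, name="x")
print(v.name)  # => 'x:0' (output 0 of the op named 'x')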

How to understand the term `tensor` in TensorFlow?