
Tensorflow tf.dynamic_partition matrix splitting example (Python3)

Let's start with a sample

import tensorflow as tf

raw = tf.constant([1, 2, 3, 4, 5, 6, 6, 5, 4, 3, 2, 1])

'''
Split into [1,2] [3,4] [5,6] [6,5] [4,3] [2,1]
'''
result_1 = tf.dynamic_partition(tf.reshape(raw, [6, 2]), [0, 1, 2, 3, 4, 5], 6)

'''
Split into [1, 2, 3, 4, 5, 6] [6, 5, 4, 3, 2, 1]
'''
result_2 = tf.dynamic_partition(tf.reshape(raw, [2, 6]), [0, 1], 2)

'''
Split into [1] [2] [3] [4] [5] [6] [6] [5] [4] [3] [2] [1]
'''
result_3 = tf.dynamic_partition(tf.reshape(raw, [12, 1]), [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11], 12)

with tf.Session() as sess:
  print(sess.run(result_1))
  print(sess.run(result_2))
  print(sess.run(result_3))

The final result:

[array([[1, 2]]), array([[3, 4]]), array([[5, 6]]), array([[6, 5]]), array([[4, 3]]), array([[2, 1]])]
[array([[1, 2, 3, 4, 5, 6]]), array([[6, 5, 4, 3, 2, 1]])]
[array([[1]]), array([[2]]), array([[3]]), array([[4]]), array([[5]]), array([[6]]), array([[6]]), array([[5]]), array([[4]]), array([[3]]), array([[2]]), array([[1]])]
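A note on the partitions vector (not from the original article): each entry tags the corresponding row of the data with an output index, and rows sharing a tag are gathered into the same output tensor, so the vector does not have to be the identity mapping used in the sample above. A minimal sketch, assuming the same TensorFlow 1.x Session API as the article:

import tensorflow as tf  # assumes TensorFlow 1.x (tf.Session), as in the article

data = tf.constant([[1, 2], [3, 4], [5, 6], [6, 5], [4, 3], [2, 1]])

# Rows tagged 0 go to the first output, rows tagged 1 to the second.
parts = tf.dynamic_partition(data, [0, 1, 0, 1, 0, 1], 2)

with tf.Session() as sess:
  # First group holds rows 0, 2, 4 -> [1 2], [5 6], [4 3];
  # second group holds rows 1, 3, 5 -> [3 4], [6 5], [2 1].
  print(sess.run(parts))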

Here is one more example.

Python 3 code:

# Samples of one-hot functions
import tensorflow as tf

label = tf.placeholder(tf.int32, [None])
# One-hot encode the input sequence directly
one_hot = tf.one_hot(label, 3, 1, 0)
# Transpose
one_hot_new = tf.transpose(one_hot, perm=[1, 0])
one_hot_new = tf.cast(one_hot_new, tf.float32)
# one_hot_new[2] = one_hot_new[2] * 1.5

# Method 1: split off each dimension (row) with its own partition mask
one_hot_new_1 = tf.dynamic_partition(one_hot_new, [0, 1, 1], 2)[0]
one_hot_new_2 = tf.dynamic_partition(one_hot_new, [1, 0, 1], 2)[0]
one_hot_new_3 = tf.dynamic_partition(one_hot_new, [1, 1, 0], 2)[0]

# Method 2: split all three dimensions (rows) with a single partition vector
one_hot_1 = tf.dynamic_partition(one_hot_new, [0, 1, 2], 3)[0]
one_hot_2 = tf.dynamic_partition(one_hot_new, [0, 1, 2], 3)[1]
one_hot_3 = tf.dynamic_partition(one_hot_new, [0, 1, 2], 3)[2]

# one_hot_new_3 = tf.dynamic_partition(one_hot_new, [0, 0, 1], 2)[2]
# Concatenate the two rows split off above back into one tensor
one_hot_new = tf.concat([one_hot_new_1, one_hot_new_2], axis=0)


if __name__ == '__main__':
  with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    one_hot_out, one_hot_new_out, one_hot_new_1_out, one_hot_new_2_out, one_hot_new_3_out, one_hot_1_out, one_hot_2_out, one_hot_3_out = sess.run(
        [one_hot, one_hot_new, one_hot_new_1, one_hot_new_2, one_hot_new_3, one_hot_1, one_hot_2, one_hot_3],
        feed_dict={label: [0, 1, 1, 2, 2, 0, 0, 1, 2, 2, 0, 2]})
    print("Raw One-hot results:")
    print(one_hot_out, end='\n\n')
    print("The results of the above.T:")

    print("Method 1 Splitting:")
    print(one_hot_new_out, end='\n\n')
    print("Split (1) dimension:")
    print(one_hot_new_1_out, end='\n\n')
    print("broken up inseparate items (2)dimension:")
    print(one_hot_new_2_out, end='\n\n')
    print("broken up inseparate items (3)dimension:")
    print(one_hot_new_3_out, end='\n\n')

    print("Method 2 Splitting:")
    print("Split (1) dimension:")
    print(one_hot_1_out, end='\n\n')
    print("broken up inseparate items (2)dimension:")
    print(one_hot_2_out, end='\n\n')
    print("broken up inseparate items (3)dimension:")
    print(one_hot_3_out, end='\n\n')

Console Output:

Raw one-hot result:
[[1 0 0] 
[0 1 0] 
[0 1 0] 
[0 0 1] 
[0 0 1] 
[1 0 0] 
[1 0 0] 
[0 1 0] 
[0 0 1] 
[0 0 1] 
[1 0 0] 
[0 0 1]]

Transposed result (.T) of the above:
Method 1 splitting:
[[ 1. 0. 0. 0. 0. 1. 1. 0. 0. 0. 1. 0.] 
[ 0. 1. 1. 0. 0. 0. 0. 1. 0. 0. 0. 0.]]

Split dimension (1):
[[ 1. 0. 0. 0. 0. 1. 1. 0. 0. 0. 1. 0.]]

Split dimension (2):
[[ 0. 1. 1. 0. 0. 0. 0. 1. 0. 0. 0. 0.]]

Split dimension (3):
[[ 0. 0. 0. 1. 1. 0. 0. 0. 1. 1. 0. 1.]]

Method 2 splitting:
Split dimension (1):
[[ 1. 0. 0. 0. 0. 1. 1. 0. 0. 0. 1. 0.]]

Split dimension (2):
[[ 0. 1. 1. 0. 0. 0. 0. 1. 0. 0. 0. 0.]]

Split dimension (3):
[[ 0. 0. 0. 1. 1. 0. 0. 0. 1. 1. 0. 1.]]
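As a closing aside (not part of the original example), tf.dynamic_stitch is the inverse of tf.dynamic_partition: given the original row indices of each group, it scatters the rows back into place. A minimal sketch, again assuming the TensorFlow 1.x Session API used above:

import tensorflow as tf  # assumes TensorFlow 1.x (tf.Session), as in the article

data = tf.constant([[1., 0., 0.],
                    [0., 1., 0.],
                    [0., 0., 1.]])
partitions = [0, 1, 1]

# Split the rows into two groups, and partition the row indices the same way
# so we remember where every row came from.
parts = tf.dynamic_partition(data, partitions, 2)
indices = tf.dynamic_partition(tf.range(tf.shape(data)[0]), partitions, 2)

# dynamic_stitch puts each row back at its original index.
restored = tf.dynamic_stitch(indices, parts)

with tf.Session() as sess:
  print(sess.run(restored))  # recovers the original 3 x 3 matrix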

That is all for this Tensorflow tf.dynamic_partition matrix splitting example (Python3). I hope it gives you a useful reference, and I appreciate your continued support.