tf.concat giving 'Shape must be at least rank 2 but is rank 1' error even if both tensors are the same shape



























I'm trying to concatenate two TensorFlow constants of the same shape, but I'm getting an error. Here's the code (edited to make the initial values explicit):



import tensorflow as tf

b1 = tf.constant(value=[5, 8])
b2 = tf.constant(value=[6, 9])
b3 = tf.concat([b1, b2], 1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run([b3]))


This gives the following error:



---------------------------------------------------------------------------
InvalidArgumentError Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs)
1658 try:
-> 1659 c_op = c_api.TF_FinishOperation(op_desc)
1660 except errors.InvalidArgumentError as e:

InvalidArgumentError: Shapes must be equal rank, but are 2 and 1
From merging shape 0 with other shapes. for 'stack_38' (op: 'Pack') with input shapes: [2,2], [2].

During handling of the above exception, another exception occurred:

ValueError Traceback (most recent call last)
<ipython-input-96-3acc40ce0738> in <module>()
1 c1 = [[5,8], [7,4]]
2 c2 = [6,9]
----> 3 c3= tf.stack( [c1, c2] )
4 with tf.Session( ) as sess:
5 sess.run(tf.global_variables_initializer())

/usr/local/lib/python3.6/dist-packages/tensorflow/python/util/dispatch.py in wrapper(*args, **kwargs)
178 """Call target, and fall back on dispatchers if there is a TypeError."""
179 try:
--> 180 return target(*args, **kwargs)
181 except (TypeError, ValueError):
182 # Note: convert_to_eager_tensor currently raises a ValueError, not a

/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/array_ops.py in stack(values, axis, name)
1003 expanded_num_dims))
1004
-> 1005 return gen_array_ops.pack(values, axis=axis, name=name)
1006
1007

/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/gen_array_ops.py in pack(values, axis, name)
5446 axis = _execute.make_int(axis, "axis")
5447 _, _, _op = _op_def_lib._apply_op_helper(
-> 5448 "Pack", values=values, axis=axis, name=name)
5449 _result = _op.outputs[:]
5450 _inputs_flat = _op.inputs

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/op_def_library.py in _apply_op_helper(self, op_type_name, name, **keywords)
786 op = g.create_op(op_type_name, inputs, output_types, name=scope,
787 input_types=input_types, attrs=attr_protos,
--> 788 op_def=op_def)
789 return output_structure, op_def.is_stateful, op
790

/usr/local/lib/python3.6/dist-packages/tensorflow/python/util/deprecation.py in new_func(*args, **kwargs)
499 'in a future version' if date is None else ('after %s' % date),
500 instructions)
--> 501 return func(*args, **kwargs)
502
503 doc = _add_deprecated_arg_notice_to_docstring(

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in create_op(***failed resolving arguments***)
3298 input_types=input_types,
3299 original_op=self._default_original_op,
-> 3300 op_def=op_def)
3301 self._create_op_helper(ret, compute_device=compute_device)
3302 return ret

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in __init__(self, node_def, g, inputs, output_types, control_inputs, input_types, original_op, op_def)
1821 op_def, inputs, node_def.attr)
1822 self._c_op = _create_c_op(self._graph, node_def, grouped_inputs,
-> 1823 control_input_ops)
1824
1825 # Initialize self._outputs.

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs)
1660 except errors.InvalidArgumentError as e:
1661 # Convert to ValueError for backwards compatibility.
-> 1662 raise ValueError(str(e))
1663
1664 return c_op

ValueError: Shapes must be equal rank, but are 2 and 1
From merging shape 0 with other shapes. for 'stack_38' (op: 'Pack') with input shapes: [2,2], [2].


Even though the two tensors are exactly the same shape. With axis=0 it works, and it also works if I replace the tensors with regular NumPy arrays of the same numbers, but somehow the combination of TensorFlow constants and axis=1 causes an issue.










Tags: python, tensorflow






asked Jan 1 at 22:57 by SantoshGupta7, edited Jan 1 at 23:27
























          1 Answer














          I'm a little confused as to your issue, but the 0th axis dimension of the two tensors must be the same in order to concatenate along the 1st axis. What does making b1 have shape [6,8] or b2 have shape [5,9] do? Either of those cases should result in a successful concatenation.



Edited because I misread the script the first time. As I commented, you can't concatenate on the 1st axis because your tensors are rank 1 (they have only an axis 0; that is, only one dimension). If they were rank 2 (requiring two dimensions to describe the shape), you could concatenate on the 1st axis without issue.



          For example, you could concatenate tensor([[5,8]]) and tensor([[6,9]]) across axis=1, because they have shapes [1,2] instead of just shapes [2].
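The same rank rule holds outside TensorFlow, so the behavior can be sketched without building a graph. NumPy is used here as a stand-in, on the assumption that np.concatenate mirrors tf.concat's axis semantics:

```python
import numpy as np

a = np.array([5, 8])   # shape (2,) -- rank 1
b = np.array([6, 9])   # shape (2,) -- rank 1

# axis=0 works: it is the only axis a rank-1 tensor has
print(np.concatenate([a, b], axis=0))   # [5 8 6 9]

# axis=1 fails: there is no second axis to concatenate along
try:
    np.concatenate([a, b], axis=1)
except ValueError as e:  # AxisError subclasses ValueError
    print("error:", e)

# After promoting to rank 2 (shape (1, 2)), axis=1 is valid
a2 = a[np.newaxis, :]
b2 = b[np.newaxis, :]
print(np.concatenate([a2, b2], axis=1))  # [[5 8 6 9]]
```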
































• I thought [6,8] [5,9] would be the initial values, not the shape of the tensor? I changed the code to explicitly specify that those are the values and not the shape, but I'm still getting an error, though the error is different now. I updated my post with the new code and error. – SantoshGupta7, Jan 1 at 23:30

• Derp. I saw those as a function like tf.ones for whatever reason, my apologies. Rereading your question: you can't concatenate on axis 1 because your tensors have only one dimension, and axis=1 tells concat to join them along a second dimension. For example, tensor([6,8]) has shape [2], while tensor([[6,8]]) has shape [1,2]. – aedificatori, Jan 1 at 23:33

• Just tried b1 = tf.constant(value=[[5,8]]), b2 = tf.constant(value=[[6,9]]), b3 = tf.concat([b1, b2], 0) and got the result I'm looking for. Thanks! – SantoshGupta7, Jan 1 at 23:38










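The fix in the last comment (writing the constants as rank-2 values) generalizes: if the rank-1 tensors already exist, they can be promoted with expand_dims before concatenating. A minimal sketch, again using NumPy as a stand-in for the TF ops (np.expand_dims assumed to mirror tf.expand_dims):

```python
import numpy as np

b1 = np.array([5, 8])            # rank 1, shape (2,)
b2 = np.array([6, 9])

# Promote each to shape (1, 2), mirroring tf.expand_dims(t, axis=0)
b1r = np.expand_dims(b1, axis=0)
b2r = np.expand_dims(b2, axis=0)

# Now both axes are valid concat targets
rows = np.concatenate([b1r, b2r], axis=0)  # shape (2, 2): [[5, 8], [6, 9]]
cols = np.concatenate([b1r, b2r], axis=1)  # shape (1, 4): [[5, 8, 6, 9]]
print(rows)
print(cols)
```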

answered Jan 1 at 23:23 by aedificatori, edited Jan 1 at 23:37