
Investigate and fix BH ttnn unit test failures #18485

Open
bbradelTT opened this issue Feb 28, 2025 · 0 comments
Assignees
Labels
blackhole bug Something isn't working P1

Comments

@bbradelTT
Contributor

TTNN unit tests were run on Blackhole (BH):
https://github.com/tenstorrent/tt-metal/actions/runs/13578665690

The following tests failed:

tests/ttnn/unit_tests/test_concat_issue.py::test_5d_concat_tile[layout=Layout.TILE]
tests/ttnn/unit_tests/operations/test_bernoulli.py::test_bernoulli[is_out_alloc=True-out_dtype=bfloat16-in_dtype=bfloat16-seed=6296-shape=[2003]]
tests/ttnn/unit_tests/operations/test_maxpool2d.py::test_run_max_pool_width_shard[ceil_mode=False-dtype=DataType.BFLOAT16-dilation=(1, 1)-stride=(1, 1)-padding=(0, 0)-kernel_size=(2, 2)-act_shape=[1, 6144, 6, 6]-device_params={'l1_small_size': 24576}]
tests/ttnn/unit_tests/operations/test_maxpool2d.py::test_run_max_pool_width_shard[ceil_mode=False-dtype=DataType.BFLOAT16-dilation=(1, 1)-stride=(1, 1)-padding=(4, 4)-kernel_size=(9, 9)-act_shape=[1, 6144, 6, 6]-device_params={'l1_small_size': 24576}]
tests/ttnn/unit_tests/operations/test_moreh_sum.py::test_moreh_sum_integer[int32-dim-h-2, 2, 3, TILE_HEIGHT * 8, TILE_WIDTH * 8]
tests/ttnn/unit_tests/operations/test_reallocate.py::test_reallocate_sharded[input_shape=[1, 1, 4, 34]-core_grid=ttnn.CoreGrid(x=1, y=1)-strategy=ShardStrategy.BLOCK-layout=Layout.ROW_MAJOR]
tests/ttnn/unit_tests/operations/eltwise/test_binary_bcast.py::test_binary_scalar_ops[ttnn_fn=div-activations=(('LOG',), (), ('ABS', 'SQRT'))-a_shape=torch.Size([5, 1, 64, 1])-b_shape=torch.Size([1, 3, 1, 128])]
tests/ttnn/unit_tests/operations/eltwise/test_composite.py::test_unary_composite_round_ttnn[input_shapes=torch.Size([1, 1, 32, 32])]

PCC Error:
ttnn group 1 tests: tests/ttnn/unit_tests/test_concat_issue.py#L22

test_5d_concat_tile[layout=Layout.TILE] AssertionError: 0.4476023911980018
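The 0.447… value in the assertion is the measured PCC (Pearson correlation coefficient) between the golden torch output and the ttnn output, which fell well below the pass threshold (these tests typically require 0.99+). A minimal numpy sketch of that style of check, with hypothetical data standing in for the golden and device tensors:

```python
import numpy as np

def compute_pcc(expected: np.ndarray, actual: np.ndarray) -> float:
    """Pearson correlation coefficient between two flattened tensors."""
    e = expected.flatten().astype(np.float64)
    a = actual.flatten().astype(np.float64)
    return float(np.corrcoef(e, a)[0, 1])

# Hypothetical data: a near-identical pair passes, an uncorrelated pair
# fails the way the concat output does here.
rng = np.random.default_rng(0)
golden = rng.standard_normal(1024)
good = golden + 0.001 * rng.standard_normal(1024)
bad = rng.standard_normal(1024)

assert compute_pcc(golden, good) > 0.99   # passes the typical threshold
assert compute_pcc(golden, bad) < 0.9     # fails, like the 0.447 above
```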

Directory Structure / Includes Error:
ttnn group 4 tests: tests/ttnn/unit_tests/operations/test_bernoulli.py#L77

test_bernoulli[is_out_alloc=True-out_dtype=bfloat16-in_dtype=bfloat16-seed=6296-shape=[2003]]

RuntimeError: TT_THROW @ /work/tt_metal/impl/program/program.cpp:50: tt::exception
info:
Failed to generate binaries for compute_bernoulli TT_THROW @ /work/tt_metal/jit_build/build.cpp:613: tt::exception
info:
trisc1 build failed

PCC Error:
ttnn group 6 tests: tests/ttnn/unit_tests/operations/test_maxpool2d.py#L431

test_run_max_pool_width_shard[ceil_mode=False-dtype=DataType.BFLOAT16-dilation=(1, 1)-stride=(1, 1)-padding=(0, 0)-kernel_size=(2, 2)-act_shape=[1, 6144, 6, 6]-device_params={'l1_small_size': 24576}]

AssertionError: 0.058725904965279276

PCC Error:
ttnn group 7 tests: tests/ttnn/unit_tests/operations/test_maxpool2d.py#L431

test_run_max_pool_width_shard[ceil_mode=False-dtype=DataType.BFLOAT16-dilation=(1, 1)-stride=(1, 1)-padding=(4, 4)-kernel_size=(9, 9)-act_shape=[1, 6144, 6, 6]-device_params={'l1_small_size': 24576}]

AssertionError: 0.222073781949217

PCC Error:
ttnn group 9 tests: tests/ttnn/unit_tests/operations/test_moreh_sum.py#L550

test_moreh_sum_integer[int32-dim-h-2, 2, 3, TILE_HEIGHT * 8, TILE_WIDTH * 8]

assert False
 +  where False = torch.equal(<int32 device result tensor>, <int32 expected tensor>)  (lengthy tensor dump truncated in CI log)
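Unlike the PCC failures, the moreh_sum integer test asserts exact equality (torch.equal), since int32 reductions have no rounding tolerance. A rough numpy sketch of that style of check (numpy.array_equal standing in for torch.equal; shapes and values are hypothetical, the real test uses [2, 2, 3, TILE_HEIGHT * 8, TILE_WIDTH * 8]):

```python
import numpy as np

# Hypothetical small int32 input in place of the tile-sized real one.
rng = np.random.default_rng(42)
x = rng.integers(-50, 50, size=(2, 2, 3, 8, 8), dtype=np.int32)

# Reduce over the height dim ("dim-h" in the test id), keeping the dim.
expected = x.sum(axis=-2, keepdims=True, dtype=np.int32)
actual = x.sum(axis=-2, keepdims=True, dtype=np.int32)  # stand-in for device output

# Integer reductions must match bit-exactly; any element mismatch fails.
assert np.array_equal(expected, actual)
```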

Alignment Error:
ttnn group 10 tests: tests/ttnn/unit_tests/operations/test_reallocate.py#L94

test_reallocate_sharded[input_shape=[1, 1, 4, 34]-core_grid=ttnn.CoreGrid(x=1, y=1)-strategy=ShardStrategy.BLOCK-layout=Layout.ROW_MAJOR]

RuntimeError: TT_FATAL @ /work/ttnn/cpp/ttnn/operations/data_movement/sharded/sharded_to_interleaved/device/sharded_to_interleaved_op.cpp:28: (*input_tensor.memory_config().shard_spec).shape[1] * input_tensor.element_size() % (l1_alignment) == 0
info:
Shard page size must be aligned to 16B for L1 Tensor
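The TT_FATAL fires because the shard's last-dim byte width is not a multiple of the 16B L1 alignment: with input_shape [1, 1, 4, 34] block-sharded on a 1x1 core grid, each shard row is 34 elements wide. A small sketch of the failing condition, assuming a 2-byte element (bfloat16) and the 16B alignment from the log:

```python
# Sketch of the sharded_to_interleaved alignment check that raises TT_FATAL.
# Assumes bfloat16 (2-byte elements) and the 16-byte L1 alignment in the log.
L1_ALIGNMENT = 16

def shard_page_is_aligned(shard_width_elems: int, element_size: int) -> bool:
    """Shard page size (row width in bytes) must be a multiple of L1 alignment."""
    return (shard_width_elems * element_size) % L1_ALIGNMENT == 0

# Failing case from the test: 34 elements * 2 bytes = 68 bytes, not 16B-aligned.
assert not shard_page_is_aligned(34, 2)
# A 32-wide shard (64 bytes) would satisfy the check.
assert shard_page_is_aligned(32, 2)
```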

PCC Error:
ttnn group 11 tests: tests/ttnn/unit_tests/operations/eltwise/test_binary_bcast.py#L178

test_binary_scalar_ops[ttnn_fn=div-activations=(('LOG',), (), ('ABS', 'SQRT'))-a_shape=torch.Size([5, 1, 64, 1])-b_shape=torch.Size([1, 3, 1, 128])]

assert False
 +  where False = test_binary_scalar_ops.<locals>.compare([ttnn.Tensor(shape=Shape([5, 3, 64, 128]), dtype=DataType::BFLOAT16, layout=Layout::TILE)], [torch golden tensor])  (lengthy tensor dump truncated in CI log)

Probably Bad Arch Check Error:
ttnn group 12 tests: tests/ttnn/unit_tests/operations/eltwise/test_composite.py#L468

test_unary_composite_round_ttnn[input_shapes=torch.Size([1, 1, 32, 32])]

RuntimeError: TT_FATAL @ /work/ttnn/cpp/ttnn/operations/eltwise/unary/device/unary_composite_op.cpp:709: arch == tt::ARCH::WORMHOLE_B0
info:
Op is only supported on Wormhole

Sub-issues need to be created and assigned to the appropriate owners.
