Add data type workaround for binary ops #2326

Merged · merged 1 commit into main from mmanzoor/multiply-wa on Feb 28, 2025

Conversation

@mmanzoorTT (Contributor) commented on Feb 28, 2025

Ticket

Closes #2321

Problem description

tt-metal has introduced a type checker for binary ops which aborts the program on any data type mismatch.
tt-metal PR: tenstorrent/tt-metal#17828
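
To make the failure mode concrete, here is a hypothetical sketch using ttnn's Python API (not code from this PR). The uint32/multiply combination is purely illustrative; which data type and op combinations trip the checker depends on the op and the tt-metal version.

```python
import torch
import ttnn

device = ttnn.open_device(device_id=0)

# Operands in a data type the binary op may not support (illustrative).
a = ttnn.from_torch(torch.ones(32, 32, dtype=torch.int32),
                    dtype=ttnn.uint32, layout=ttnn.TILE_LAYOUT, device=device)
b = ttnn.from_torch(torch.ones(32, 32, dtype=torch.int32),
                    dtype=ttnn.uint32, layout=ttnn.TILE_LAYOUT, device=device)

# With the type checker from tenstorrent/tt-metal#17828, an unsupported
# data type combination aborts the program here instead of computing.
out = ttnn.multiply(a, b)

ttnn.close_device(device)
```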

What's changed

Add a data type workaround (and a layout workaround for ROW_MAJOR inputs) for binary ops; a conceptual sketch of the pattern follows.
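
The actual workaround is applied as IR rewrites in tt-mlir (C++); the sketch below expresses the same pattern with ttnn's Python API for clarity. The helper name, the bfloat16 target type, and the tile-layout target are assumptions for illustration, not the PR's exact behavior.

```python
import ttnn

def binary_op_with_workaround(op, a, b):
    """Hypothetical helper: run a binary op on tensors whose data type
    or layout the op may not accept, then restore the original type."""
    original_dtype = a.dtype

    # Data type workaround: cast operands the checker would reject to a
    # supported type (bfloat16 is assumed here for illustration).
    a = ttnn.typecast(a, ttnn.bfloat16)
    b = ttnn.typecast(b, ttnn.bfloat16)

    # Layout workaround: convert ROW_MAJOR inputs to tile layout,
    # which binary ops expect.
    if a.layout == ttnn.ROW_MAJOR_LAYOUT:
        a = ttnn.to_layout(a, ttnn.TILE_LAYOUT)
    if b.layout == ttnn.ROW_MAJOR_LAYOUT:
        b = ttnn.to_layout(b, ttnn.TILE_LAYOUT)

    # Run the op in the supported type, then cast the result back so
    # downstream consumers see the data type they expect.
    out = op(a, b)
    return ttnn.typecast(out, original_dtype)

# Example usage: binary_op_with_workaround(ttnn.multiply, a, b)
```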

Checklist

  • New/Existing tests provide coverage for changes

@mmanzoorTT force-pushed the mmanzoor/multiply-wa branch 2 times, most recently from eb55ca3 to 5c20e55 on February 28, 2025 16:41
@sdjordjevicTT (Contributor) left a comment:

Thanks for the follow-up!

* Data type workaround for unsupported types
* Layout workaround if input layout is ROW_MAJOR
@kmabeeTT (Contributor) commented:

Hey guys @mmanzoorTT / @sdjordjevicTT, wanted some thoughts: tt-metal folks just replied to my question in the Slack channel #tt-metal-ttnn about whether this assert might be overly tight (given llama was passing for us), and they are planning to remove the assert for now because it regressed passing models. So while we probably don't 100% need this PR, I think it's still worthwhile to keep for now to unblock llama without waiting for a newer tt-metal tomorrow; it can be updated later depending on where they land with a (potentially less tight) assert. Does this sound good?

@mmanzoorTT (Contributor, Author) commented:

> Hey guys @mmanzoorTT / @sdjordjevicTT, wanted some thoughts: tt-metal folks just replied to my question in the Slack channel #tt-metal-ttnn about whether this assert might be overly tight (given llama was passing for us), and they are planning to remove the assert for now because it regressed passing models. So while we probably don't 100% need this PR, I think it's still worthwhile to keep for now to unblock llama without waiting for a newer tt-metal tomorrow; it can be updated later depending on where they land with a (potentially less tight) assert. Does this sound good?

@kmabeeTT I agree

@mmanzoorTT merged commit f47f5ad into main on Feb 28, 2025
31 checks passed
@mmanzoorTT deleted the mmanzoor/multiply-wa branch on February 28, 2025 20:25
Labels: none yet
Projects: none yet
Development: Successfully merging this pull request may close these issues: [TTNN] Add workaround for unsupported types for binary ops (#2321)
3 participants