
Bump mlx to version 2.21.2 #94

Open · wants to merge 6 commits into base: main
Conversation

@LeonNissen (Contributor) commented Jan 27, 2025

Bumps mlx to version 2.21.2

♻️ Current situation & Problem

The MLX library is currently at version 1.18.1 and can be bumped to 2.x to support newer models.

⚙️ Release Notes

Updates the MLX library to 2.21.2 and migrates code for breaking changes.

📝 Code of Conduct & Contributing Guidelines

By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines:

@LeonNissen LeonNissen added the dependencies Pull requests that update a dependency file label Jan 27, 2025
@LeonNissen LeonNissen self-assigned this Jan 27, 2025

codecov bot commented Jan 27, 2025

Codecov Report

Attention: Patch coverage is 0% with 64 lines in your changes missing coverage. Please review.

Project coverage is 38.51%. Comparing base (26b1e07) to head (eebcc51).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| ...urces/SpeziLLMLocal/LLMLocalSession+Generate.swift | 0.00% | 61 Missing ⚠️ |
| Sources/SpeziLLMLocal/LLMLocalSession+Setup.swift | 0.00% | 3 Missing ⚠️ |
Additional details and impacted files

Impacted file tree graph

@@            Coverage Diff             @@
##             main      #94      +/-   ##
==========================================
- Coverage   38.57%   38.51%   -0.06%     
==========================================
  Files          64       64              
  Lines        2331     2335       +4     
==========================================
  Hits          899      899              
- Misses       1432     1436       +4     
| Files with missing lines | Coverage Δ |
|---|---|
| Sources/SpeziLLMLocal/LLMLocalSchema.swift | 0.00% <ø> (ø) |
| Sources/SpeziLLMLocal/LLMLocalSession.swift | 0.00% <ø> (ø) |
| Sources/SpeziLLMLocal/LLMLocalSession+Setup.swift | 0.00% <0.00%> (ø) |
| ...urces/SpeziLLMLocal/LLMLocalSession+Generate.swift | 0.00% <0.00%> (ø) |

Continue to review full report in Codecov by Sentry.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 26b1e07...eebcc51.

@PSchmiedmayer (Member) left a comment


Very cool; thank you for bumping this so quickly.

Happy to see it merged once all CI elements are passing and the existing warnings are addressed 🚀

Package.swift (Outdated)
     .package(url: "https://github.com/huggingface/swift-transformers", .upToNextMinor(from: "0.1.14")),
     .package(url: "https://github.com/StanfordBDHG/OpenAI", .upToNextMinor(from: "0.2.9")),
     .package(url: "https://github.com/StanfordSpezi/Spezi", from: "1.2.1"),
     .package(url: "https://github.com/StanfordSpezi/SpeziFoundation", from: "2.0.0"),
     .package(url: "https://github.com/StanfordSpezi/SpeziStorage", from: "1.0.2"),
     .package(url: "https://github.com/StanfordSpezi/SpeziOnboarding", from: "1.1.1"),
-    .package(url: "https://github.com/StanfordSpezi/SpeziChat", .upToNextMinor(from: "0.2.1")),
+    .package(url: "https://github.com/StanfordSpezi/SpeziChat", exact: "0.2.1"),
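For context, a sketch of the two SwiftPM pinning styles at play here (illustrative only, not an additional change to this diff): `.upToNextMinor(from:)` floats within the minor series, while `exact:` locks resolution to a single tag.

```swift
// Illustrative Package.swift dependency entries; use one or the other.
dependencies: [
    // Floats within the minor series: resolves to the newest available 0.2.x tag.
    .package(url: "https://github.com/StanfordSpezi/SpeziChat", .upToNextMinor(from: "0.2.1")),
    // Pinned: always resolves to exactly 0.2.1, never picking up 0.2.2.
    // .package(url: "https://github.com/StanfordSpezi/SpeziChat", exact: "0.2.1"),
]
```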
Member

What's the reason for not keeping it as upToNextMinor and even bumping it to 0.2.2?

Member

@PSchmiedmayer this is fixed to 0.2.1, because using 0.2.2 leads to the following error when building the TestApp, which originates from line 212 of MessageInputView.swift within SpeziChat:

Non-sendable type 'SFSpeechRecognitionResult?' returned by implicitly asynchronous call to nonisolated function cannot cross actor boundary

@LeonNissen and I both independently encountered this.
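For illustration (hypothetical names, not code from SpeziChat or SpeziSpeech), the pattern behind this diagnostic is a nonisolated async function returning a non-Sendable type, awaited from a main-actor context. The sketch below intentionally does not compile under Swift 6 strict concurrency:

```swift
// Stand-in for SFSpeechRecognitionResult: a class without Sendable conformance.
final class RecognitionResult {
    var transcript = ""
}

// A nonisolated async function returning the non-Sendable type.
func recognize() async -> RecognitionResult? {
    RecognitionResult()
}

@MainActor
final class MessageInputModel {
    func update() async {
        // The call hops off the main actor, and the non-Sendable result must
        // cross back. In Swift 6 language mode the compiler rejects this:
        // "Non-sendable type 'RecognitionResult?' returned by implicitly
        // asynchronous call to nonisolated function cannot cross actor boundary"
        let result = await recognize()
        _ = result?.transcript
    }
}
```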

Member

The Sendable warning almost certainly comes from SpeziSpeech. Interesting that it doesn't happen in 0.2.1 but does in 0.2.2?

Member

Oh, I see: 0.2.2 enables Swift 6 language mode for SpeziChat and SpeziSpeech; that's why the error surfaces here.

Member

It is very weird that this issue occurs now, probably an Xcode 16.2 thing. Back with 16.1 this wasn't an issue.

Member

@vishnuravi Can you check which SpeziSpeech version is checked out with the SpeziLLM UI test app? Is it 1.2.0?

Member

Yes, it is resolved to SpeziSpeech 1.2.0.

Member

[screenshot]

Yep same for me. I have no idea how that happens, there isn't a 1.2.0 release OR 1.2.0 tag for the SpeziSpeech repo: https://github.com/StanfordSpezi/SpeziSpeech/tags
The code state that is checked out with the "1.2.0 version" is quite old, hence the Sendable errors.

@PSchmiedmayer Any ideas here?

Member

Just verified the list of tags, there isn't a 1.2.0:

[screenshot of the tag list]

Member

We tagged new versions in Speech and Chat; hopefully this resolves the moved tags. Really strange where this came from.

@vishnuravi (Member) commented Jan 28, 2025

Sorry @LeonNissen I was playing around with this branch and pushed a commit by mistake. I reverted it. Didn't mean to step on your toes.

@LeonNissen (Contributor, Author) commented Jan 28, 2025

> Sorry @LeonNissen I was playing around with this branch and pushed a commit by mistake. I reverted it. Didn't mean to step on your toes.

@vishnuravi No worries, feel free to play around with it, I am currently working on a different branch anyways 😊

@vishnuravi (Member) commented

> Sorry @LeonNissen I was playing around with this branch and pushed a commit by mistake. I reverted it. Didn't mean to step on your toes.

> @vishnuravi No worries, feel free to play around with it, I am currently working on a different branch anyways 😊

Oh ok, cool, I will continue to work on this branch then.

@vishnuravi vishnuravi changed the title bump mlx to version 2.21.2 Bump mlx to version 2.21.2 Jan 28, 2025
@vishnuravi vishnuravi marked this pull request as ready for review January 28, 2025 23:12
@vishnuravi (Member) commented

The Linkspector check is failing with a 403 when attempting to verify links to the OpenAI documentation that are valid and reachable from a regular web browser. This suggests that OpenAI is likely blocking GitHub IPs. I would make this check optional until these false positives are resolved.

@PSchmiedmayer (Member) commented

That's unfortunate.
@vishnuravi, what about adding a config file to the repo and adding the OpenAI web page to the ignore pattern, or just excluding that very specific link?

 internal func _generate(continuation: AsyncThrowingStream<String, any Error>.Continuation) async {
 #if targetEnvironment(simulator)
-    // swiftlint:disable:next return_value_from_void_function
-    return await _mockGenerate(continuation: continuation)
+    return await _mockGenerate(continuation: continuation) // swiftlint:disable:this return_value_from_void_function
Member

We have a lot of SwiftLint disable calls in this file and others that are not needed and can be resolved by following the guidelines for the lint rules, e.g.:

Suggested change
-    return await _mockGenerate(continuation: continuation) // swiftlint:disable:this return_value_from_void_function
+    await _mockGenerate(continuation: continuation)

 import MLXRandom
 import os
 import SpeziChat
 import SpeziLLM


 extension LLMLocalSession {
-    // swiftlint:disable:next identifier_name function_body_length
+    // swiftlint:disable:next identifier_name function_body_length cyclomatic_complexity
Member

Would be great if we can break it down into smaller private functions or extensions, e.g., handling the tokens. Could resolve most of the SwiftLint issues here and will make it nicer to maintain in the future.
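As a sketch of the suggested refactor (hypothetical helper name, not code from this PR), per-token handling could move into a small private method so the main generate loop stays under the function-body-length and cyclomatic-complexity thresholds:

```swift
extension LLMLocalSession {
    // Hypothetical helper: forwards one decoded token to the output stream
    // and reports whether generation should continue.
    private func yield(
        _ token: String,
        to continuation: AsyncThrowingStream<String, any Error>.Continuation
    ) -> Bool {
        guard !Task.isCancelled else {
            continuation.finish()
            return false
        }
        continuation.yield(token)
        return true
    }
}
```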

Labels: dependencies (Pull requests that update a dependency file)
Projects: Status: In Progress
4 participants