
Fix updating channel's preview message when coming back online #3574

Open

laevandus wants to merge 3 commits into develop from fix/updating-preview-message

Conversation

@laevandus (Contributor) commented Jan 28, 2025

🔗 Issue Links

Resolves IOS-680

🎯 Goal

Channel's preview message did not update when coming back online

📝 Summary

  • Do not use cached fetch request results when new objects were inserted or deleted

🛠 Implementation

Ignore cached results if the context has newly inserted or deleted objects of the fetched type.
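
A minimal sketch of the idea, assuming the cache sits behind a helper on NSManagedObjectContext. The helper name, signature, and extension placement are illustrative, not the SDK's actual API; only the two guard conditions mirror the change quoted in the review thread further down.

import CoreData

extension NSManagedObjectContext {
    // Illustrative helper: cached fetch results for entity type T are reused only
    // when the context has no pending changes that could alter which object ids
    // the fetch request would return.
    func canReuseCachedResults<T: NSManagedObject>(of type: T.Type, cachedObjectIds: [NSManagedObjectID]) -> Bool {
        // Ignore cache when an inserted (but not yet saved) object id is present
        guard !cachedObjectIds.contains(where: { $0.isTemporaryID }) else { return false }
        // Pending inserted or deleted objects of this type can affect the ids returned by the fetch request
        guard !insertedObjects.contains(where: { $0 is T }),
              !deletedObjects.contains(where: { $0 is T }) else { return false }
        return true
    }
}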

🎨 Showcase

Before / After screenshots attached.

🧪 Manual Testing Notes

  1. Go offline in the Channel List
  2. Another user sends messages to a channel
  3. The user from step 1 goes back online
  4. The unread count, order, and preview message are updated

☑️ Contributor Checklist

  • I have signed the Stream CLA (required)
  • This change should be manually QAed
  • Changelog is updated with client-facing changes
  • Changelog is updated with new localization keys
  • New code is covered by unit tests
  • Documentation has been updated in the docs-content repo

@laevandus added the 🐞 Bug and 🌐 SDK: StreamChat (LLC) labels on Jan 28, 2025
@laevandus requested a review from a team as a code owner on January 28, 2025 at 14:52
// Ignore cache when inserted (but not yet saved) object id is present
guard !objectIds.contains(where: { $0.isTemporaryID }) else { return false }
// Context has pending inserted or deleted objects of this type (can affect ids returned by the fetch request)
guard !insertedObjects.contains(where: { $0 is T }) && !deletedObjects.contains(where: { $0 is T }) else { return false }
@laevandus (Contributor Author) commented:

Theoretically, even updatedObjects could change the results, but checking it would take away the benefit of the cache. Based on the git history, the cache was added to speed up channel list payload writes.
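
For contrast, a hypothetical stricter check (deliberately not part of this PR) that also invalidates on pending updates would look like the guard below; because channel list payload writes update many objects per save, it would fail on nearly every write and the cache would rarely be used:

// Hypothetical extra check, not in this PR: invalidating on pending updates
// would trip on nearly every channel list payload write, removing the cache's benefit.
guard !updatedObjects.contains(where: { $0 is T }) else { return false }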

@laevandus force-pushed the fix/updating-preview-message branch from 20c98c7 to 4c1f1b3 on January 28, 2025 at 14:56
@Stream-SDK-Bot (Collaborator) commented Jan 28, 2025

SDK Size

| title        | develop | branch  | diff | status |
| ------------ | ------- | ------- | ---- | ------ |
| StreamChat   | 7.02 MB | 7.02 MB | 0 KB | 🟢     |
| StreamChatUI | 4.77 MB | 4.77 MB | 0 KB | 🟢     |

@Stream-SDK-Bot (Collaborator) commented:

SDK Performance

| target      | metric                 | benchmark  | branch        | performance | status |
| ----------- | ---------------------- | ---------- | ------------- | ----------- | ------ |
| MessageList | Hitches total duration | 10 ms      | 3.34 ms       | 66.6% 🔼    | 🟢     |
|             | Duration               | 2.6 s      | 2.55 s        | 1.92% 🔼    | 🟢     |
|             | Hitch time ratio       | 4 ms per s | 1.31 ms per s | 67.25% 🔼   | 🟢     |
|             | Frame rate             | 75 fps     | 78.37 fps     | 4.49% 🔼    | 🟢     |
|             | Number of hitches      | 1          | 0.4           | 60.0% 🔼    | 🟢     |

@laevandus added the 🤞 Ready For QA label on Jan 29, 2025
@Stream-SDK-Bot (Collaborator) commented Jan 29, 2025

SDK Size

| title        | develop | branch  | diff | status |
| ------------ | ------- | ------- | ---- | ------ |
| StreamChat   | 7.02 MB | 7.02 MB | 0 KB | 🟢     |
| StreamChatUI | 4.77 MB | 4.77 MB | 0 KB | 🟢     |

@testableapple added the 🧪 QAing label and removed the 🤞 Ready For QA label on Jan 29, 2025
@laevandus (Contributor Author) commented:

@testableapple found that in some cases the preview message can end up being nil, in which case the channel list shows "No messages" as the preview. Reproduced it once, but it is not yet clear which code path triggers it.

Labels
🐞 Bug (An issue or PR related to a bug) · 🧪 QAing · 🌐 SDK: StreamChat (LLC) (Tasks related to the StreamChat LLC SDK)
5 participants