We render all messages available in a chat, which can slow the UI.
One idea is to offload any messages that aren't in the view. If the user scrolls back, they can be re-rendered.
Tagging with the v1 release, as this came up in light of some spam tests in the dogfooding chat yesterday. Not a hard blocker, though.
Andrey's comment:
morning team, one more thing: with spam, and with lots of messages from history, the whole UI gets slow because we render all messages. We should have a limit of 20 messages to render and keep it when we receive new messages; only the user should be able to change this limit, by scrolling
Andrea:
should we not just offload messages that are not in the view?
if the user scrolls back to present, we offload older messages that are not in the view (with some margin for faster scrolling)
similarly, if we receive an old message (from the mailserver) that is not in the view, we don't display it
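A minimal sketch of the windowing Andrea describes, assuming the 20-message limit Andrey mentions plus a small scroll margin. All names here (`MessageWindow`, `RENDER_LIMIT`, `SCROLL_MARGIN`) and the margin size are illustrative, not from the Status codebase:

```typescript
// Hypothetical sketch: render only a window of recent messages.
type Message = { id: number; text: string };

const RENDER_LIMIT = 20;  // initial render limit from the discussion
const SCROLL_MARGIN = 5;  // assumed margin kept for faster scrolling

class MessageWindow {
  private limit = RENDER_LIMIT;

  constructor(private messages: Message[]) {}

  // New messages append to history but do NOT grow the rendered window.
  receive(msg: Message): void {
    this.messages.push(msg);
  }

  // Only the user scrolling back extends the window, in limit-sized steps.
  scrollBack(): void {
    this.limit = Math.min(this.messages.length, this.limit + RENDER_LIMIT);
  }

  // Messages actually handed to the renderer: the newest `limit`
  // messages plus the margin; everything older stays offloaded.
  visible(): Message[] {
    const count = Math.min(this.messages.length, this.limit + SCROLL_MARGIN);
    return this.messages.slice(this.messages.length - count);
  }
}
```

With this shape, an old message arriving from the mailserver lands in `messages` but never enters `visible()` until the user scrolls back, which matches the behavior proposed above.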
I don't want to give you whiplash @cammellos, but if this is a hairy issue, can we get a sense for effort and impact? e.g. how much payoff can we expect to see?
I'd rather have you focus fully on Waku, even documenting the integration plan, if the complexity is high and yield is not significant. It's probably not as important as Waku.
@rachelhamlin hard to say. Today I have been fully on Waku. I can split it 75/25, or I can timebox this (say, tomorrow morning) and try to quantify the perf improvement / see how far it goes, up to you. If everything goes well I should have Waku integrated soon enough, so I should have some slack on that side of things.
Amazing @cammellos. Can you budget some time to document Waku in Status as part of that work? I think it would be good to file something away in the specs repo.
We are probably ready to begin the design phase of Waku settings as well.
You can manage your time however you prefer, given that Waku is close. As long as Waku > offloading.
@flexsurfer heyo! catching up via GH and not Discord - was this fixed elsewhere or just closed?
@rachelhamlin this has been merged already