mania maps with custom hitsounds, especially keysounded maps, do not work correctly. a lot of the hitsounds play correctly, but some play the wrong file and some don't play at all. sorry if this is a known issue, i couldn't find it reported anywhere
maps: https://osu.ppy.sh/beatmapsets/136986#mania/365278 https://osu.ppy.sh/beatmapsets/351167#mania/1700101 etc
video: https://youtu.be/Thxm-U6F1C0
osu!lazer version: latest
Logs: none
As far as I can see the map you linked leverages storyboard samples for hitsound playback, so tagging it as such.
I haven't checked the second map from the video, but it does seem off-sync. Maybe a lead-in issue.
Investigated one of the cases (Renai Circulation) and it's not actually due to the storyboard, but due to the fact that long notes play samples on release in lazer (#2506)... Overriding DrawableHoldNoteTail.PlaySamples() to be a no-op fixed it. I guess we can scratch that one off here.
I'll have a look if I can determine something about the other one.
I had a look at the other case to determine what's causing the incorrect sample playback. Unfortunately it seems to be caused by at least two simultaneous issues.
First off, at parsing time the legacy ConvertHitObjectParser always tacks on a normal-hitnormal sample, which breaks the keysounding by making the default hitsound play alongside the map-provided samples:
I'm not sure why this is done, but it does interfere in this particular case.
However, even after disabling the additional sample, the kick drums still did not play on a listen. In the map concerned they are implemented using storyboard samples. I found that the samples would straight up not play, as this branch in DrawableStoryboardSample would never get hit:
As to why it would never get hit: I logged the value of ElapsedFrameTime and found it to always be 0, which seems like a pretty big framework bug. Looking at FramedClock.ProcessFrame shows what I think is incorrect storage of LastFrameTime.
I would expect the current time to be stored to LastFrameTime at the start of the method. Doing that seems to make FramedClock.ElapsedFrameTime return the expected result of ~16ms at 60fps.
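To illustrate the expected behaviour, here's a minimal toy model in Python (names and structure are purely illustrative — this is not the actual osu!framework code). Storing the current time into the last-frame slot at the start of each ProcessFrame makes the elapsed time come out as the per-frame delta:

```python
class ToyFramedClock:
    """Minimal illustrative model of a framed clock; not the actual
    osu!framework FramedClock implementation."""

    def __init__(self, source):
        self.source = source                      # callable returning time in ms
        self.current_time = source()
        self.last_frame_time = self.current_time

    def process_frame(self):
        # Store the previous frame's time *before* reading the source,
        # so elapsed_frame_time reflects the delta between frames.
        self.last_frame_time = self.current_time
        self.current_time = self.source()

    @property
    def elapsed_frame_time(self):
        return self.current_time - self.last_frame_time


# simulate a source advancing ~16ms per frame (60fps)
now = {"ms": 0.0}
clock = ToyFramedClock(lambda: now["ms"])
now["ms"] += 16.0
clock.process_frame()
print(clock.elapsed_frame_time)  # 16.0
```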
Both of the parts of the problem touch quite integral parts of systems involved, so please advise on how you'd like this fixed. (I can also yield fixing these things to you, if you'd prefer)
This PR is intended to resolve the first issue mentioned.
I was wrong about FramedClock; it is returning proper values when used raw. It's only returning ElapsedFrameTime = 0 through the GameplayClock which wraps it in DrawableStoryboardSample. That, in turn, I think returns 0 because the wrapped clock is itself being driven at a frame level, and GameplayClockContainer is simply reading values within its own frames (so it has no right to ever read anything but 0). Although I have to admit this clock stuff is quite difficult for me to get a good grasp on; I spent over an hour looking at it and I'm still not sure what's going on.
Just deleting the conditional "fixes" the issue, but it seems the gameplay clock's ElapsedFrameTime is unsafe to use...? The frame-stable one is fine, but the raw one doesn't seem to be.
i haven't looked into this in detail, but that behaviour sounds wrong and may be an oversight in GameplayClock if it is always zero.
from recollection, the underlying clock should be an interpolating variety, which should generally ensure elapsed is always changing.
After another maddening evening looking at this issue I'm starting to feel as if I have a better grasp on it.
First off, it isn't strictly true that GameplayClock always returns zero elapsed time; it returns zero in this case because of how the clock hierarchy is processed. GameplayClock's direct underlying clock is a FramedOffsetClock, and while it does have an underlying interpolating clock, it doesn't do any interpolating of its own. Combined with the fact that GameplayClockContainer manually processes that underlying clock before letting its children observe its state, ElapsedTime returning 0 starts to make sense.
When I wrapped that FramedOffsetClock in an InterpolatingFramedClock (which is very wrong and breaks rewind horribly, but it was only supposed to be janky debug code), I did get non-zero ElapsedTime. That however also revealed some interference from LifetimeManagementContainer.
Even though the check I mentioned above started returning true, for some reason setting LifetimeEnd as strictly as is done there caused the samples to not play back. Adding lenience (such as removing the specification, or adding allowable_late_start to it) makes the sample play. I'm guessing this is an off-by-one-frame issue, but I haven't looked into it in detail yet. I'm not remotely sure what a fix for this mess is, either.
Hmm, that sounds weird.
Each FramedOffsetClock should be updating only once, and taking on the elapsed time of its child (by only processing its underlying clock recursively once). This sounds like something is processing a clock in this hierarchy more than once per frame, which would cause the zero-elapsed-time.
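The double-processing hypothesis is easy to reproduce with a toy model (Python purely for illustration; the names are made up and this is not the actual framework code). If something processes the clock a second time before the source has advanced, the second process consumes the elapsed time and any later observer sees zero:

```python
class ToyFramedClock:
    """Toy framed clock used to demonstrate the double-processing
    problem; not the actual osu!framework implementation."""

    def __init__(self, source):
        self.source = source
        self.current_time = source()
        self.last_frame_time = self.current_time

    def process_frame(self):
        self.last_frame_time = self.current_time
        self.current_time = self.source()

    @property
    def elapsed_frame_time(self):
        return self.current_time - self.last_frame_time


now = {"ms": 0.0}
clock = ToyFramedClock(lambda: now["ms"])

now["ms"] += 16.0
clock.process_frame()
first = clock.elapsed_frame_time    # 16.0 - as expected

# a second ProcessFrame within the same game frame (source unchanged)
clock.process_frame()
second = clock.elapsed_frame_time   # 0.0 - observers now see zero elapsed
print(first, second)
```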
I've finally root-caused the remaining issue after about two weeks of searching overall and a couple of hours of adding log statements everywhere. It's a pretty huge one.
The problem arises from the fact that sometimes this map has two storyboard samples starting at the same time. Storyboard layers are LifetimeManagementContainers, which in itself isn't a problem. But in this case the scenario is as follows:
The storyboard layer's UpdateSubTree() starts to run. Within it, the first drawable sample's Update() runs. Within that update, it plays the sample and modifies its own lifetime:
The set to LifetimeEnd causes the LMC to remove the sample from the storyboard layer's AliveInternalChildren. But recall that we are actually in the middle of the layer's UpdateSubTree(). Therefore:
AliveInternalChildren shrinks from 2 to 1 mid-iteration, so the second sample is skipped and never updated (and therefore never played).
I confirmed this by making a copy of AliveInternalChildren within CompositeDrawable.UpdateSubTree() and iterating on the copy instead of the actual list - it brings the missing sample back. Obviously the final fix should be more localised, but I haven't yet thought about what it should look like. I'll come back to it over the coming days if you guys don't want to take over.
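The mechanism can be reproduced outside the framework in a few lines (Python for illustration only; the names don't correspond to real framework members). Removing from the alive list while it is being iterated silently skips the element that shifted into the removed slot, while iterating a snapshot copy visits both:

```python
def update_buggy(alive_children, played):
    # Iterate the live list directly, as in the scenario described above.
    for child in alive_children:
        played.append(child)
        # The child's update plays its sample and expires its lifetime,
        # which removes it from the alive list *mid-iteration*.
        alive_children.remove(child)


def update_fixed(alive_children, played):
    # Iterate a snapshot copy instead, so mid-iteration removals
    # can't shift the remaining children out of the loop.
    for child in list(alive_children):
        played.append(child)
        alive_children.remove(child)


played_buggy, played_fixed = [], []
update_buggy(["sample1", "sample2"], played_buggy)
update_fixed(["sample1", "sample2"], played_fixed)
print(played_buggy)  # ['sample1'] - the second sample is skipped
print(played_fixed)  # ['sample1', 'sample2']
```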
I've propped up a test case and an "obvious" resolution here. It works, but I'm not sold on it yet. It's a bit awkward code-wise and I'm a little worried about a potential performance impact, even with the optimisation of using one list instance instead of reallocating every frame.
I experimented a little with alternatives, such as deferring MakeChild{Alive,Dead} calls until UpdateAfterChildren, but applying them the obvious way usually caused this assertion to fall over for reasons I haven't investigated in detail. This is somewhat complicated and my time during the week is limited, so it might take me a few days to fully explore all the options.
Have you tried scheduling the callbacks? Otherwise yeah I think I can get behind such a solution.
I did try scheduling the MakeChild{Alive,Dead} calls themselves (that fell over on the same assertion), but strangely I didn't think of scheduling the entire childLifetimeChanged callback. I'll try that and see how it goes.
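For reference, this is the shape of the deferral idea in toy Python form (illustrative only; the scheduler here is a plain list, not the framework's Scheduler, and the class names are made up). Queuing the lifetime-change work and flushing it after the child loop means the alive list is never mutated mid-iteration:

```python
class ToyLayer:
    """Toy model of a container deferring lifetime changes;
    not the actual LifetimeManagementContainer."""

    def __init__(self, children):
        self.alive_children = list(children)
        self._scheduled = []

    def _child_lifetime_changed(self, child):
        # Instead of removing the child immediately (which would mutate
        # alive_children mid-iteration), queue the removal.
        self._scheduled.append(child)

    def update_subtree(self):
        played = []
        for child in self.alive_children:
            played.append(child)
            self._child_lifetime_changed(child)  # child expired this frame
        # flush deferred lifetime changes once iteration is done
        for child in self._scheduled:
            self.alive_children.remove(child)
        self._scheduled.clear()
        return played


layer = ToyLayer(["sample1", "sample2"])
print(layer.update_subtree())   # ['sample1', 'sample2'] - both play
print(layer.alive_children)     # [] - removals applied afterwards
```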