Continuing the discussion from here: https://github.com/dotnet/runtime/pull/41918#discussion_r487331771
@eerhardt pointed out that our IDisposable docs indicate that we should call GC.SuppressFinalize even if we don't have a finalizer, so that derived classes don't need to call it.
For this to work, we have to call it everywhere; otherwise, folks adding a finalizer need to check whether the base type already calls it (by looking at the source or disassembling) and then call it only if the base type doesn't. I don't think we should make folks look at implementations (nor can they in all cases). So the alternative safe advice is to always call GC.SuppressFinalize in Dispose(bool) of the type that adds the finalizer.
I think relying on the base type that implements IDisposable to call it is the best recommendation, since it results in the fewest possible calls to GC.SF. However, for this to work, we need to do it everywhere.
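To make the recommendation concrete, here's a minimal sketch of the shape being proposed (the type names are hypothetical, for illustration only): the base type that implements IDisposable always calls GC.SuppressFinalize in its public Dispose(), even though it has no finalizer, so a derived type that adds a finalizer doesn't have to.

```csharp
using System;

// Base type implements IDisposable but has no finalizer.
// It still calls GC.SuppressFinalize so derived types never need to.
public class BaseResource : IDisposable
{
    public void Dispose()
    {
        Dispose(disposing: true);
        // Always suppress, even without a finalizer here, so that a
        // derived type adding a finalizer gets suppression for free.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        // Release managed state when disposing == true.
    }
}

// Derived type adds a finalizer; it can rely on the base Dispose()
// having already called GC.SuppressFinalize.
public class DerivedResource : BaseResource
{
    private IntPtr _rawHandle = new IntPtr(1); // stand-in for an unmanaged resource

    public bool HandleFreed => _rawHandle == IntPtr.Zero;

    ~DerivedResource() => Dispose(disposing: false);

    protected override void Dispose(bool disposing)
    {
        // Free the raw handle on both Dispose() and finalization.
        _rawHandle = IntPtr.Zero;
        base.Dispose(disposing);
    }
}
```

With this arrangement, DerivedResource doesn't need to know whether BaseResource calls GC.SuppressFinalize; it only overrides Dispose(bool).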
Currently we don't. I spot checked a few places:
https://github.com/dotnet/runtime/blob/b910310f066d57e5290d5c3a96d64f03dfdd63c6/src/libraries/System.Private.CoreLib/src/System/IO/BinaryWriter.cs#L86-L89
https://github.com/dotnet/runtime/blob/b910310f066d57e5290d5c3a96d64f03dfdd63c6/src/libraries/System.Private.CoreLib/src/System/Diagnostics/Tracing/DiagnosticCounter.cs#L67-L74
https://github.com/dotnet/runtime/blob/b910310f066d57e5290d5c3a96d64f03dfdd63c6/src/libraries/System.Private.CoreLib/src/System/Threading/ReaderWriterLockSlim.cs#L1249-L1252
https://github.com/dotnet/runtime/blob/b910310f066d57e5290d5c3a96d64f03dfdd63c6/src/libraries/System.Private.CoreLib/src/System/Resources/ResourceReader.cs#L140-L148
https://github.com/dotnet/runtime/blob/b910310f066d57e5290d5c3a96d64f03dfdd63c6/src/libraries/System.Private.CoreLib/src/System/Resources/ResourceSet.cs#L82-L105
Tagging subscribers to this area: @dotnet/gc
See info in area-owners.md if you want to be subscribed.
cc @dotnet/fxdc
Yes, they _should_; it's the rules of the Basic Dispose Pattern :smile:.
I wonder if we could have an analyzer validate this pattern.
CA1816 seems to check that there is a SuppressFinalize call in Dispose(), but it doesn't necessarily enforce a perfect implementation of the Basic Dispose Pattern (that is, that the call comes after the call to Dispose(true)).
But it would at least find all the places that we didn't do it at all.
What is the breaking potential of adding SuppressFinalize calls to existing types that never had them before? It will regress performance a bit for sure, and I believe it can also introduce functional breaks or memory leaks.
I'll continue to voice the contrarian view I've held for ages. :)
IMO the dispose pattern (`this.Dispose(true); GC.SuppressFinalize(this);`) shouldn't be recommended for new code. The purpose of this pattern is to support the scenario where type authors are holding raw handles to unmanaged resources, which honestly vanishingly few people should be doing nowadays. The overwhelmingly more common cases are where type authors maintain instance fields to managed objects (like `Stream`) or to managed handles (`SafeHandle`-derived types), which means that the type authors should never have to worry about finalization. But by continuing to promote the dispose pattern, and by having analyzer rules which promote it, we're forcing devs to think about this concept even though it's not necessary.
"But what if a subclassed type wants to maintain an unmanaged handle?" is sometimes brought up as a counterargument. I'm not too worried about this for two reasons. First, I anticipate very few people ever subclassing an existing unsealed type and using raw handles instead of managed handles. Second, on the off-chance that somebody does this, _they'll need to write a finalizer anyway_, and they can take that opportunity to clean up whatever unmanaged resources they were holding on to. IMO there's no benefit to forcing types all the way up and down the hierarchy to worry about this unlikely scenario.
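For contrast, here's a sketch of the simpler shape this comment is advocating (the type name is hypothetical): a type that holds only managed resources, where a plain Dispose suffices and no finalizer or GC.SuppressFinalize call is needed at all.

```csharp
using System;
using System.IO;

// Hypothetical type holding only a managed resource (a Stream).
// Because there is nothing unmanaged here -- the Stream (or a
// SafeHandle-derived field) owns its own cleanup -- this type needs
// neither a finalizer nor GC.SuppressFinalize.
public sealed class SimpleResource : IDisposable
{
    private readonly Stream _stream;

    public SimpleResource(Stream stream) => _stream = stream;

    // Plain Dispose: just forward to the owned managed resource.
    public void Dispose() => _stream.Dispose();
}
```

Note the type is sealed, which sidesteps the "what if a subclass adds an unmanaged handle" question entirely; per the argument above, an unsealed type could take the same shape and leave finalization concerns to the rare subclass that actually adds one.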
@GrabYourPitchforks: What do you think about unregistering events? What is the suggested pattern here? If an object with registered event handlers gets disposed, should the class itself unregister by setting the events to null, or should the one who registered the event also unregister it?
You'll see some discrepancies on that across the Framework. UI stacks in particular often have their own mechanisms for registering and deregistering.
As a rule of thumb, I'd recommend that the component responsible for registering the event also be responsible for unregistering the event. If I instantiate some object _Foo_, then I call Blah.SomeEvent += foo.EventHandler;, then I am also responsible for removing the event. Alternatively, if _Foo_'s ctor is responsible for registering the event, then _Foo_'s dispose routine should be responsible for deregistering the event.
But that's just a rule of thumb. If you're building a component for a particular UI framework, general recommendation is to follow whatever convention that UI framework prescribes. That'll make it easier for your component to interoperate with its sibling components and for consumers to reason about your component's behavior.
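The rule of thumb above can be sketched like this (Blah and Foo are the hypothetical names from the comment): whoever registered the handler is responsible for removing it, so when Foo's ctor subscribes, Foo's Dispose unsubscribes.

```csharp
using System;

// Hypothetical publisher with an event.
public class Blah
{
    public event EventHandler SomeEvent;
    public void Raise() => SomeEvent?.Invoke(this, EventArgs.Empty);
}

// Foo subscribes in its ctor, so its Dispose is responsible
// for unsubscribing -- same component registers and unregisters.
public sealed class Foo : IDisposable
{
    private readonly Blah _publisher;
    public int HandledCount { get; private set; }

    public Foo(Blah publisher)
    {
        _publisher = publisher;
        _publisher.SomeEvent += OnSomeEvent; // registered here...
    }

    private void OnSomeEvent(object sender, EventArgs e) => HandledCount++;

    public void Dispose()
    {
        _publisher.SomeEvent -= OnSomeEvent; // ...so unregistered here
    }
}
```

In the alternative arrangement from the comment, where some other component does `Blah.SomeEvent += foo.EventHandler;` from the outside, that component, not Foo, would own the `-=`.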