[REQUIRED] Step 1: Describe your environment
Xcode version: 12.0.1
Firebase SDK version: 6.34.0
Firebase Component: Firestore
Component version: 6.34.0
Installation method: CocoaPods
[REQUIRED] Step 2: Describe the problem
We have had a lot of crashes recently with the following error:
FIRESTORE INTERNAL ASSERTION FAILED: Invalid field path (). Paths must not be empty, begin with '.', end with '.', or contain '..' (expected !segment.empty())
After inspecting the code and the stack trace, it seems that this crash comes from an assertion raised while retrieving a query from the LevelDB cache, specifically on an orderBy. Because it comes from FromServerFormatView (and we don't have any crashes coming from FromDotSeparatedStringView), it doesn't seem directly related to our codebase.
Because all FIRESTORE INTERNAL ASSERTION FAILED errors are grouped into a single crash in Crashlytics, I can't tell you exactly how much it's impacting our app, but it's one of our most frequent crashes.
Here is the full stack trace:
```
FIRESTORE INTERNAL ASSERTION FAILED: Invalid field path (). Paths must not be empty, begin with '.', end with '.', or contain '..' (expected !segment.empty())
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0x19e625114 __exceptionPreprocess
1 libobjc.A.dylib 0x1b1e4bcb4 objc_exception_throw
2 CoreFoundation 0x19e534308 -[CFPrefsSearchListSource addManagedSourceForIdentifier:user:]
3 Foundation 0x19f866ca8 -[NSAssertionHandler handleFailureInFunction:file:lineNumber:description:]
4 Wizz 0x100a9c7b0 firebase::firestore::util::ObjcThrowHandler(firebase::firestore::util::ExceptionType, char const, char const, int, std::__1::basic_string
5 Wizz 0x100a9c300 firebase::firestore::util::Throw(firebase::firestore::util::ExceptionType, char const, char const, int, std::__1::basic_string
6 Wizz 0x100aeb968 firebase::firestore::util::internal::FailAssertion(char const, char const, int, std::__1::basic_string
7 Wizz 0x100aeba08 firebase::firestore::util::internal::FailAssertion(char const, char const, int, std::__1::basic_string
8 Wizz 0x100aa0dc8 firebase::firestore::model::FieldPath::FromServerFormatView(absl::lts_2020_02_25::string_view)::$_1::operator()() const + 122 (field_path.cc:122)
9 Wizz 0x100aa0af4 firebase::firestore::model::FieldPath::FromServerFormatView(absl::lts_2020_02_25::string_view) + 171 (field_path.cc:171)
10 Wizz 0x100aa09d8 firebase::firestore::model::FieldPath::FromServerFormat(std::__1::basic_string
11 Wizz 0x100b4bb40 firebase::firestore::remote::Serializer::DecodeOrderBy(firebase::firestore::nanopb::Reader, firebase::firestore::_google_firestore_v1_StructuredQuery_Order const&) const + 1420 (string:1420)
12 Wizz 0x100b4a930 firebase::firestore::remote::Serializer::DecodeOrderBys(firebase::firestore::nanopb::Reader, firebase::firestore::_google_firestore_v1_StructuredQuery_Order, unsigned int) const + 1334 (serializer.cc:1334)
13 Wizz 0x100b4a174 firebase::firestore::remote::Serializer::DecodeQueryTarget(firebase::firestore::nanopb::Reader, firebase::firestore::_google_firestore_v1_Target_QueryTarget const&) const + 4086 (memory:4086)
14 Wizz 0x100b0bfc4 firebase::firestore::local::LocalSerializer::DecodeTargetData(firebase::firestore::nanopb::Reader, firebase::firestore::_firestore_client_Target const&) const + 271 (local_serializer.cc:271)
15 Wizz 0x100b04efc firebase::firestore::local::LevelDbTargetCache::DecodeTarget(absl::lts_2020_02_25::string_view) + 2608 (memory:2608)
16 Wizz 0x100b05178 firebase::firestore::local::LevelDbTargetCache::EnumerateTargets(std::__1::function
18 Wizz 0x100b156a8 firebase::firestore::local::LruGarbageCollector::RunGarbageCollection(std::__1::unordered_map
19 Wizz 0x100b154ec firebase::firestore::local::LruGarbageCollector::Collect(std::__1::unordered_map
20 Wizz 0x100b14e78 std::__1::__function::__func
21 Wizz 0x100afd74c firebase::firestore::local::LevelDbPersistence::RunInternal(absl::lts_2020_02_25::string_view, std::__1::function
23 Wizz 0x100ac12a8 std::__1::__function::__func
24 Wizz 0x100a7db70 firebase::firestore::util::AsyncQueue::ExecuteBlocking(std::__1::function
26 libdispatch.dylib 0x19e25e280 _dispatch_client_callout
27 libdispatch.dylib 0x19e20356c _dispatch_continuation_pop$VARIANT$mp
28 libdispatch.dylib 0x19e21428c _dispatch_source_invoke$VARIANT$mp
29 libdispatch.dylib 0x19e206e70 _dispatch_lane_serial_drain$VARIANT$mp
30 libdispatch.dylib 0x19e207a84 _dispatch_lane_invoke$VARIANT$mp
31 libdispatch.dylib 0x19e211518 _dispatch_workloop_worker_thread
32 libsystem_pthread.dylib 0x1e3f085a4 _pthread_wqthread
33 libsystem_pthread.dylib 0x1e3f0b874 start_wqthread
```
We've encountered the same issue. The full stack trace is pretty much the same.
The unfortunate part is that when this bug appears, it consistently crashes our app until the user re-installs it.
I wonder if there's any workaround for this, e.g. when we hit the exception, can we clear the LevelDB storage so it doesn't happen again?
Xcode version: 12.0
Firebase SDK version: 6.6.4
Installation method: CocoaPods
Full stacktrace:
```
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0x1a8a1a5ac __exceptionPreprocess
1 libobjc.A.dylib 0x1bca9442c objc_exception_throw
2 CoreFoundation 0x1a89231b4 -[CFPrefsSearchListSource addManagedSourceForIdentifier:user:]
3 Foundation 0x1a9cd3c80 -[NSAssertionHandler handleFailureInFunction:file:lineNumber:description:]
4 App 0x106acd12c firebase::firestore::util::ObjcThrowHandler(firebase::firestore::util::ExceptionType, char const*, char const*, int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 59 (exception_apple.mm:59)
5 App 0x106accde0 (Missing)
6 App 0x106b1f65c firebase::firestore::util::internal::FailAssertion(char const*, char const*, int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, char const*) + 42 (hard_assert.cc:42)
7 App 0x106b1f6fc (Missing)
8 App 0x106ad1a60 firebase::firestore::model::FieldPath::FromServerFormatView(absl::lts_2019_08_08::string_view)::$_1::operator()() const + 122 (field_path.cc:122)
9 App 0x106ad17b4 (Missing)
10 App 0x106ad16a0 firebase::firestore::model::FieldPath::FromServerFormat(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 113 (field_path.cc:113)
11 App 0x106b852f4 firebase::firestore::remote::Serializer::DecodeOrderBy(firebase::firestore::nanopb::Reader*, firebase::firestore::_google_firestore_v1_StructuredQuery_Order const&) const + 1420 (string:1420)
12 App 0x106b84298 firebase::firestore::remote::Serializer::DecodeOrderBys(firebase::firestore::nanopb::Reader*, firebase::firestore::_google_firestore_v1_StructuredQuery_Order*, unsigned int) const + 1306 (serializer.cc:1306)
13 App 0x106b83adc firebase::firestore::remote::Serializer::DecodeQueryTarget(firebase::firestore::nanopb::Reader*, firebase::firestore::_google_firestore_v1_Target_QueryTarget const&) const + 4086 (memory:4086)
14 App 0x106b3ffb8 (Missing)
15 App 0x106b38b88 firebase::firestore::local::LevelDbTargetCache::DecodeTarget(absl::lts_2019_08_08::string_view) + 2608 (memory:2608)
16 App 0x106b38e04 (Missing)
17 App 0x106b49df8 firebase::firestore::local::LruGarbageCollector::SequenceNumberForQueryCount(int) + 1831 (functional:1831)
18 App 0x106b49968 (Missing)
19 App 0x106b49800 firebase::firestore::local::LruGarbageCollector::Collect(std::__1::unordered_map<int, firebase::firestore::local::TargetData, std::__1::hash<int>, std::__1::equal_to<int>, std::__1::allocator<std::__1::pair<int const, firebase::firestore::local::TargetData> > > const&) + 128 (lru_garbage_collector.cc:128)
20 App 0x106b49180 std::__1::__function::__func<std::__1::enable_if<!(std::is_same<void, decltype(fp0())>::value), decltype(fp0())>::type firebase::firestore::local::Persistence::Run<firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16>(absl::lts_2019_08_08::string_view, firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16)::'lambda'(), std::__1::allocator<std::__1::enable_if<!(std::is_same<void, decltype(fp0())>::value), decltype(fp0())>::type firebase::firestore::local::Persistence::Run<firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16>(absl::lts_2019_08_08::string_view, firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16)::'lambda'()>, void ()>::operator()() + 148 (persistence.h:148)
21 App 0x106b312c0 firebase::firestore::local::LevelDbPersistence::RunInternal(absl::lts_2019_08_08::string_view, std::__1::function<void ()>) + 2592 (memory:2592)
22 App 0x106b433a8 (Missing)
23 App 0x106af5bc4 (Missing)
24 App 0x106aab574 firebase::firestore::util::AsyncQueue::ExecuteBlocking(std::__1::function<void ()> const&) + 957 (atomic:957)
25 App 0x106acd57c (Missing)
26 libdispatch.dylib 0x1a8618ac8 _dispatch_client_callout
27 libdispatch.dylib 0x1a861bd60 _dispatch_continuation_pop
28 libdispatch.dylib 0x1a862d328 _dispatch_source_invoke
29 libdispatch.dylib 0x1a861fad4 _dispatch_lane_serial_drain
30 libdispatch.dylib 0x1a8620734 _dispatch_lane_invoke
31 libdispatch.dylib 0x1a862a528 _dispatch_workloop_worker_thread
32 libsystem_pthread.dylib 0x1eff74908 _pthread_wqthread
33 libsystem_pthread.dylib 0x1eff7b77c start_wqthread
```
Hi @dconeybe, did you have the chance to explore this issue?
I'm available if you need more data or context.
Thanks
Sorry for the slow updates here. @canyousayyes or @gautier-gdx - Can you let me know if you use limitToLast?
@schmidt-sebastian We do not use any limitToLast in our code base.
Also for your reference, we have enabled persistence in Firestore, and use the default cache size (10MB).
I am not familiar with the internal structure of the Firebase SDK, but looking into the code of LruGarbageCollector::RunGarbageCollection, it seems the crash happens in this logic:
```cpp
Target Serializer::DecodeQueryTarget(...) {
  OrderByList order_by;
  if (query.order_by_count > 0) {
    order_by = DecodeOrderBys(reader, query.order_by, query.order_by_count);
  }
}
```
For some reason, if query.order_by_count > 0 but any of the query.order_by entries has an empty field.field_path string, it causes a crash.
Any chance a check could be added inside the if (query.order_by_count > 0) clause so that it does not try to decode an invalid query.order_by and crash the whole app?
BTW, when Firestore is running GC I don't see any logs printed, even though I set Firestore.enableLogging(true). Did I miss something?
@gautier-gdx @schmidt-sebastian May I also ask what the consequences of disabling garbage collection entirely would be?
(i.e. setting FirestoreSettings.cacheSizeBytes to kFIRFirestoreCacheSizeUnlimited)
Does it mean:
Given that quite a few of our users are hitting this crash, we would like to have a workaround soon ...
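For context, disabling garbage collection on our side would just mean bumping the cache size setting to unlimited, roughly like this (a sketch based on the iOS SDK's Swift settings API; settings have to be applied before the instance is first used):

```swift
import FirebaseFirestore

// Sketch: disable LRU garbage collection by making the local cache unlimited.
// Must run before any other call on this Firestore instance.
let settings = FirestoreSettings()
settings.isPersistenceEnabled = true
settings.cacheSizeBytes = FirestoreCacheSizeUnlimited
Firestore.firestore().settings = settings
```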
@schmidt-sebastian We do not use limitToLast either
@canyousayyes I can't tell you, but if 2. is true then I personally have a lot of users crashing from not having enough space left on the device because of persistence (see #6727), so I would not advise using kFIRFirestoreCacheSizeUnlimited.
@canyousayyes If kFIRFirestoreCacheSizeUnlimited is used, persistence storage will not be cleared upon app restart. You can clear persistence manually (using clearPersistence()) or ask the user to re-install the app.
I will continue to look at this and update this issue as we come up with a fix. If this specific decoding issue is the only issue that causes trouble, then we can certainly look at ignoring this specific failure.
@schmidt-sebastian Thanks! Please keep us posted.
For clearPersistence(), the docs say it also removes pending writes, which would introduce data loss since a significant number of our app's users use it offline.
I wonder if it's possible to check whether there are pending writes / un-synced data on the client side, or to clean only the cache storage?
clearPersistence() does also remove all pending writes. You can potentially use waitForPendingWrites() in combination with clearPersistence() to only clear the cache once all writes are sent.
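A rough sketch of what that combination could look like in Swift (illustrative only; the helper name is made up, waitForPendingWrites() only completes while the client is online, and clearPersistence() requires the instance to be terminated or not yet started):

```swift
import FirebaseFirestore

// Illustrative recovery flow: flush outstanding writes, stop the client,
// then wipe the on-disk cache.
func resetLocalCache(_ db: Firestore, completion: @escaping (Error?) -> Void) {
    db.waitForPendingWrites { error in
        if let error = error { return completion(error) }
        db.terminate { error in
            if let error = error { return completion(error) }
            db.clearPersistence(completion: completion)
        }
    }
}
```

After clearing, a new instance can be obtained from Firestore.firestore() before issuing further queries.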
In the meantime, I will do more investigation here.
Unfortunately, we haven't been able to determine a root cause given the information that is exposed by the stacktrace. We will add more logging in the next release (which really means "next next" since we just cut a release).
@schmidt-sebastian Thanks for the update.
For the more logging part, does it mean we have to upgrade our SDK to the new version (when it's released), wait until we have users who hit the same crash, then report here with the new stacktrace / logs so that you can continue the investigation?
@canyousayyes Those are the steps. Unfortunately, I was not able to figure out the root cause based on the current stack trace.
@schmidt-sebastian thanks to the new update I was able to get more info from our crashes. Here's the full stack trace:
```
FIRESTORE INTERNAL ASSERTION FAILED: Target proto failed to parse: Invalid argument: Invalid field path (). Paths must not be empty, begin with '.', end with '.', or contain '..', message: <Target 0x16b91a060>: { target_id: 17886 last_listen_sequence_number: 170635 query { parent: "projects/[OUR-PROJECT]/databases/(default)/documents" structured_query { from { collection_id: "messages" } where { composite_filter { op: AND filters { field_filter { field { field_path: "compositeIndex" } op: EQUAL value { string_value: "BET1bJITtNfxlQ5mm6sf9Hw9PJG3_zg7FgxLJ2fMAX0WedVf58dX9Ce52" } } } filters { field_filter { field { field_path: "creationDate" } op: GREATER_THAN value { timestamp_value { seconds: 1605285000 nanos: 766999000 } } } } } } order_by { field { field_path: "creationDate" } direction: DESCENDING } order_by { direction: DESCENDING } limit { value: 20 } } } }
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0x1b033c878 (Missing)
1 libobjc.A.dylib 0x1c4842c50 objc_exception_throw
2 CoreFoundation 0x1b0242000 (Missing)
3 Foundation 0x1b1627728 (Missing)
4 FirebaseFirestore 0x105fbeee8 firebase::firestore::util::ObjcThrowHandler(firebase::firestore::util::ExceptionType, char const*, char const*, int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 59 (exception_apple.mm:59)
5 FirebaseFirestore 0x105fbea38 firebase::firestore::util::Throw(firebase::firestore::util::ExceptionType, char const*, char const*, int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&) + 91 (exception.cc:91)
6 FirebaseFirestore 0x1060139c8 firebase::firestore::util::internal::FailAssertion(char const*, char const*, int, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, char const*) + 42 (hard_assert.cc:42)
7 FirebaseFirestore 0x10602e90c firebase::firestore::local::LevelDbTargetCache::DecodeTarget(absl::lts_2020_02_25::string_view) + 394 (leveldb_target_cache.cc:394)
8 FirebaseFirestore 0x10602eb0c firebase::firestore::local::LevelDbTargetCache::EnumerateTargets(std::__1::function<void (firebase::firestore::local::TargetData const&)> const&) + 1871 (functional:1871)
9 FirebaseFirestore 0x106040260 firebase::firestore::local::LruGarbageCollector::SequenceNumberForQueryCount(int) + 1831 (functional:1831)
10 FirebaseFirestore 0x10603fdd0 firebase::firestore::local::LruGarbageCollector::RunGarbageCollection(std::__1::unordered_map<int, firebase::firestore::local::TargetData, std::__1::hash<int>, std::__1::equal_to<int>, std::__1::allocator<std::__1::pair<int const, firebase::firestore::local::TargetData> > > const&) + 157 (lru_garbage_collector.cc:157)
11 FirebaseFirestore 0x10603fc14 firebase::firestore::local::LruGarbageCollector::Collect(std::__1::unordered_map<int, firebase::firestore::local::TargetData, std::__1::hash<int>, std::__1::equal_to<int>, std::__1::allocator<std::__1::pair<int const, firebase::firestore::local::TargetData> > > const&) + 142 (lru_garbage_collector.cc:142)
12 FirebaseFirestore 0x10603f3fc std::__1::__function::__func<std::__1::enable_if<!(std::is_same<void, decltype(fp0())>::value), decltype(fp0())>::type firebase::firestore::local::Persistence::Run<firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16>(absl::lts_2020_02_25::string_view, firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16)::'lambda'(), std::__1::allocator<std::__1::enable_if<!(std::is_same<void, decltype(fp0())>::value), decltype(fp0())>::type firebase::firestore::local::Persistence::Run<firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16>(absl::lts_2020_02_25::string_view, firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*)::$_16)::'lambda'()>, void ()>::operator()() + 148 (persistence.h:148)
13 FirebaseFirestore 0x106026d28 firebase::firestore::local::LevelDbPersistence::RunInternal(absl::lts_2020_02_25::string_view, std::__1::function<void ()>) + 2592 (memory:2592)
14 FirebaseFirestore 0x1060393f0 firebase::firestore::local::LocalStore::CollectGarbage(firebase::firestore::local::LruGarbageCollector*) + 1831 (functional:1831)
15 FirebaseFirestore 0x105fe8f68 std::__1::__function::__func<firebase::firestore::core::FirestoreClient::ScheduleLruGarbageCollection()::$_6, std::__1::allocator<firebase::firestore::core::FirestoreClient::ScheduleLruGarbageCollection()::$_6>, void ()>::operator()() + 305 (firestore_client.cc:305)
16 FirebaseFirestore 0x105f9d0a0 firebase::firestore::util::AsyncQueue::ExecuteBlocking(std::__1::function<void ()> const&) + 957 (atomic:957)
17 FirebaseFirestore 0x106095168 firebase::firestore::util::Task::ExecuteAndRelease() + 1859 (functional:1859)
18 libdispatch.dylib 0x1aff30db0 (Missing)
19 libdispatch.dylib 0x1aff3412c (Missing)
20 libdispatch.dylib 0x1aff45c08 (Missing)
21 libdispatch.dylib 0x1aff37fd8 (Missing)
22 libdispatch.dylib 0x1aff38c5c (Missing)
23 libdispatch.dylib 0x1aff42d78 (Missing)
24 libsystem_pthread.dylib 0x1f9674804 (Missing)
25 libsystem_pthread.dylib 0x1f967b75c (Missing)
```
NB: this crash is only one example; we get the same crash from different queries across various collections.
Reformatted, that looks like this:
```
{
  target_id: 17886
  last_listen_sequence_number: 170635
  query {
    parent: "projects/[OUR-PROJECT]/databases/(default)/documents"
    structured_query {
      from { collection_id: "messages" }
      where {
        composite_filter {
          op: AND
          filters {
            field_filter {
              field { field_path: "compositeIndex" }
              op: EQUAL
              value { string_value: "BET1bJITtNfxlQ5mm6sf9Hw9PJG3_zg7FgxLJ2fMAX0WedVf58dX9Ce52" }
            }
          }
          filters {
            field_filter {
              field { field_path: "creationDate" }
              op: GREATER_THAN
              value { timestamp_value { seconds: 1605285000 nanos: 766999000 } }
            }
          }
        }
      }
      order_by {
        field { field_path: "creationDate" }
        direction: DESCENDING
      }
      order_by {
        direction: DESCENDING
      }
      limit { value: 20 }
    }
  }
}
```
It looks like your query has at least this much in it:
```swift
let minDate = Timestamp(seconds: 1605285000, nanoseconds: 766999000)

db.collection("messages")
    .whereField("compositeIndex", isEqualTo: "BET1bJITtNfxlQ5mm6sf9Hw9PJG3_zg7FgxLJ2fMAX0WedVf58dX9Ce52")
    .whereField("creationDate", isGreaterThan: minDate)
    .limit(to: 20)
```
Is there a query like this in your code and are there any other filter/orderBy statements on it?
Hi, @wilhuff, thanks for the quick reply.
Yes, there is one like this. compositeIndex and creationDate are variables, so those can change.
There are no other filter/orderBy statements on it; I'm using it exactly like that.
This query is only used with a snapshot listener, and the other queries that hit the same type of crash are too, so it might come from that. I believe it comes from the query used to get the documents from the cache.
Another hint is that the queries that are crashing are always ordered by a Firestore Timestamp property, so maybe it's related to the way you retrieve the Timestamp? I'm only guessing here.
Wait, are you sure there's no orderBy? Something like
db.collection("messages")
.where("compositeIndex", isEqualTo: x)
.where("creationDate", isGreaterThan: y)
.orderBy("creationDate", descending: true)
.limit(20)
Yes, excuse me, there is always the same .order(by: "creationDate", descending: true) where creationDate is a Firestore Timestamp; that's what I meant in my previous message. I think this is what causes my app to crash: trying to retrieve a query with an orderBy on a Timestamp from the cache in a snapshot listener.
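To make that concrete, the crashing pattern looks roughly like this (a sketch only; the compositeIndex value and the timestamp are placeholders):

```swift
import FirebaseFirestore

// Placeholder reconstruction of the pattern: a filtered query ordered by a
// Timestamp field, attached to a snapshot listener.
let db = Firestore.firestore()
let minDate = Timestamp(seconds: 1605285000, nanoseconds: 766999000)

let registration = db.collection("messages")
    .whereField("compositeIndex", isEqualTo: "SOME-COMPOSITE-INDEX")
    .whereField("creationDate", isGreaterThan: minDate)
    .order(by: "creationDate", descending: true)
    .limit(to: 20)
    .addSnapshotListener { snapshot, error in
        // The assertion itself fires later, during LRU garbage collection of
        // the cached target, not inside this callback.
        guard let snapshot = snapshot else { return }
        print("received \(snapshot.documents.count) messages")
    }
```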
@wilhuff do you have any update on this issue?
It's still the biggest crash in our app, impacting more than 1% of our user base (50k DAU). And as @canyousayyes said before, it's a nasty one: once a user hits it, they keep hitting it indefinitely until they re-install the app.
If you need further information, I'm here to help.
We will likely change the garbage collector to skip over targets it cannot serialize, which should alleviate this crash.
@schmidt-sebastian Thanks for the fix! Are there any plans for a new release that includes it?
@canyousayyes I updated the milestone on this issue. It's scheduled for the upcoming 7.3.0 release targeted for next week.