I get a dataset like so:
FOR d IN coll1
  FILTER d.key == @Key
  LET VE = (
    FOR v, e, p IN 1..3 OUTBOUND d._id edges_coll OPTIONS {
      bfs: true,
      uniqueVertices: 'global'
    }
    RETURN { [SPLIT(v._id, "/")[0]]: [v] }
  )
  RETURN MERGE(d, { VE })
My single edge collection links documents from several different collections with 1:n relationships:
"processes/Process_1->dpia/TESTDPIA",
"processes/Process_1->risks/TESTRISK",
"processes/Process_1->erasure_period/7149493",
"processes/Process_1->erasure_period/7149493",
"processes/Process_1->data_categories/7198090",
"processes/Process_1->data_subject/7198675",
"processes/Process_1->data_recepient/HOD",
"processes/Process_1->data_subject/7198671",
"processes/Process_1->software_services/7252898",
"processes/Process_1->data_categories/7198583",
"processes/Process_1->data_categories/7198154",
"processes/Process_1->data_recepient/DEP",
"processes/Process_1->egal_basis/7198999",
"processes/Process_1->countries/7196637"
As you can see, there are several vertices from the same collection, so the result now contains multiple entries with the same key. How do I group them into one object or array per collection instead? By splitting the id with RETURN {[SPLIT(v._id, "/")[0]]:[v]}, I get a key named after the collection together with the row object, but I would like all rows from the same collection collected under a single key named after the collection:
"Objects": [
{
Object
}, {
Object
}
...
]
I work with Go and would like to save the result per edge target collection in a struct, but the results only contain the id, which is the only way for me to infer the collection.
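Once the query groups vertices under their collection names, the result can be decoded in Go in two passes with the standard library: first into json.RawMessage slices keyed by collection, then each group into its own struct type. This is a minimal sketch under that assumption; the Risk struct, its fields, and the sample payload are hypothetical, not taken from the actual data model:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Risk is a hypothetical struct for documents from the "risks" collection;
// the field names here are assumptions for illustration only.
type Risk struct {
	Key  string `json:"_key"`
	Name string `json:"name"`
}

// decodeGrouped defers per-collection decoding with json.RawMessage,
// so each group can later be unmarshalled into its own struct type.
func decodeGrouped(data []byte) (map[string][]json.RawMessage, error) {
	var grouped map[string][]json.RawMessage
	err := json.Unmarshal(data, &grouped)
	return grouped, err
}

func main() {
	// Hypothetical query result: one object whose keys are collection
	// names and whose values are arrays of the grouped vertex documents.
	data := []byte(`{
		"risks": [{"_key": "TESTRISK", "name": "Some risk"}],
		"dpia":  [{"_key": "TESTDPIA", "name": "Some DPIA"}]
	}`)

	grouped, err := decodeGrouped(data)
	if err != nil {
		panic(err)
	}

	// Second pass: decode one group into its collection-specific type.
	var risks []Risk
	for _, raw := range grouped["risks"] {
		var r Risk
		if err := json.Unmarshal(raw, &r); err != nil {
			panic(err)
		}
		risks = append(risks, r)
	}
	fmt.Println(risks[0].Key) // TESTRISK
}
```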
Just stumbled over your question, but I find it hard to understand your desired output format. I created a bit of test data (2 vertex collections, 1 edge collection):

Using the following query with bind parameters { "id": "coll1/A" }
RETURN MERGE(
FOR v IN 1..3 OUTBOUND @id edges_coll
OPTIONS { bfs: true, uniqueVertices: 'global' }
COLLECT coll = PARSE_IDENTIFIER(v._id).collection INTO keys = v._key
RETURN { [coll]: keys }
)
… the result is as follows:
[
{
"coll1": [ "B", "D", "C", "E", "F", "G" ],
"coll2": [ "B", "C" ]
}
]
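A result in this shape (an array holding one object that maps collection names to lists of `_key` values) can be decoded in Go with nothing but the standard library. A minimal sketch, where keysByCollection is a hypothetical helper name:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// keysByCollection decodes a query result of the shape
// [ { "coll1": ["B", ...], "coll2": [...] } ] and returns the
// single inner object mapping collection names to _key lists.
func keysByCollection(data []byte) (map[string][]string, error) {
	var result []map[string][]string
	if err := json.Unmarshal(data, &result); err != nil {
		return nil, err
	}
	if len(result) == 0 {
		// An empty cursor result yields an empty map.
		return map[string][]string{}, nil
	}
	return result[0], nil
}

func main() {
	data := []byte(`[{"coll1": ["B", "D", "C"], "coll2": ["B", "C"]}]`)
	m, err := keysByCollection(data)
	if err != nil {
		panic(err)
	}
	fmt.Println(m["coll2"]) // [B C]
}
```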
Hi @Fruchtgummi,
Did you have a chance to read Simran-B's latest comment? Was it helpful?
I'm closing this issue because of inactivity. Feel free to re-open if my answer does not suffice.