I have a collection with documents in this form:
{
    "name" : "John Smith",
    "store_affiliation" : {
        "stores" : {
            "ABCD" : {
                "role" : "General Manager",
                "startdate" : ISODate("1970-01-01T00:00:00.000+0000"),
                "enddate" : ISODate("1980-01-01T00:00:00.000+0000"),
                "permissions" : "GM"
            },
            "1234" : {
                "role" : "Owner",
                "startdate" : ISODate("1970-01-01T00:00:00.000+0000"),
                "enddate" : null,
                "permissions" : "ALL"
            },
            "4321" : {
                "role" : "Owner",
                "startdate" : ISODate("1990-01-01T00:00:00.000+0000"),
                "enddate" : null,
                "permissions" : "ALL"
            }
        }
    }
}
...but I need the list of stores to be in this form (an array of "stores"):
{ "name" : "John Smith",
"store_affiliation" : {
"stores" : [
{
"store_code" : "ABCD",
"role" : "General Manager",
"startdate" : ISODate("1970-01-01T00:00:00.000+0000"),
"enddate" : ISODate("1980-01-01T00:00:00.000+0000"),
"permissions" : "GM"
},
{
"store_code" : "1234",
"role" : "Owner",
"startdate" : ISODate("1970-01-01T00:00:00.000+0000"),
"enddate" : null,
"permissions" : "ALL"
},
{
"shop_id" : "4321",
"role" : "Owner",
"startdate" : ISODate("1990-01-01T00:00:00.000+0000"),
"enddate" : null,
"permissions" : "ALL"
}
]
}
I have researched using $project, $group and $push in an aggregate pipeline, but I suspect aggregate may be a dead end, because I'm not after a query result; I'm trying to modify every document (thousands) in the collection permanently.
You can try the below aggregation pipeline in the 3.4 version. The aggregation turns the stores embedded document into an array of key-value pairs using $objectToArray, then uses $map to build the transformed array with the new store_code field while keeping all the existing fields. A bulk update then writes the new stores structure back to the collection.
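Here is a minimal sketch of that approach for the mongo shell, assuming the collection is named employees (substitute your own collection name) and MongoDB 3.4+ for $objectToArray; the $map stage lists the store fields explicitly rather than using $mergeObjects, which only became available in 3.6.

// Build the new array shape with an aggregation, then persist it with bulkWrite.
var ops = [];

db.employees.aggregate([
  { "$addFields": {
    "store_affiliation.stores": {
      "$map": {
        "input": { "$objectToArray": "$store_affiliation.stores" },
        "as": "kv",
        "in": {
          "store_code": "$$kv.k",
          "role": "$$kv.v.role",
          "startdate": "$$kv.v.startdate",
          "enddate": "$$kv.v.enddate",
          "permissions": "$$kv.v.permissions"
        }
      }
    }
  }}
]).forEach(function(doc) {
  // Queue an update that replaces the embedded document with the new array.
  ops.push({
    "updateOne": {
      "filter": { "_id": doc._id },
      "update": { "$set": { "store_affiliation.stores": doc.store_affiliation.stores } }
    }
  });
  // Flush in batches so the ops array stays small across thousands of documents.
  if (ops.length === 500) {
    db.employees.bulkWrite(ops);
    ops = [];
  }
});

if (ops.length > 0) {
  db.employees.bulkWrite(ops);
}

Note that the aggregation on its own only shapes its output; the bulkWrite calls are what make the change permanent in the collection.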