I have the following setup: I parse an API response with a structure similar to this (I'm oversimplifying a bit):
{
  orders: [
    {
      // some data
      customers: [
        {
          // some data
          addresses: [
            {
              id: 436456445
              // some data
            },
            // Other addresses
          ]
        }
      ]
    },
    {
      customers: [
        {
          addresses: [
            {
              id: 436456445
              // some data
            }
          ]
        }
      ]
    }
  ]
}
Note that a specific address appears twice in the response (same id).
I traverse the structure recursively (deep traversal, hope that's the right translation) and build entities from it. All goes well until the save step, when the insert throws a duplicate key exception.
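For reference, the deduplication I could do during traversal would look roughly like this (a minimal sketch with hypothetical `Address`/`AddressRegistry` class names, not my real mapping code): reuse one entity instance per id, so the cascade only ever sees a single object for a given address.

```php
<?php

// Hypothetical entity standing in for my real Address entity.
class Address
{
    public function __construct(public int $id) {}
}

// Identity map keyed by id: the same id always yields the same instance.
class AddressRegistry
{
    /** @var array<int, Address> */
    private array $byId = [];

    public function get(int $id): Address
    {
        // Return the existing instance for this id, or create and remember it.
        return $this->byId[$id] ??= new Address($id);
    }
}

$registry = new AddressRegistry();
$a = $registry->get(436456445); // first occurrence: new instance
$b = $registry->get(436456445); // second occurrence: same instance
var_dump($a === $b); // bool(true)
```

But as explained below, the real mapping is dynamic with hundreds of structures, so wiring a registry like this into every nesting level by hand isn't feasible.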
An obvious solution would be to normalize the structure so the addresses come before the orders and remap everything around that, but as I said, I oversimplified: the real case has hundreds of structures that map dynamically into entities, with far too many special cases to treat manually. That makes "hunting" for a logical mapping impossible. Catching the exception (as suggested here: insert ignore on duplicate entries in Doctrine2/Symfony2) closes my entity manager, and would mean starting from scratch on the same data, with the same problem emerging (not 100% sure on this... am I missing something here?).
What I need is to make use of MySQL's INSERT ... ON DUPLICATE KEY UPDATE or INSERT IGNORE (I prefer ON DUPLICATE KEY UPDATE, because INSERT IGNORE ignores a bit too many things). Any idea how to do that from Doctrine?
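To illustrate what I mean, the statement I'd like to end up with looks like this (a sketch with an assumed `addresses` table and column names; the idea would be to run it through the raw DBAL connection, bypassing the ORM for entities that may already exist):

```php
<?php

// Build a MySQL upsert for the (assumed) addresses table. On a duplicate
// key, "id = id" makes the statement a harmless no-op update instead of
// throwing a duplicate key error.
function buildAddressUpsert(array $columns): string
{
    $cols   = implode(', ', $columns);
    $params = implode(', ', array_fill(0, count($columns), '?'));

    return "INSERT INTO addresses ($cols) VALUES ($params)"
         . " ON DUPLICATE KEY UPDATE id = id";
}

echo buildAddressUpsert(['id', 'street', 'city']);
// INSERT INTO addresses (id, street, city) VALUES (?, ?, ?) ON DUPLICATE KEY UPDATE id = id
```

The question is how to get Doctrine to emit something like this instead of a plain INSERT when it cascades the persist.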
Important note: I'm persisting and flushing the "order"; the rest of the entities are persisted by cascade.