- This is my event schema:
EventA (FieldA string, FieldB string, FieldC string)
EventB (FieldD string, FieldE string)
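In Esper EPL terms, the two schemas above would be declared roughly like this (untested sketch; all fields typed as string, as in my schema):

```
create schema EventA (FieldA string, FieldB string, FieldC string);
create schema EventB (FieldD string, FieldE string);
```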
- I want a query like:
- divide events into group windows: group EventA by FieldA, FieldB and group EventB by FieldE
- If events occur in this order:
EventA -> EventA -> EventA -> EventB -> EventB
and distinct(EventA.FieldC) >= 3,
and every EventB.FieldE equals every EventA.FieldB
(the join part), then generate a correlated event.
The problem is that the count of EventA (3) applies per group window, not per stream. For example:
["1","2","3"], ["1","2","4"], ["1","2","5"]
can trigger "EventA -> EventA -> EventA"
(3 events in one group window), while ["1","2","3"], ["1","3","4"], ["1","4","5"]
cannot (3 group windows, each with only 1 event). The same applies to the 2 EventB events.
So the complete triggering example is:
["1","2","3"],["1","2","4"],["1","2","5"],["a","2"],["b","2"]
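The per-group-window counting part of my intent can be approximated, for EventA alone, by a grouped aggregation like this untested EPL sketch (the 30-second window length is an arbitrary assumption, and this does not yet express the EventA -> EventB ordering or the join to EventB.FieldE):

```
select FieldA, FieldB, count(distinct FieldC) as distinctC
from EventA#time(30 sec)
group by FieldA, FieldB
having count(distinct FieldC) >= 3
```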
I find it hard to describe this situation even in natural language, and I have no idea how to express it in CEP.
I use a context in Esper so that the count applies per group window rather than per stream (in Siddhi, a partition). That works fine for EventA or EventB on its own, but if EventA and EventB have to participate in one pattern, the two independent contexts cannot be used together. I also tried context nesting, like:

create context PartAB
  context PartA partition by FieldA and FieldB from EventA,
  context PartB partition by FieldE from EventB

but it doesn't seem to work.
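One idea I considered (untested, and I am not sure it is valid): since the join only relates EventA.FieldB to EventB.FieldE, a single keyed segmented context could key both streams on that one field, so their key arity matches:

```
create context PartByJoinKey
partition by FieldB from EventA, FieldE from EventB
```

Within such a context both streams would share the same partition, so a pattern over EventA and EventB could run per key; the extra grouping of EventA by FieldA would then still have to be handled inside the statement itself.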
Thanks to anyone who can help with this situation.