I'm having trouble understanding a basic concept of error handling with chained promises. To learn the rules, I wrote a simple example and guessed what the result would be, but unfortunately it doesn't behave as I thought it would. I have read multiple articles on the subject, but perhaps I'm missing some details because of my poor English.
Anyway, here is my code:
var promiseStart = $q.when("start");

var promise1 = promiseStart.then(function () {
    return Serviceforpromise1.get();
});

var promise2 = promise1.then(function (data1) {
    return Serviceforpromise2.get(data1);
}, function (error) {
    return $q.reject();
});

var promiseend = promise2.then(function (data2) {
    return data2;
}, function (error) {
    return error;
});

return promiseend;
I know this could be written much better, but it's just for illustration. Here is the code of the Serviceforpromise1 function:
function Serviceforpromise1() {
    ...
    return $http.get(*whatever*).then(function (data) {
        return data;
    }, function (error) {
        return $q.reject();
    });
}
Consider only the case where Serviceforpromise1 fails. A $q.reject() is sent back to the main chain, so I expected the error callback passed to promise1.then() to be called, and that worked as expected. For the example, I then decided to forward the error to promise2.then(), so in that error callback I added the line return $q.reject();. But the second error callback (the one passed to promise2.then()) is never reached, and I don't understand why, since, just like in Serviceforpromise1, I returned a rejected promise!
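To make my expectation concrete, here is a stripped-down version of what I think should happen, using only $q (assuming it is injected as usual); the rejection value "boom" and the logged strings are made up for illustration:

var start = $q.reject("boom"); // stands in for Serviceforpromise1 failing

start.then(null, function (error) {
    console.log("first error callback:", error);  // this one is called
    return $q.reject(error);                      // pass the rejection on
}).then(function (data2) {
    console.log("success callback:", data2);      // I expect this to be skipped
}, function (error) {
    console.log("second error callback:", error); // ...and I expect this one to run
});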
I would be happy to understand in depth what is happening here. Thanks for your help.