I am trying to understand overload resolution.
First, consider this case:
struct int1 {
    int val;
    operator int&() & {
        return val;
    }
    operator const int&() const& {
        return val;
    }
};
void f(int &){} //f#1
void f(const int&){} //f#2
void test1() {
    int1 x;
    f(x);
    // Conversion sequence for f#1:
    //  - int1& --> int1::operator int&
    //  => Ranking: user-defined conversion rank
    // Conversion sequence for f#2:
    //  - int1& --> int1::operator int& --> const int&
    //  - int1& --> const int1& --> int1::operator const int&
    //  => Ranking: ambiguous because there are 2 conversion sequences, [over.best.ics]/10
    //  => user-defined conversion rank
    //
    // => No best viable function: two indistinguishable user-defined conversion ranks
}
Contrary to my (wrong) analysis, compilers agree: the call to f
is not ambiguous. Why?
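For what it's worth, here is how one can observe which overload is actually chosen; the which helper is my own, purely for observation, and as far as I can tell both gcc and clang pick the int& overload:

struct int1 {
    int val;
    operator int&() & { return val; }
    operator const int&() const& { return val; }
};

int which(int&)       { return 1; }  // plays the role of f#1
int which(const int&) { return 2; }  // plays the role of f#2

int main() {
    int1 x;
    return which(x);  // accepted; returns 1, i.e. the int& overload wins
}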
Now consider this second case, which is very similar; I just replace the int& with int&&:
#include <utility>  // for std::move

struct int2 {
    int val;
    operator int&&() && {
        return std::move(val);
    }
    operator const int&() const& {
        return val;
    }
};
void g(int &&){} // g#1
void g(const int&){} // g#2
void test2() {
    int2 x;
    g(std::move(x));
    // Conversion sequence for g#1:
    //  - int2&& --> int2::operator int&&
    //  => Ranking: user-defined conversion rank
    // Conversion sequence for g#2:
    //  - int2&& --> const int2& --> int2::operator const int&
    //  - int2&& --> int2::operator int&& --> const int&
    //  => Ranking: ambiguous because there are 2 conversion sequences, [over.best.ics]/10
    //  => user-defined conversion rank
    //
    // => No best viable function: two indistinguishable user-defined conversion ranks
}
My analysis (which is certainly wrong in this case too) similarly concludes that the call to g
is ambiguous. Unfortunately, in this second case, compilers do not agree:
- Clang (3.4.1 to 5.0), MSVC 19 2017 RTW, Zapcc 190308 (Clang derivative), ellcc (0.1.33, 0.1.34) (Clang derivative): the call to g is ambiguous;
- GCC (4.8.1 to 7.2), icc (16 to 18): the call to g is not ambiguous.
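To make the divergence concrete, here is a tagged variant of the second case; the which helper and the noted outcomes are my own observations, not from any compiler documentation:

#include <utility>

struct int2 {
    int val;
    operator int&&() && { return std::move(val); }
    operator const int&() const& { return val; }
};

int which(int&&)      { return 1; }  // plays the role of g#1
int which(const int&) { return 2; }  // plays the role of g#2

int main() {
    int2 x;
    return which(std::move(x));  // GCC/icc: compiles and returns 1 (the int&& overload);
                                 // Clang/MSVC: rejected as ambiguous
}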
What is the right analysis and which compiler is right?
Could you explain why the rule below does not apply here, or when it does apply?
[over.best.ics]/10:
If several different sequences of conversions exist that each convert the argument to the parameter type, the implicit conversion sequence associated with the parameter is defined to be the unique conversion sequence designated the ambiguous conversion sequence. For the purpose of ranking implicit conversion sequences as described in 16.3.3.2, the ambiguous conversion sequence is treated as a user-defined conversion sequence that is indistinguishable from any other user-defined conversion sequence.
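A situation where that paragraph does apply looks, if I read it right, like the example attached to it in the standard (reproduced here from memory, so check the wording against the draft); the point is that two genuinely different routes exist for the same argument-to-parameter conversion:

class B;
class A { public: A(B&); };
class B { public: operator A(); };
class C { public: C(B&); };

void f(A) {}
void f(C) {}

int main() {
    B b;
    f(b);  // ill-formed: b -> C goes via the constructor, while b -> A can go via
           // either the constructor or the conversion function, so it is the
           // "ambiguous conversion sequence"; both candidates are ranked as
           // user-defined conversions and neither wins
}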
N4713 16.3.3.2 ([over.ics.rank]) is what breaks the tie: when two user-defined conversion sequences contain the same conversion function, they are ranked by their second standard conversion sequences. So the standard conversion is the key, and gcc meets the standard's requirement.
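To see that ranking rule in isolation, here is a tiny sketch of my own; struct S and the helper h are invented purely for illustration:

struct S {
    operator int() const { return 0; }  // the single conversion function
};

void h(int)  {}  // second standard conversion: int -> int (identity)
void h(long) {}  // second standard conversion: int -> long (integral conversion)

int main() {
    h(S{});  // not ambiguous: both sequences use S::operator int, so the better
             // second standard conversion (identity) makes h(int) win
}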
Both of these examples are more complicated representations of simpler underlying concepts.

For the first call, we favor the less cv-qualified type, so int& is better than int const&.
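Reduced to the bare overloads (my own reduction of the first case):

void f(int&)       {}  // plays the role of f#1
void f(const int&) {}  // plays the role of f#2

int main() {
    int i = 0;
    f(i);  // not ambiguous: the lvalue binds to the less cv-qualified int&, so f#1 is chosen
}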
For the second call, we favor binding an rvalue to an rvalue reference over binding it to an lvalue reference, so int&& is better than int const&.
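And the second preference, again reduced to the bare overloads (my own reduction):

void g(int&&)      {}  // plays the role of g#1
void g(const int&) {}  // plays the role of g#2

int main() {
    g(42);  // not ambiguous: the rvalue prefers the rvalue reference, so g#1 is chosen
}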
In the specific examples, these preferences manifest themselves through the choice of conversion function, by way of the implicit object parameter. In the first example, we're binding the implicit object parameter to int1& for converting to int&, but to int1 const& for converting to int const&. Likewise, in the second example, we're binding the implicit object parameter to int2&& for converting to int&&, but to int2 const& for converting to int const&.

I'd call this a clang bug.