I have two libraries, `test.1` and `test.2`. Both libraries contain a single global `extern "C" void f();` function, with different implementations (just a `cout` for the test).
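Roughly, the two translation units look like this (file names and messages are illustrative, not my exact code):

```cpp
// test1.cpp -- built into libtest.1.so / libtest.1.a
#include <iostream>
extern "C" void f() { std::cout << "f() from test.1\n"; }

// test2.cpp -- built into libtest.2.so / libtest.2.a
#include <iostream>
extern "C" void f() { std::cout << "f() from test.2\n"; }
```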
I did the following tests:
Test 1: Dynamic linking
If I add `libtest.1.so` and then `libtest.2.so` in the makefile of the executable and then call `f();` in `main`, `libtest.1.so->f()` is called.
If I change the order in the makefile, `libtest.2.so->f()` is called.
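The executable itself is trivial; the comments below sketch the two link orders I tried (an approximation of what my makefile does, not its literal contents):

```cpp
// main.cpp
extern "C" void f();   // one declaration, two different definitions in the two libs

int main()
{
    f();               // which f() runs is decided purely by library order at link time
}

// roughly what the makefile ends up invoking (illustrative):
//   g++ main.cpp -L. -ltest.1 -ltest.2   // prints the test.1 message
//   g++ main.cpp -L. -ltest.2 -ltest.1   // prints the test.2 message
```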
Test 2: Static linking
Exactly the same happens with static libraries.
Test 3: Dynamic loading
As the library is loaded manually (`dlopen`/`dlsym`), everything works as expected.
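By "manually loaded" I mean something along these lines (a minimal sketch, error handling shortened; link with `-ldl` on Linux). Because I name the library explicitly, there is no ambiguity about which `f()` I get:

```cpp
#include <dlfcn.h>
#include <iostream>

int main()
{
    // explicitly pick one of the two libraries at run time
    void* lib = dlopen("./libtest.2.so", RTLD_NOW);
    if (!lib) { std::cerr << dlerror() << '\n'; return 1; }

    using f_type = void (*)();
    f_type f = reinterpret_cast<f_type>(dlsym(lib, "f"));
    if (!f) { std::cerr << dlerror() << '\n'; return 1; }

    f();            // always the f() from the library I opened
    dlclose(lib);
}
```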
I expected an error for multiple definitions, which obviously didn't happen.
Also, this does not seem to break the one-definition rule, as the situation is different.
It's also not dependency hell (not that it's related to this at all), nor any linking fiasco.
So then, what is this? Undefined behavior? Unspecified behavior? Or does it really depend on the linking order?
And is there a way to easily detect such situations?
Related questions:
- dlopen vs linking overhead
- What is the difference between dynamic linking and dynamic loading
- Is there a downside to using -Bsymbolic-functions?
- Why does the order in which libraries are linked sometimes cause errors in GCC?
- linking two shared libraries with some of the same symbols
EDIT: I did two more tests, which confirm this UB:
I added a second function `void g()` in `test.1` and NOT in `test.2`.
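That is, the sources now look roughly like this (again just a sketch):

```cpp
// test1.cpp -- now defines both f() and g()
#include <iostream>
extern "C" void f() { std::cout << "f() from test.1\n"; }
extern "C" void g() { std::cout << "g() from test.1\n"; }

// test2.cpp -- unchanged, still defines only f()
#include <iostream>
extern "C" void f() { std::cout << "f() from test.2\n"; }
```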
Using dynamic linking and `.so` libs, the same thing happens: `f` is called in the same manner, and `g` is also callable (as expected).
But using static linking now changes things: if `test.1` is before `test.2`, there are no errors and both functions from `test.1` are called.
But when the order is changed, a "multiple definition" error occurs.
It's clear that this is "no diagnostic required" (see @MarkB's answer), but it's "strange" that sometimes the error occurs and sometimes it doesn't.
Anyway, the answer is pretty clear and explains everything above: UB.