I am considering using ultrasound (an inaudible signal) as an option for determining the relative position of two mobile devices (which can be either Android or iOS devices). My app will be installed on both devices. The users will face each other at a maximum distance of 1.5 m, holding the devices toward each other.
I would like to know whether it is possible to build an efficient system in which one app sends an ultrasonic/inaudible signal and the other user's app receives it and determines that this particular user (standing very close) emitted it. A rough sketch of the sending side as I currently imagine it is shown below.
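To make the idea concrete, here is a sketch of the sending side on Android in Kotlin, encoding each payload bit as a short burst at one of two near-ultrasonic frequencies (binary FSK). The frequencies, bit duration, and sample rate are placeholder values I picked for illustration; I have not validated them on real hardware, so treat this as an assumption rather than a working transmitter:

```kotlin
import android.media.AudioAttributes
import android.media.AudioFormat
import android.media.AudioTrack
import kotlin.math.PI
import kotlin.math.sin

// Placeholder parameters (not validated on real speakers/microphones).
const val SAMPLE_RATE = 44100
const val FREQ_ZERO = 18_000.0   // Hz, encodes bit 0
const val FREQ_ONE = 19_000.0    // Hz, encodes bit 1
const val BIT_DURATION_MS = 100

// Turn a list of bits into PCM samples: one fixed-length tone burst per bit.
fun encodeBits(bits: List<Int>): ShortArray {
    val samplesPerBit = SAMPLE_RATE * BIT_DURATION_MS / 1000
    val buffer = ShortArray(bits.size * samplesPerBit)
    bits.forEachIndexed { bitIndex, bit ->
        val freq = if (bit == 0) FREQ_ZERO else FREQ_ONE
        for (i in 0 until samplesPerBit) {
            val sample = sin(2.0 * PI * freq * i / SAMPLE_RATE)
            buffer[bitIndex * samplesPerBit + i] =
                (sample * Short.MAX_VALUE).toInt().toShort()
        }
    }
    return buffer
}

// Play the generated samples through the loudspeaker via AudioTrack.
fun play(samples: ShortArray) {
    val track = AudioTrack.Builder()
        .setAudioAttributes(
            AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
                .build()
        )
        .setAudioFormat(
            AudioFormat.Builder()
                .setSampleRate(SAMPLE_RATE)
                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                .setChannelMask(AudioFormat.CHANNEL_OUT_MONO)
                .build()
        )
        .setBufferSizeInBytes(samples.size * 2)
        .setTransferMode(AudioTrack.MODE_STATIC)
        .build()
    track.write(samples, 0, samples.size)
    track.play()
}
```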
Note: in my case the sound may be audible, but the less audible it is, the better (hence the word ultrasound). Battery consumption is not important at this point (though I'd appreciate any information about it). I mainly want to know whether this is possible and how reliable such a system would be. I would also like to send a few bytes of information, if possible, and the system should still work in places with some ambient noise. A sketch of the detection side I have in mind follows below.
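On the receiving side, I imagine using the Goertzel algorithm to measure the energy at the two tone frequencies in each block of microphone samples and decide which bit was sent, with a simple energy threshold to reject blocks that contain only ambient noise. I chose Goertzel over a full FFT only because just two frequencies need to be checked per block. This is again an untested sketch with placeholder values matching the sender above (the `threshold` in particular would need per-device calibration):

```kotlin
import kotlin.math.PI
import kotlin.math.cos

// Energy at a single target frequency in a block of PCM samples
// (standard Goertzel recurrence; result is a relative, unnormalized power).
fun goertzelPower(samples: ShortArray, targetFreq: Double, sampleRate: Int): Double {
    val omega = 2.0 * PI * targetFreq / sampleRate
    val coeff = 2.0 * cos(omega)
    var sPrev = 0.0
    var sPrev2 = 0.0
    for (sample in samples) {
        val s = sample + coeff * sPrev - sPrev2
        sPrev2 = sPrev
        sPrev = s
    }
    return sPrev * sPrev + sPrev2 * sPrev2 - coeff * sPrev * sPrev2
}

// Decide one bit by comparing the energy at the two FSK tones.
// Returns null when neither tone stands out, i.e. the block is just noise.
fun decodeBit(block: ShortArray, sampleRate: Int): Int? {
    val p0 = goertzelPower(block, 18_000.0, sampleRate)  // same placeholder tones
    val p1 = goertzelPower(block, 19_000.0, sampleRate)  // as the sender sketch
    val threshold = 1e10  // placeholder, would need calibration per device
    if (maxOf(p0, p1) < threshold) return null
    return if (p1 > p0) 1 else 0
}
```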
Can anyone answer this question or share their experience with this topic?