I want to write a function that calculates the length of the common prefix (the common segment starting from the beginning) of two strings. For example:
foo:="Makan"
bar:="Makon"
The result should be 3.
foo:="Indah"
bar:="Ihkasyandehlo"
The result should be 1.
You mean like this? Please note, this compares bytes only, so it will not handle UTF-8 correctly, only ASCII.
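The code this answer originally showed is missing; a minimal byte-by-byte sketch of what it likely looked like (the function name commonPrefixLen is my own choice, not from the original):

```go
package main

import "fmt"

// commonPrefixLen counts how many leading bytes two strings share.
// It compares raw bytes, so it is only correct for ASCII input.
func commonPrefixLen(a, b string) int {
	n := 0
	for n < len(a) && n < len(b) && a[n] == b[n] {
		n++
	}
	return n
}

func main() {
	fmt.Println(commonPrefixLen("Makan", "Makon"))         // 3
	fmt.Println(commonPrefixLen("Indah", "Ihkasyandehlo")) // 1
}
```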
Note that if you were working with Unicode characters, the result could be quite different: counting matching bytes can leave you in the middle of a multi-byte character. Try, for instance, decoding the string at the split point with
utf8.DecodeRuneInString()
. Here, the result would be
utf8.RuneError
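A small sketch of the pitfall this answer describes (the accented test strings and the helper name commonPrefixBytes are illustrative, not from the original): the byte-wise prefix of "cafés" and "cafès" is 4 bytes long, because é (0xC3 0xA9) and è (0xC3 0xA8) share their first byte, and decoding at that point yields utf8.RuneError.

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// commonPrefixBytes counts shared leading bytes (illustrative helper).
func commonPrefixBytes(a, b string) int {
	n := 0
	for n < len(a) && n < len(b) && a[n] == b[n] {
		n++
	}
	return n
}

func main() {
	foo := "cafés"
	bar := "cafès"
	n := commonPrefixBytes(foo, bar)
	fmt.Println(n) // 4: "caf" plus the first byte of the two-byte é/è

	// Decoding the tail of the byte prefix shows it ends mid-rune:
	// the lone byte 0xC3 is not valid UTF-8 on its own.
	r, _ := utf8.DecodeRuneInString(foo[3:n])
	fmt.Println(r == utf8.RuneError) // true
}
```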
It's not clear what you are asking because you limited your test cases to ASCII characters. I've added a Unicode test case and included answers for bytes, runes, or both.
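The playground link and its output did not survive; a hedged reconstruction of a bytes-and-runes answer (the function names and the Unicode test case are my own, chosen so the two counts differ):

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// commonPrefixLenBytes counts shared leading bytes.
func commonPrefixLenBytes(a, b string) int {
	n := 0
	for n < len(a) && n < len(b) && a[n] == b[n] {
		n++
	}
	return n
}

// commonPrefixLenRunes counts shared leading runes (code points),
// decoding each string with unicode/utf8 as it walks.
func commonPrefixLenRunes(a, b string) int {
	count := 0
	for len(a) > 0 && len(b) > 0 {
		ra, size := utf8.DecodeRuneInString(a)
		rb, _ := utf8.DecodeRuneInString(b)
		if ra != rb {
			break
		}
		count++
		a, b = a[size:], b[size:]
	}
	return count
}

func main() {
	// ASCII: byte and rune counts agree.
	fmt.Println(commonPrefixLenBytes("Makan", "Makon"), commonPrefixLenRunes("Makan", "Makon")) // 3 3
	// Unicode: é and è share their first byte, so the counts differ.
	fmt.Println(commonPrefixLenBytes("cafés", "cafès"), commonPrefixLenRunes("cafés", "cafès")) // 4 3
}
```

Which count the question wants depends on whether "length" means bytes or characters; for user-visible characters the rune count is usually the right answer.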