I read this previous post. Can anyone say what the exact difference between CharSequence and String is, other than the fact that String implements CharSequence and that a String is a sequence of characters? For example:

CharSequence obj = "hello";
String str = "hello";
System.out.println("output is : " + obj + " " + str);

What happens when "hello" is assigned to obj and again to str?
Consider UTF-8. In UTF-8, Unicode code points are built from one or more bytes. A class encapsulating a UTF-8 byte array can implement the CharSequence interface but is most decidedly not a String. You can't pass a UTF-8 byte array where a String is expected, but you certainly can pass a UTF-8 wrapper class that implements CharSequence when the contract is relaxed to allow a CharSequence.

On my project, I am developing a class called CBTF8Field (Compressed Binary Transfer Format - Eight Bit) to provide data compression for XML, and am looking to use the CharSequence interface to implement conversions from CBTF8 byte arrays to/from character arrays (UTF-16) and byte arrays (UTF-8).
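A minimal sketch of the idea: a wrapper that exposes a UTF-8 byte array through the CharSequence interface. The class name Utf8Sequence is hypothetical (it is not the CBTF8Field class mentioned above), and for simplicity it decodes the bytes lazily into a UTF-16 String rather than decoding on the fly.

```java
import java.nio.charset.StandardCharsets;

// Hypothetical example: wrap raw UTF-8 bytes and present them as a CharSequence.
// Callers that accept CharSequence can use this even though it is not a String.
final class Utf8Sequence implements CharSequence {
    private final byte[] utf8;  // the raw UTF-8 bytes
    private String decoded;     // lazily decoded UTF-16 view

    Utf8Sequence(byte[] utf8) {
        this.utf8 = utf8.clone();
    }

    private String decode() {
        if (decoded == null) {
            decoded = new String(utf8, StandardCharsets.UTF_8);
        }
        return decoded;
    }

    @Override public int length()            { return decode().length(); }
    @Override public char charAt(int index)  { return decode().charAt(index); }
    @Override public CharSequence subSequence(int start, int end) {
        return decode().subSequence(start, end);
    }
    @Override public String toString()       { return decode(); }

    public static void main(String[] args) {
        // "héllo" is 6 bytes in UTF-8 but only 5 chars once decoded.
        CharSequence cs = new Utf8Sequence("héllo".getBytes(StandardCharsets.UTF_8));
        System.out.println(cs.length());  // 5
        System.out.println(cs);           // héllo
    }
}
```

Note that charAt and subSequence here index decoded chars (UTF-16 code units), not the underlying bytes; that indexing is part of the CharSequence contract.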
The reason I came here was to get a complete understanding of the subSequence contract.
From the Java API of CharSequence:
This interface is then used by String, CharBuffer and StringBuffer to keep consistency for all method names.
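That consistency is what makes CharSequence useful as a parameter type: one method can accept a String, a StringBuilder, or a CharBuffer interchangeably. A small sketch (countChar is a hypothetical helper, not a library method):

```java
import java.nio.CharBuffer;

// Because String, StringBuilder and CharBuffer all implement CharSequence,
// a single method signature works with any of them.
final class CharSequenceDemo {
    // Hypothetical helper: count occurrences of a char using only
    // the length() and charAt() methods shared via CharSequence.
    static int countChar(CharSequence cs, char target) {
        int count = 0;
        for (int i = 0; i < cs.length(); i++) {
            if (cs.charAt(i) == target) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countChar("banana", 'a'));                    // 3
        System.out.println(countChar(new StringBuilder("banana"), 'a')); // 3
        System.out.println(countChar(CharBuffer.wrap("banana"), 'a'));   // 3
    }
}
```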