Normally I associate strings with keywords that describe them, and then store them in a database table with a structure of id int, keyword varchar(16), language char(2), string varchar. Then I write a PHP wrapper function that fetches each string in the visitor's current language.
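To make the approach concrete, here is a minimal sketch of such a wrapper. It uses an in-memory SQLite database via PDO purely for demonstration; the table layout, column names, and the function name `t()` are illustrative, not a prescribed API:

```php
<?php
// Demo storage: in-memory SQLite standing in for the real database.
$db = new PDO('sqlite::memory:');
$db->exec("CREATE TABLE strings (
    id INTEGER PRIMARY KEY,
    keyword VARCHAR(16),
    language CHAR(2),
    string TEXT
)");
$db->exec("INSERT INTO strings (keyword, language, string) VALUES
    ('greeting', 'en', 'Hello'),
    ('greeting', 'es', 'Hola')");

// Wrapper: fetch a string by keyword in the visitor's language,
// falling back to English when no translation exists.
function t(PDO $db, string $keyword, string $lang, string $fallback = 'en'): string {
    $stmt = $db->prepare(
        'SELECT string FROM strings WHERE keyword = ? AND language = ?');
    foreach ([$lang, $fallback] as $try) {
        $stmt->execute([$keyword, $try]);
        $value = $stmt->fetchColumn();
        if ($value !== false) {
            return $value;
        }
    }
    return $keyword; // last resort: show the keyword itself
}

echo t($db, 'greeting', 'es'), "\n"; // Hola
echo t($db, 'greeting', 'de'), "\n"; // Hello (falls back to English)
```

The fallback chain matters in practice: translations are rarely complete, and showing the default language (or at worst the keyword) beats showing nothing.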
I have been told (and I think I read somewhere) to use PHP's built-in methods to handle internationalization, but I haven't found any really convincing explanation of why they would be more appropriate than my database approach.
What is the most appropriate method for handling an internationalized website's strings, and why? Is there one I should always prefer for its better performance? If there is, I will switch to it.
It depends on what your objectives are. If you need your translations to be updated online by website users, then your database approach is fine. However, if translations are to be provided by professional translators, then it's better to decouple the translation from the website. Using something like gettext is a better idea in this case: you can send out .po files for translation, then integrate the results back into your site.
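The gettext workflow on the PHP side is quite short. This is a hedged sketch assuming the gettext extension is enabled and that compiled catalogs live under `./locale/<lang>/LC_MESSAGES/messages.mo` (the domain name and paths are illustrative). Note that when no matching catalog is installed, `gettext()` simply returns the original msgid:

```php
<?php
// Select the target locale; the exact locale string depends on
// what is installed on the server (e.g. 'es_ES.utf8').
$locale = 'es_ES.utf8';
putenv("LC_ALL=$locale");
setlocale(LC_ALL, $locale);

// Point the 'messages' domain at our translation directory.
bindtextdomain('messages', __DIR__ . '/locale');
textdomain('messages');

// With a Spanish .mo file installed this prints the translation;
// without one, gettext() falls back to the msgid itself.
echo gettext('Hello'), "\n";
echo _('Welcome to our site'), "\n"; // _() is an alias for gettext()
```

The strings in the source code double as the translation keys; tools like `xgettext` extract them into a .po template that translators fill in, which `msgfmt` then compiles to the .mo catalogs.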
Think about the whole process, managing translators, managing changes, etc.
Gettext normally depends on each process's locale settings, which in turn depend on the locales installed on the system. For web applications, consider using your own database implementation along with some of the caching methods already pointed out.
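Caching largely removes the per-string query cost of the database approach. Here is a minimal sketch: load all strings for a language once, then serve every lookup from an in-memory array. `loadStringsFromDb()` is a hypothetical stand-in for the real query, and the hard-coded array below exists only so the example runs on its own:

```php
<?php
// Hypothetical stand-in for the real database query:
// SELECT keyword, string FROM strings WHERE language = ?
function loadStringsFromDb(string $lang): array {
    $fake = [
        'en' => ['greeting' => 'Hello', 'farewell' => 'Goodbye'],
        'es' => ['greeting' => 'Hola',  'farewell' => 'Adios'],
    ];
    return $fake[$lang] ?? [];
}

// One query per language per request; lookups hit the array after that.
function t(string $keyword, string $lang): string {
    static $cache = [];
    if (!isset($cache[$lang])) {
        $cache[$lang] = loadStringsFromDb($lang);
    }
    return $cache[$lang][$keyword] ?? $keyword; // fall back to the keyword
}

echo t('greeting', 'es'), "\n"; // Hola
echo t('farewell', 'es'), "\n"; // Adios
```

A shared cache such as APCu (or a generated PHP file that returns the array) would extend this across requests, so the database is only touched when translations change.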
There are a number of ways to skin a cat. I think you've been given this advice because PHP uses the client's settings to detect the default language (although you're probably using a similar method). Either way, multiple strings have to be stored. If you prefer the database to hold those strings, that works. If you're not seeing any performance issues, go with it; there's no need to change what you're comfortable with.