Question:
Lately, I've been pondering the somewhat non-mainstream architecture of building raw XML on the server side and then using an XSLT stylesheet on the client to transform the XML into the full UI. Of course, a fallback mechanism would have to exist for clients not capable of client-side XSLT, in which case we'd just run the transform for them on the server side.
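For concreteness, the client-side path boils down to serving XML with an xml-stylesheet processing instruction; the browser fetches the stylesheet and renders the transformed result. A minimal sketch (file names and element names here are illustrative, not from the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="page.xsl"?>
<page>
  <title>Hello</title>
  <body>Rendered by the browser's XSLT engine.</body>
</page>
```

And a matching page.xsl:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Turn the raw <page> document into plain HTML -->
  <xsl:template match="/page">
    <html>
      <head><title><xsl:value-of select="title"/></title></head>
      <body>
        <h1><xsl:value-of select="title"/></h1>
        <p><xsl:value-of select="body"/></p>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

The server-side fallback would run the same stylesheet through a server XSLT processor and send the resulting HTML instead of the raw XML.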
I'm already intimately familiar with XSLT, and this approach seems to offer a clean separation of presentation and content: the data is forced entirely into XML, and XSLT handles all the presentation.
I'm also aware that this does add an extra layer of complexity to the application, which is just another moving part that can fail.
My question is: are there any big-name or big-traffic sites using this approach, and if so, what limitations or lessons learned did you take away from it?
Thanks Internet,
Zach
Answer 1:
Like other people have mentioned, Blizzard has many sites that use client-side XSL. I would recommend avoiding client-side XSL. It is a really cool idea, but there are many unusual bugs that you need to work around.
In Firefox, any JavaScript that uses document.write will destroy the DOM. Also, the NoScript plug-in for Firefox stops client-side XSL. In both cases, the user will see nothing. There doesn't seem to be a way to detect this kind of error, so a fallback will not work.
In IE, if anything does a 30x redirect to a different origin (going from http to https, or crossing subdomains), you will get an error for violating the same-origin policy. You did not really violate the same-origin policy, but IE acts as if you did. For example, if you go to http://foo.example.com/login and it does a 302 redirect to https://bar.example.com/login.xml, IE will treat the XSL as if it came from bar.example.com and the XML as if it came from foo.example.com. So you will need to revert to something like a meta refresh for your redirects.
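The meta-refresh workaround would look roughly like this: instead of returning a 302 from foo.example.com, serve a tiny HTML page that sends the browser onward, so IE resolves both the XML and the XSL against the final origin. (URLs reused from the example above; this is a sketch, not any particular site's markup.)

```html
<!-- Served with a 200 from http://foo.example.com/login in place of the 302 -->
<html>
  <head>
    <meta http-equiv="refresh" content="0; url=https://bar.example.com/login.xml">
  </head>
  <body>Redirecting...</body>
</html>
```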
These are the things that I came up with off the top of my head. It is a neat idea, but be aware of these issues.
Answer 2:
I couldn't tell you in detail how it's implemented, but World of Warcraft is pretty big and high traffic, and their web site is implemented as you describe.
Answer 3:
I don't know of any big public websites that use client-side XSLT transformation (well, except World of Warcraft, mentioned by Joel :-). So I cannot answer your question directly.
However, from time to time I have pondered the same question myself, and I have a hypothesis that the number of such sites on the Internet must be very close to zero. :-)
The short version of my theory behind this hypothesis: with the exception of some pretty exotic cases, providing a client-side XSLT option is simply not worth the trouble. :-)
Answer 4:
The company I worked at back in 2001 released a server product that implemented exactly the architecture you describe. We had very good success offloading the processing onto the clients. Furthermore, by doing client detection with the HTTP User-Agent header, we were able to use server-side XSL processing to cater to very specific clients like Japanese cell phones. I think the sites/services/products that use this technique do it quite transparently to the clients. However, I think the trend lately is to do the processing server-side, so that you neither have to rely on nor test against the particular XSL implementations of a variety of clients, and so that you get support for XSL extensions that you could not use when supporting the vast majority of browsers.
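The User-Agent detection described here boils down to a simple server-side decision. A minimal sketch, assuming a hypothetical helper name and a deliberately tiny capability list (a real product would maintain a much more detailed device database):

```python
# Hypothetical sketch: choose server- or client-side XSL per User-Agent.
# The token list below is illustrative, not an exhaustive capability table.
CLIENT_XSLT_CAPABLE = ("MSIE 6", "MSIE 7", "Firefox", "Safari")

def transform_on_server(user_agent: str) -> bool:
    """Return True when the XML should be transformed server-side."""
    return not any(token in user_agent for token in CLIENT_XSLT_CAPABLE)

# A Japanese cell phone gets the server-side rendering path:
print(transform_on_server("DoCoMo/2.0 P905i(c100;TB;W24H15)"))   # True
# A desktop browser gets raw XML plus the stylesheet:
print(transform_on_server("Mozilla/5.0 (Windows; U) Firefox/3.0"))  # False
```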
I know I'm not directly answering your question by naming big-name sites, but I hope I'm offering something of value. Basically, my point is that unless the performance savings of doing your template processing on the client are worth more than the cost of QA, support, and development across three or four browsers without extensions, you should stick with server-side processing.
Answer 5:
I am currently running a few minor pages with client-side XSLT, all in Swedish though (lillemanfestivalen.se, resihop.nu, and beta projects). My biggest concern was that Google did not index my pages' content, just the XML without the transformation. However, since I launched resihop.nu a week ago, it shows up on Google with the transformation! :D
Now my other concern is Facebook and other social sites, which do not understand how to handle it. I still think the upsides are greater than the downsides, however. The fabulous speed and separation I get is awesome. And with resihop.nu, I don't even develop a separate API; I just point developers to the site itself. (More work will be needed there to make it good, though.)
Answer 6:
I agree with Elijah's answer. Using client-side XSLT is a difficult job: you have to do a lot of QA for it, whereas with server-side processing you don't need that QA. With the client-side approach you have to account for every type of client and every possibility, and there is always a chance your code will break in some client's XSLT implementation.
Answer 7:
I may be biased when I say this, but working on a web-based app that does this, I hate it. The only reason it is even viable is that the clients are only IE6+. Even then, there are issues with it. I find XSLT very difficult, and if you are going to do this I would suggest getting a good tool for debugging and editing XSLT. Why not use JSON and jQuery? Much more standard, and less client-side variability.