Question:
What are the main differences between CGI and Java servlets?
Answer 1:
Servlets run inside a single long-lived process (an HTTP server with additional features, called a servlet container) and live as long as that process does.
With CGI, every time a client request arrives, the HTTP server spawns a new process to serve it. This is a performance killer. Additionally, because there is a fresh process for each request, CGI cannot aggregate data from several requests in memory, as servlets can; it must resort to external persistent storage (a file or a database). That is a performance killer as well.
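The in-memory aggregation point can be illustrated with a sketch. Because one servlet instance serves many requests, state kept in a field survives between them. The class below is a hypothetical, simplified stand-in (it deliberately omits the real `javax.servlet` API so it is self-contained), but the lifecycle it models is the servlet one:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for a servlet: one long-lived instance handles
// many requests, so a plain field can accumulate data across them.
// Under CGI, each request would get a brand-new process with empty
// memory, and this counter would always read 1.
class HitCounter {
    // Shared across all requests handled by this instance; atomic
    // because a container calls the handler from multiple threads.
    private final AtomicInteger hits = new AtomicInteger();

    // Analogous to a servlet's doGet(): invoked once per request.
    String handleRequest() {
        return "Hits so far: " + hits.incrementAndGet();
    }
}
```

In a real servlet the same pattern applies to fields of the `HttpServlet` subclass, which is exactly why they must be thread-safe: the container reuses one instance rather than forking a process per request.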
Answer 2:
The biggest difference is that CGI died a decade or more ago.
Servlets are a standard; Java CGI never really was.
Answer 3:
Java servlets run in some kind of container (Tomcat, JBoss, GlassFish, Jetty, etc.), which needs to be running in order to serve requests.
CGI normally spawns a new process for each request, which (considering that starting a JVM is fairly expensive) is not a good fit for Java.
Answer 4:
At a minimum, using Java servlets in a servlet container should provide better performance. Using any kind of CGI with Java most likely means spawning a new Java process for each request, which is far from ideal. For server-side Java on the web, servlets are really the best approach.