It seems that the programming-language skills most sought after for embedded devices and robots are C, C++, and LISP. Why haven't more recent languages made inroads into these applications?
For example, Erlang would seem particularly well-suited to robotic applications, since it makes concurrent programming easier and allows hot swapping of code. Python would seem to be useful, if for no other reason than its support of multiple programming paradigms. I'm even surprised that Java hasn't made a foray into general robotic programming.
I'm sure one argument would be, "Some newer languages are interpreted, not compiled" - implying that compiled languages are quicker and use fewer computational resources. Is this still the case, at a time when we can put a Java Virtual Machine on a cell phone or a Sun SPOT? (And isn't LISP interpreted anyway?)
My guess is that C/C++ are used because they are closer to the hardware and allow for resource-aware programming. This is generally true for all embedded projects, not just robotics.
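For what it's worth, here is a minimal sketch (in C) of what "closer to the hardware" looks like in practice: toggling bits in a memory-mapped control register. The register address and bit names are invented for illustration - real values would come from the MCU's datasheet.

    /* Hypothetical motor-control register; address and bit layout are
       made up for illustration, not taken from any real part. */
    #include <stdint.h>

    #define MOTOR_CTRL_REG ((volatile uint8_t *)0x4000A000u)
    #define MOTOR_ENABLE   (1u << 0)
    #define MOTOR_DIR_FWD  (1u << 1)

    void motor_start_forward(void)
    {
        /* Read-modify-write a single memory-mapped register: there is
           no runtime between you and the hardware, just a load, an OR,
           and a store. */
        *MOTOR_CTRL_REG |= (MOTOR_ENABLE | MOTOR_DIR_FWD);
    }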
As for LISP, my guess is that it is often chosen because it still seems to be the predominant language in artificial intelligence research, and it is probably used for the higher-level workings of the robot.
I once came across this interesting piece on using Lisp at NASA: http://www.flownet.com/gat/jpl-lisp.html
I just read some introductory Erlang material, and one of the first things it said was that Erlang is suitable for "soft" real-time control. That is not something I would want in any robot near me.
In addition, I would say that robots (as in industrial robots) currently have no real need for hot-swapped code. They work piece by piece, and there will always be scheduled downtime at which to reload code - code that has, of course, been well tested on an offline unit.
Having worked in robotics, my answer is efficiency. Yes, you can run a Java Virtual Machine on a cell phone. But how efficient will it be? I was on a team that wanted to run a Java Virtual Machine on a full Windows XP machine on a robot, alongside multiple real-time monitoring applications in Matlab. Needless to say, we were dropping frames like it was no one's business. The moral of the story: push back on people who don't understand computing - even your supervisors - when their decisions are going to sink your operation.
Yes, you could run Python, and I have seen it used to manage multiple C processes. But at the end of the day, C lets you manipulate connections directly far more easily and reliably than higher-level languages, and is therefore preferred.
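To make that concrete, here is a rough sketch of the kind of direct connection handling C makes straightforward: opening a serial link to a motor controller with the POSIX termios API and pushing raw bytes down it. The device path and command bytes are made up for the example.

    /* Open a serial device in raw mode at 115200 baud (POSIX termios). */
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    int open_serial(const char *path)
    {
        int fd = open(path, O_RDWR | O_NOCTTY);
        if (fd < 0)
            return -1;

        struct termios tio;
        tcgetattr(fd, &tio);
        cfmakeraw(&tio);              /* raw bytes, no line discipline */
        cfsetispeed(&tio, B115200);
        cfsetospeed(&tio, B115200);
        tcsetattr(fd, TCSANOW, &tio);
        return fd;
    }

    /* Usage (hypothetical device path and command bytes):
       int fd = open_serial("/dev/ttyUSB0");
       unsigned char cmd[] = { 0x02, 0x10, 0xFF };
       write(fd, cmd, sizeof cmd);
    */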
Java reached another milestone this year when it became a programming option for the FIRST Robotics Competition. FRC is an impressive competition involving over 77,000 high-school students, mentors, and volunteers from around the world building 120-pound robots in six weeks. I just posted some results about this on my blog.
By strange coincidence (or not), it uses the same Java VM as the Sun SPOTs mentioned in the original question.
The main reason for the prevalence of C and C++ is that their runtimes are deterministic, since neither requires garbage collection. That makes them a good choice when you have to provide runtime guarantees. Not to mention that C has been considered the "higher-level assembly language" of choice for many years.
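As a rough illustration of how that determinism is usually achieved in C (the names here are mine, not from any particular RTOS), dynamic allocation is replaced with a fixed-size pool whose worst-case time and memory are known at compile time:

    /* Fixed pool of message buffers: no malloc, no garbage collector,
       bounded allocation time. A real version would also guard against
       concurrent access (e.g. by disabling interrupts or taking a lock). */
    #include <stddef.h>
    #include <stdint.h>

    #define POOL_SIZE 16

    typedef struct {
        uint8_t payload[64];
        int     in_use;
    } msg_t;

    static msg_t pool[POOL_SIZE];      /* all memory reserved at link time */

    msg_t *msg_alloc(void)
    {
        for (size_t i = 0; i < POOL_SIZE; ++i) {
            if (!pool[i].in_use) {
                pool[i].in_use = 1;
                return &pool[i];
            }
        }
        return NULL;                   /* pool exhausted: predictable failure */
    }

    void msg_free(msg_t *m)
    {
        m->in_use = 0;
    }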
Another interesting observation is that most embedded devices do not need or even have access to a complex GUI layer -- cellphones are an obvious exception. Most of the embedded work that I have done professionally has been in the cable set-top box arena so I may have a slanted view of things. And "No", I don't consider the set-top box to be a hard embedded environment. We grew up from having nothing more than a raw memory map of what is "on screen" and very little in the way of resources. To make a long story short, on-screen graphics are an exercise in bit-twiddling with fixed bit widths - this is another place where pointers in C really shine.
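For a flavour of that bit-twiddling (the frame-buffer address, width, and pixel format below are hypothetical - a 16-bit RGB565 layout is just a common example):

    /* Plot a pixel into a raw memory-mapped frame buffer, RGB565. */
    #include <stdint.h>

    #define FB_BASE   ((volatile uint16_t *)0x20020000u)   /* hypothetical */
    #define FB_WIDTH  320

    static inline uint16_t rgb565(uint8_t r, uint8_t g, uint8_t b)
    {
        /* Pack 8-bit channels into a 5-6-5 bit layout. */
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    void put_pixel(int x, int y, uint16_t colour)
    {
        FB_BASE[y * FB_WIDTH + x] = colour;   /* plain pointer arithmetic */
    }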
I'm really not too surprised that Java hasn't made headway into the more "bare bones" market yet. The runtime is just too heavy, even though Java ME was supposed to solve this. Java is pretty prevalent in cell phones (via Java ME) and is slowly making its way into the set-top and TV markets (e.g., tru2way and GEM), but it isn't there yet, and I'm really not sure that it ever will be.
As others have mentioned, FORTH is an "interpreted" language that has been used in a number of embedded environments as well as in quite a few bootloaders, so interpreted languages can definitely be used in real-time environments (though not all implementations of FORTH are interpreted). LISP has been embedded as well.
I think that the main criteria for an embeddable language are:
- a small runtime footprint;
- deterministic behaviour, with no unbounded pauses such as garbage collection; and
- an execution model - both threading and memory - that maps cleanly onto the RTOS and does not depend on virtual memory.
The last point is the most interesting in my opinion - it is also why I believe many languages will have trouble in the embedded market. Pure functional languages are a natural fit for concurrency and usually work in a flat memory model. General-purpose languages work well because they don't usually prescribe any particular threading model, which gives a lot of flexibility to the RTOS runtime implementers. Virtual-memory environments are damned near impossible to implement so that they are deterministic and fast, which makes it very difficult for languages that require virtual memory support to really function correctly.