By "theoretical computer science topics", I am referring to things such as regular vs non-regular languages, the pumping lemma, and grammars.
I'm familiar with the real-world applications of finite automata and regular expressions, but these other topics are giving me more trouble because I'm not seeing any real-world applications for them.
As Yuval F pointed out, grammars are of critical importance to language parsing. If you take a look at parser generators such as ANTLR, you'll find that the structure of their grammars is surprisingly similar to what you learned in school.
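For instance, a textbook CFG for arithmetic expressions carries over almost unchanged. Here's a rough sketch in ANTLR4 notation (rule names are mine; a real grammar file would also need a `grammar` header and a whitespace-skipping rule):

```
// Textbook: Expr -> Expr '+' Term | Term, and so on.
expr   : expr '+' term | term ;      // ANTLR4 permits direct left recursion
term   : term '*' factor | factor ;
factor : NUMBER | '(' expr ')' ;
NUMBER : [0-9]+ ;                    // lexer rule, i.e. a token class
```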
Here's Steve Yegge's take on the usefulness of compilers, summed up:
Grammars are essential to parsing recursive structures, which appear in programming languages, natural languages, and complex data formats. Some programmers may never encounter them directly, but the people who write compilers, IDEs, serializers, and the like work with them constantly.
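To make the recursion concrete, here's a minimal sketch (mine, not taken from any particular tool) of a recursive-descent parser for nested lists. Each grammar rule becomes a function, and the grammar's recursion becomes the functions' recursion:

```python
# Toy grammar for inputs like "(1,(2,3),4)":
#   list  -> '(' item (',' item)* ')'
#   item  -> NUMBER | list

def parse(text):
    pos = 0

    def peek():
        return text[pos] if pos < len(text) else None

    def expect(ch):
        nonlocal pos
        if peek() != ch:
            raise SyntaxError(f"expected {ch!r} at position {pos}")
        pos += 1

    def parse_list():
        expect('(')
        items = [parse_item()]
        while peek() == ',':
            expect(',')
            items.append(parse_item())
        expect(')')
        return items

    def parse_item():
        nonlocal pos
        if peek() == '(':              # recursion mirrors the grammar's recursion
            return parse_list()
        start = pos
        while peek() is not None and peek().isdigit():
            pos += 1
        if start == pos:
            raise SyntaxError(f"expected a number at position {pos}")
        return int(text[start:pos])

    result = parse_list()
    if pos != len(text):
        raise SyntaxError("trailing input")
    return result

print(parse("(1,(2,3),4)"))   # [1, [2, 3], 4]
```

This one-function-per-rule pattern is essentially what parser generators like ANTLR emit for you.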
These topics are useful for knowing when a regular expression is the wrong tool. For example, knowing that XML is non-regular matters if you're ever tempted to parse XML with regexes. And if you don't know off the top of your head that XML is non-regular, the pumping lemma lets you prove it in a few lines.
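The core of that proof is the classic pumping-lemma argument for L = {aⁿbⁿ}; nested XML tags follow the same pattern (n opening tags must be matched by n closing tags). A sketch:

```latex
% Sketch: $L = \{a^n b^n : n \ge 0\}$ is not regular.
Suppose $L$ is regular with pumping length $p$, and take $w = a^p b^p \in L$.
The pumping lemma splits $w = xyz$ with $|xy| \le p$ and $|y| \ge 1$, so
$y = a^k$ for some $k \ge 1$. Then $xy^2z = a^{p+k} b^p \notin L$, since it
has more $a$'s than $b$'s, contradicting the lemma. Hence $L$ is not regular.
% Regular languages are closed under intersection, so if well-formed XML
% were regular, intersecting it with the regular set (<a>)*(</a>)* would
% give essentially this language; therefore XML is not regular either.
```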
And then there are compilers, where you're doomed if you don't know this stuff. It just depends on the application.