There are countless books about the algorithms used in compiler writing - scanning, parsing, code generation, and the massaging of data structures - as well as extensive theory on grammars, but I have yet to see a book that is an in-depth scientific/mathematical study of the effect of the grammar on the expressive power of the language and on its ability to minimize or eliminate terse, error-prone grammatical constructs (e.g. =/==, |/||, </<< and countless others in C/C++).
If I could throw in one additional point: there are, in my opinion, far too many academics who are prepared to suggest some outrageous construct and then devote years to finding a way to compile it efficiently (or, as a related issue, to having the compiler issue sensible diagnostics when things go wrong).
There's a very good reason why Her Majesty's Government doesn't recruit diplomats with impenetrable Geordie, Scouse or Devonshire accents: if you want to convey your message succinctly and unambiguously, a good start is to use intelligible enunciation and idiom.
My conviction is that if a structure in a programming language is difficult to compile reliably, it would probably be better omitted: it's all very well to say "we know how to fix it by enhancing the compiler", but every enhancement brings with it the risk of yet more bugs and problems.
So, let's keep our base languages simple. Let's identify what /has/ to be in the language, and then find an extension mechanism for the extra features... function and class libraries have proven to be remarkably capable and have extended the life of the basic ALGOL-style languages enormously, but we should always be on the lookout for additional tools /provided/ that they can be implemented robustly.
MarkMLl