I've been taking another look at what I did historically, and have got it partly running: Javascript detection always fails because of some sort of cookie problem (probably an address lookup or similar), while the non-Javascript stuff is OK.
Basically, what I've got is Smalltalk-like expressions where classes are Usenet-style discussion groups and methods are stored in the first message of each group (or inherited); it's been in use for 20+ years for workflow handling. The class browser starts off by reporting what the server thinks the client is telling it, using CGI variables (adequately documented elsewhere, e.g. for Apache). These include, in particular, the HTTP_USER_AGENT string, the client IP address and the client port: those last two in combination worry me.
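For anyone who hasn't looked at this recently, here's a minimal sketch in Python (purely illustrative, not the code I'm describing) of a CGI program echoing the variables I mean. REMOTE_ADDR and REMOTE_PORT are the standard CGI names for the client IP address and port, as populated by e.g. Apache:

#!/usr/bin/env python3
# Illustrative CGI sketch: report what the server thinks the client is
# telling it, using the standard CGI environment variables.

import os

def main():
    # A CGI response is headers, a blank line, then the body.
    print("Content-Type: text/plain")
    print()

    for name in ("HTTP_USER_AGENT", "REMOTE_ADDR", "REMOTE_PORT"):
        print("%s = %s" % (name, os.environ.get(name, "(not set)")))

if __name__ == "__main__":
    main()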
Much more information is available if Javascript is enabled, and I'd also got tests in there for client-side Java and for unhandled <server> tags (which were very rarely implemented; I think they might have been specific to Netscape Enterprise Server, which is now owned by Oracle).
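As an aside, one common way of detecting whether Javascript is actually enabled (I'm not claiming it's exactly what my code does) is to have an inline script set a cookie and then look for it server-side on the next request; a sketch in the same vein, with the cookie name js_ok made up for the example:

#!/usr/bin/env python3
# Illustrative sketch of cookie-based Javascript detection: the inline
# script sets a cookie, and a subsequent request reveals whether it ran.

import os

def main():
    cookies = os.environ.get("HTTP_COOKIE", "")
    js_enabled = "js_ok=1" in cookies

    print("Content-Type: text/html")
    print()
    print("<html><body>")
    print("<p>Javascript appears to be %s.</p>" %
          ("enabled" if js_enabled else "disabled or blocked"))
    # On the first visit the cookie can't be present yet; the script below
    # sets it so that a reload (or the next page) can report it.
    print("<script>document.cookie = 'js_ok=1; path=/';</script>")
    print("<noscript><p>No Javascript: CGI variables only.</p></noscript>")
    print("</body></html>")

if __name__ == "__main__":
    main()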
The bottom line is that client-side scripting is a serious problem, particularly if it has the capability to open server-like ports to which external systems can "push" data (which might, it goes without saying, be malicious) and the server-facing router fails to block those ports lest it be accused of hobbling the user experience.
Finally, I'd remark that none of this is really new: I was aware of terminals in the early 1980s which ran downloaded programs coded in something similar to UCSD Pascal, and network security was 100% dependent on the fact that it was unlikely that somebody would tap into the bank's leased communication lines.
MarkMLl