Quantum programming has one advantage over classical computing:
- classical computing is based on true OR false at the bit level; quantum computing is based on true AND false at once (superposition), with continuous weights between the two. That means it can more easily mimic the approximations inside your own neural system, where an idea isn't just true or false but "both", "more or less", "much more than less", etc.
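To make that concrete, here's a minimal sketch of a single qubit in superposition, using plain NumPy rather than any real quantum SDK; the 70/30 amplitude split is an arbitrary example value, not anything canonical:

```python
import numpy as np

# A qubit is a unit vector over the basis states |0> ("false") and |1> ("true").
# The amplitudes alpha and beta weight both states at once: "more or less true".
alpha, beta = np.sqrt(0.7), np.sqrt(0.3)   # hypothetical 70% / 30% split
state = np.array([alpha, beta])

assert np.isclose(np.linalg.norm(state), 1.0)  # amplitudes must be normalized

# Measurement collapses the superposition to a definite classical bit,
# with probabilities given by the squared amplitudes.
rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=10_000, p=np.abs(state) ** 2)
print(f"measured 1 in {samples.mean():.1%} of runs")  # ~30%
```

The continuous weights only exist before measurement; once you read the qubit, you're back to a plain true/false bit, which is why the "shades in between" live in the computation itself, not in the output.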
I doubt that mimicking a brain, which doesn't run on strict true/false but on plenty of shaded answers in between, will stay a "niche" when clients are demanding more and more "AI". It's not just a fashion LOL
In time, I'm pretty sure old compilers and languages will at least get quantum extensions (they'll have to) or end up as legacy "emulators".
Checkmate!