Of course no OS is perfect. No technology ever will be, because if it reached that point we could stop building new things altogether. That is why new software and hardware keep coming out: they improve over time.
However, on paper Windows is the worst-designed OS in widespread use today. Most of the Internet runs on Unix-based servers, including the DNS root servers worldwide. There is comparatively little malware in the wild for Unix-based OSes (this includes Linux and OS X), because by design Unix takes a far stronger approach to security. It is also superior when it comes to overall design.
You don't have the "bloat" with Unix-based OSes that you do with Windows. This is because you can compile the OS from the kernel up to do exactly what you want it to do; it is very modular. Whereas Microsoft Windows takes an all-or-nothing monolithic approach: you get all of it, including all the legacy code, the bloated registry, and a kernel that allows hooks so APIs and things like drivers can get direct access without authentication, which is a huge security risk.
I could go on and on about how Unix (and all Unix-based OSes) lets developers split an application into multiple processes communicating over sockets, running only one of them as root so the developer controls exactly where escalated access goes, unlike on Windows. Most people don't care about these differences, though, and they will always come back with something like "I can run my DX 11 video card in Windows." So that is why I won't delve into this too much.
Sadly, OpenGL is also more powerful and robust than DirectX, but developers use DX because it is easier. If all games were coded in OpenGL to begin with, you could easily port them to any OS you wanted.