First of all, Linux is lightweight while Windows is comparatively heavy: in Windows, many programs run in the background and eat up RAM. Secondly, the Linux file system is very well organized.
Linux usually consumes less RAM and CPU, so it can boot and run faster than Windows.
Source: Wondershare Recoverit
Windows and Linux have been around for roughly the same amount of time (Windows 1.0 shipped in 1985; the Linux kernel appeared in 1991), but Linux itself was primarily a reimplementation of the much older AT&T Unix operating system, first developed by Ken Thompson and Dennis Ritchie at Bell Labs in the late 1960s. As such, much of its architecture had already been proven out by the time Linus Torvalds wrote the first Linux kernel, inspired by Minix, yet another Unix-like system.
This architecture was built primarily upon the concept of pipes and layered security. A pipe moves information from one form or representation to another through a series of transformations. This process is an example of declarative programming. Perhaps the most intuitive declarative environment that people experience on a day-to-day basis is a spreadsheet: if you change a number in a row of numbers, the sum of those numbers automatically changes, and so does any value dependent upon that sum.
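The pipe idea above can be sketched as a chain of transformations, where each stage's output becomes the next stage's input. This is a minimal illustration in Python; the `pipe` helper and the stage names are hypothetical, not part of any particular library.

```python
from functools import reduce

def pipe(*stages):
    """Compose transformations Unix-pipe style: the output of
    each stage becomes the input of the next."""
    return lambda data: reduce(lambda acc, stage: stage(acc), stages, data)

# Each stage transforms the data from one representation to another.
to_words  = lambda text: text.split()
lowercase = lambda words: [w.lower() for w in words]
unique    = lambda words: sorted(set(words))

vocabulary = pipe(to_words, lowercase, unique)

print(vocabulary("The quick brown fox jumps over The lazy dog"))
# → ['brown', 'dog', 'fox', 'jumps', 'lazy', 'over', 'quick', 'the']
```

This mirrors a shell pipeline like `tr | sort | uniq`: data flows one way through a series of small, independent transformations.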
Such programs behave as deterministic state machines: for every specific set of inputs, you always get the same states throughout the environment. One key benefit of this approach is that there are no side effects - hidden data can't affect the answers that are returned. This makes for extraordinarily stable applications.
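The spreadsheet analogy can be made concrete with pure functions, where every derived value depends only on its inputs. This is a sketch, assuming a hypothetical one-row "spreadsheet" with `total` and `tax` as illustrative derived cells:

```python
# Derived values are pure functions of the inputs, so the same
# inputs always yield the same outputs -- no hidden state, no
# side effects.
def total(values):
    return sum(values)

def tax(values, rate=0.1):
    # Depends only on its inputs (via total), never on hidden state.
    return total(values) * rate

row = [10, 20, 30]
print(total(row), tax(row))   # 60 6.0
row[0] = 40                   # change one number in the row...
print(total(row), tax(row))   # 90 9.0  ...and every dependent recomputes
```

Calling `total` or `tax` twice with the same row always produces the same answer, which is exactly the determinism the paragraph describes.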
When Windows was first introduced, a new language called C++ was gaining in popularity. C++ added classes to the C language and popularized a programming paradigm called object-oriented programming (OOP).
OOP was a powerful new way of organizing information, encapsulating it in what were called class instances. Yet one consequence was that a class could change state behind the scenes, so that data coming from such classes was no longer consistent. There were side effects.
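The hidden-state problem described above can be shown in a few lines. This is a deliberately contrived sketch; the `Counter` class is hypothetical, illustrating how encapsulated mutable state makes the same call return different answers:

```python
class Counter:
    """A class whose results depend on hidden internal state."""

    def __init__(self):
        self._calls = 0  # hidden state, invisible to the caller

    def describe(self, item):
        self._calls += 1  # side effect: mutates state behind the scenes
        return f"{item} (request #{self._calls})"

c = Counter()
print(c.describe("widget"))  # widget (request #1)
print(c.describe("widget"))  # widget (request #2) -- same input, different output
```

Contrast this with the pure-function style earlier: here, identical inputs no longer guarantee identical outputs, which is precisely what "side effects" means.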
There are multiple distributions (known as distros) of Linux, each of which features a kernel that defines core functionality, plus a set of intermediate packages, package managers, and libraries that are implemented differently from one distro to the next. Yet all share the same fundamental security model (one that is quite well tested at this point) and a largely declarative core.
In practice, this means that when an application fails, it fails in its own box, without bringing the whole operating system down. Linux-based systems are therefore very stable at their core, even if applications written on top of them aren't.
In the early 2000s, Microsoft introduced a new development platform called .NET, and with it a new set of programming languages such as C# (which took many of the garbage-collection and sandboxing features that had started appearing in Java and made them more central).
It also introduced another language called F# that was much more declarative (its roots are in the ML family of languages, by way of OCaml), which developers could use to minimize side effects significantly. One upshot of these innovations is that Windows today is far more stable than it once was, though still arguably not quite as stable as Linux.
One of the most intriguing innovations of the last few years has been the introduction of containers (popularized by Docker), which act something like stripped-down virtual machines, providing just enough of an environment to keep an application separated from the core operating system. These have the advantage of being more stable, because they are not dependent upon the stability of the underlying system, nor can they influence it. They are also more secure, because containerized applications exist within a separate security context that doesn't touch system security at all.
As such, the question of which is the better system is increasingly moot. If you write web applications on Windows, it's highly likely that that Windows environment is itself running on a Linux-based machine.
Kurt is the founder and CEO of Semantical, LLC, a consulting company focusing on enterprise data hubs, metadata management, semantics, and NoSQL systems. He has developed large-scale information and data governance strategies for Fortune 500 companies in the health care/insurance, media and entertainment, publishing, financial services, and logistics sectors, as well as for government agencies in the defense and insurance arenas (including the Affordable Care Act). Kurt holds a Bachelor of Science in Physics from the University of Illinois at Urbana–Champaign.