Because the vast majority of computers come with Windows preinstalled, and the vast majority of users can’t be bothered to update their OS unless they’re forced, let alone reinstall something else. I’m fairly certain the numbers would be very different if there were a significant number of blank laptops on the market, let alone ones shipped with Linux.
I’m 100% certain there would be little difference, because people need an OS that can run the software they want, and, just as importantly, they need to be able to actually install and use it — and Linux has never even tried to make that process anything but a nightmare. And I’ll stop you right there with your various flavors of Mint or Ubuntu or Elementary or the dozens of other distros: users don’t care about endlessly tinkering. They want something that just works. Linux doesn’t offer that.
But if you turn to science/engineering software, most of it works out of the box on Linux/Mac but is a pain in the ass to set up on Windows. Ease of installing software isn’t an OS thing; it depends on the developer of the software in question.
That’s simply not true. The vast majority of CAD, CFD and FEA software runs on Windows (with many packages not even having Linux versions), and that has been the case for decades. The installation process on Windows is almost universally straightforward, and the times it isn’t are usually because that software has (or had) Unix roots from ages ago and the clunky nature of anything related to Unix comes through.
CAD is an exception, yes. Also a lot of stuff related to commercial aviation, because of the regulations. I was talking about comp sci, statistics, big data etc.
that software has (or had) Unix roots from ages ago and the clunky nature of anything related to Unix comes through.
It’s the other way around. It is extremely easy to make simple UNIX scripts. And it is extremely easy to string together a bunch of such simple scripts to make larger software. Windows does not follow the UNIX philosophy, making it difficult for different programmes to talk to each other.
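To make that concrete, here is a minimal sketch of the UNIX philosophy in action: each tool does exactly one job, and a pipe strings them together into a larger programme. (The log file and its contents are made up for illustration.)

```shell
# Hypothetical data: a log of event levels, one line of space-separated words.
printf 'error warn error info error\n' > /tmp/demo.log

# Split into words, count occurrences, sort by frequency, keep the top entry.
# Four tiny tools, none of which knows about the others, combine into a
# "most common event" report.
tr ' ' '\n' < /tmp/demo.log | sort | uniq -c | sort -rn | head -1
```

Each stage can be swapped independently — replace `head -1` with `head -5` and you have a top-five report, with no other change.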
Someone’s been feeding you bullshit.
I’m talking from personal experience. I write R, Python and bash scripts to take data from machines, analyse it, and draw graphs and charts from it. I use three languages and multiple machines, but thanks to the UNIX emphasis on modularity I can connect all of this into one automated pipeline. And if tomorrow one machine is replaced, I will only have to change a few lines of the script, again thanks to modularity.
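The shape of such a pipeline can be sketched in a few lines of shell. The stage names below are hypothetical stand-ins for the R, Python and bash scripts described above, and the data is invented for the example:

```shell
# Hedged sketch: each stage is a shell function standing in for one script.
extract() { cat "$1"; }                          # stand-in: pull raw readings from a machine
analyse() { awk '{sum += $1} END {print sum}'; } # stand-in: the R analysis step
report()  { sed 's/^/total: /'; }                # stand-in: the Python charting/reporting step

# Fake machine output, one numeric reading per line.
printf '1\n2\n3\n' > /tmp/readings.txt

extract /tmp/readings.txt | analyse | report
```

If a machine is replaced, only `extract` changes; `analyse` and `report` never notice, which is exactly the modularity being described.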
Don’t get me wrong, Windows has its advantages: better gaming support, ‘safety rails’ that prevent you nuking your system, better drivers for peripherals, and so on. But software installation is not one of them.