It'll be interesting to see what happens next and whether Microsoft will put effort into closing this loophole. If so I think it reinforces the idea that this hardware check is an attempt to force upgrades to Windows 10, not just a "we're choosing not to support this platform" decision.
they'll probably move the logic into the kernel and use PatchGuard to defend it
Thinking more about it - it's pretty funny that such a patch would:
* do nothing on machines running < Kaby Lake
* not install on machines running >= Kaby Lake with no modifications
* only install and function on machines running >= Kaby Lake that have been hacked by the user to download future Microsoft patches
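The resulting applicability matrix is simple enough to state as code; a toy sketch of the three cases above (obviously not Microsoft's actual logic, and the function name is made up):

```python
def patch_outcome(cpu_is_kaby_lake_or_newer: bool, update_check_bypassed: bool) -> str:
    """Toy model of the ironic applicability of a hypothetical
    'close the loophole' patch, per the three cases above."""
    if not cpu_is_kaby_lake_or_newer:
        return "installs but does nothing"       # older CPUs were never blocked
    if not update_check_bypassed:
        return "refuses to install"              # Windows Update already blocks this machine
    return "installs and re-enables the block"   # only hacked machines can fetch it

print(patch_outcome(False, False))  # installs but does nothing
print(patch_outcome(True, False))   # refuses to install
print(patch_outcome(True, True))    # installs and re-enables the block
```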
Actual technical content on github, which is linked from the article, but perhaps should be the main link?
The article does at least do some digging vis-a-vis the history of Zeffy's work and how it came to reach its current state, so some credit is due to Computerworld.
It's both, actually. MS is currently supporting approximately 7 OS flavors of Windows as best I can tell, 5 of which are Windows 10 alone (7 SP1, 8.1, 10 RS1, 10 RS2, 10 CBB, 10 LTSB, 10 RS3). That's a lot of OSes to deal with while ensuring that 'it just works'. If anything, I would say this is more a commentary on AMD's and Intel's ability to make compatible processors than on MS' ability to support them. Unfortunately new processors have errata and can be buggy pieces of garbage, as Ryzen just showed with a bug in its FMA3 implementation that would cause a GP fault. MS has to test and handle these issues for every single supported CPU. To do this they have massive labs of machines with common CPU/mobo combos that they deploy and run tests on with every single patch and build. Not surprisingly this isn't cheap, and if you have to support effectively 7 different OSes you're going to make some cost decisions. I suspect this really was a decision to focus lab hardware and development time on the latest OS versions, which can more easily share the errata patches, rather than anything nefarious.
There's a lot more versions. There's Server, IoT, Embedded, RT.
Following your argument, they'd have to be tested as well. A few more versions of Windows 7 and 8 don't seem like a huge budget breaker; it just takes more time. I don't think they'd use a dedicated machine for each and every version; you can use the same machines and just change the drives, or use NetBoot or whatever.
The real issue here is that they want users to upgrade to Windows 10.
> There's a lot more versions. There's Server, IoT, Embedded, RT.
> A few versions more of Windows 7 and 8 doesn't seem like the huge budget breaker,
There's a lot of versions of Windows 7 and Windows 8 as well. Windows 7 had at least 6 SKUs.
> I don't think they'd use a dedicated machine for each and every version, you can use the same machines and just change the drives or use NetBoot or whatever.
You're completely underestimating the scale necessary to carry out proper testing and validation.
>You're completely underestimating the scale necessary to carry out proper testing and validation.
I think you're completely overestimating the scale. In any case, neither of us really know what's going on.
One thing we know is that Microsoft has the means to test these things if they wanted to.
Windows 10 has been out for a while, other processor generations came out; they didn't do this trick until now.
This only came up because Microsoft is seeing that their product is extremely unpopular with their customers.
From a profit perspective, they're doing the logical and obvious thing: forcing users to upgrade because they need the license fees.
If I was running Ubuntu on the release date of Windows 7, it would have run out of support in 2013. Windows 7 still gets updates; you just can't install it on a new machine.
No version of Linux released in 2009 is still (a) supported, or (b) supports Kaby Lake.
The thing with Linux is that you can set up a new installation with a new release that works more-or-less like an arbitrary older release. You can't really do that with Windows, so the situation isn't very comparable.
I wouldn't mind Windows 10 if I could switch off every new crappy feature they've pushed into it (e.g. the phone-home stuff and ads), and trust Microsoft to respect my decisions.
That differs greatly from my experience -- in the time I've been using Ubuntu I've lost GNOME 2, been forced onto Unity, and had to learn systemd.
The phone-home stuff is indeed a separate, and very serious, problem.
Unity was an ill advised choice and it looks like they're finally dumping it. Anyway keep in mind that Ubuntu has a company behind it pulling strings, so it's pretty normal for them to reduce costs by not supporting older architectures. For older/different architectures the choice should be Debian or other community backed distros.
Just to get an idea.
> That differs greatly in my experience -- in the time I've been using ubuntu I've lost gnome 2 and been forced onto Unity, and had to learn systemd.
That's only true if you're wedded to a distribution, though. There's always the option to switch to one that doesn't use systemd, and recreate your favored experience there.
Also, was it possible to install Gnome on Ubuntu if you didn't want to use Unity? I never used that distribution myself.
Windows 7 was end-of-life'd in January of 2015. I don't think this is some kind of evil business decision Microsoft made. Switching to Linux isn't a magical way to ensure your OS version will be supported for the rest of eternity. Linux is great for many things, but give the "Microsoft/Windows is evil" thing a rest.
They're also not supporting Windows 8 which is not EOL yet.
The evil business decision was to make Windows 10 a steaming pile of malware. Everything else follows from that. If running Windows 10 weren't such a big security risk, end-of-lifing Windows 7 and 8 wouldn't be a problem.
OSX is sending just as much telemetry back to Apple, but it's less hipster to call their OS a steaming pile of malware...
Windows 10 "Basic" level telemetry:
> With your explicit consent, we may collect data about how you use your device and applications in order to help app developers improve their apps.
That is all Apple's policy has to say about Telemetry collection. It doesn't say when or how it gains consent.
With Windows you can't install without consent. I'm not going to re-image my Mac to check but I wouldn't at all be surprised if consent was part of that first boot EULA that you agree to.
Ultimately, you find Microsoft's transparency on the matter unsettling, but Apple's hand-waving "trust us to do the right thing" is OK?
Also, I've never heard of any OS vendor, Microsoft or otherwise, abusing telemetry data. They're incentivized not to abuse this data.
I think MacOS gives you an off switch for the telemetry. I know iOS does. Not only does iOS give you an easy way to turn it off, it will also show you the raw data being sent to Apple.
Microsoft does neither. You can't turn it off easily if you're not on Enterprise / Education, and the data being sent is encrypted.
the enterprise edition has telemetry too, which can't be completely disabled
You're right. Windows for Workgroups 3.11 isn't getting patches for Kaby Lake either. Those evil bastards.
The difference is that according to this:
Windows 7 still has twice as large an install base as Windows 10. It's literally nothing other than MS stopping support for 7 because 10 is out - imagine the outrage if Tesla stopped supporting an older Model S because a new one was out.
The main difference is that here Microsoft arbitrarily put a check for the cpuid in while Kaby Lake is probably as compatible with Windows 7 as any previous 64bit Intel CPU. Windows 3.11 doesn't have this, if you have any i386 compatible CPU it will run just fine and will happily apply all available patches.
> The main difference is that here Microsoft arbitrarily put a check for the cpuid in while Kaby Lake is probably as compatible with Windows 7 as any previous 64bit Intel CPU. Windows 3.11 doesn't have this, if you have any i386 compatible CPU it will run just fine and will happily apply all available patches.
IIRC, the compatibility problems with the new processors are processor drivers and power management.
However, the dumb thing is that Microsoft already made the engineering effort to support those things, since they used to support Windows 7 on the now-unsupported architectures. Now they're taking it away to push Windows 10 some more.
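For the curious: the block works off the CPUID family/model signature. Here is a sketch of the standard decoding of the CPUID leaf-1 EAX value, per Intel's documented encoding; 0x906E9 is a published desktop Kaby Lake signature, though exactly which signatures Microsoft's check matches is an assumption I haven't verified:

```python
def decode_cpuid_signature(eax: int) -> tuple[int, int, int]:
    """Decode the CPUID leaf-1 EAX signature into
    (display family, display model, stepping) per Intel's encoding."""
    stepping   = eax & 0xF
    model      = (eax >> 4) & 0xF
    family     = (eax >> 8) & 0xF
    ext_model  = (eax >> 16) & 0xF
    ext_family = (eax >> 20) & 0xFF
    # The extended fields only contribute for family 0x6/0xF parts.
    display_family = family + ext_family if family == 0xF else family
    if family in (0x6, 0xF):
        display_model = (ext_model << 4) | model
    else:
        display_model = model
    return display_family, display_model, stepping

# 0x000906E9 is a signature reported by desktop Kaby Lake parts
print(decode_cpuid_signature(0x000906E9))  # (6, 158, 9)
```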
It checked the DOS version and errored out on non-MS implementations of DOS.
Let's put something into perspective. Linux is 25 years old. If Linux hasn't supplanted Windows on the desktop by now, then there is no indication it ever will.
And don't get me wrong, I think Linux is a fine OS and use it daily (lately more often than Windows). But "replace Windows" is not a static target, and the organizations in charge of promoting Linux as a desktop OS have nothing now or in the works that can compete with the corporate Juggernaut of Microsoft.
Selling software is hard, especially on Linux. We tell people FOSS means free as in freedom, but the vast majority of even people who parrot that line thinks it means free as in beer. As a software vendor, you'd be killing yourself to ignore 90% of the market.
The desktop/laptop market is becoming less general consumer and more professional, yet MS and Apple are targeting the consumer experience. MS keeps trying to get in on the mobile market and failing hard, leaving their professional desktop/laptop users frustrated in a wasteland of poorly planned mobile features. Apple is doing it with their desktop OS because they've stopped caring about it, they've won the mobile market. Linux is the only option left for OS. Unfortunately, too many professional tools only run on Windows and/or Mac right now, doubly unfortunate is that a lot of them are made by MS or Apple and they're unlikely to port to Linux any time soon. I think a good strategy going forward for desktop software developers would be, not to ignore Windows and Mac, but to include Linux as a target from the beginning, maybe even make it first priority. Then maybe down the road, if a good chunk of the lock-in tools get replaced, a lot less people will be stuck. When less people are stuck, quality goes up all around due to competition. Selling on Linux is definitely hard, but companies will pay if the licensing requires it. I hate proprietary software as much as the next FOSS zealot but just getting people on to Linux is a win, if that means a few closed source/strict license/vague EULA tools become prevalent on Linux, so be it.
Microsoft hasn't focused on mobile since the 8.x timeframe. The Creators Update of Windows 10 has tons of developer-focused improvements, for example in the Linux subsystem.
I was gung-ho on linux on the desktop in the late 90's, but there's one fatal flaw in the bazaar volunteer-driven model that prevents it from catching on: it is more fun to write anew than to maintain and improve. The desktop linux ecosystem is stuck in an endless loop of rebuilding what was, instead of advancing.
By "mobile" I guess I meant the kind of stuff companies get away with on mobile; walled garden, limited control, aggressive updates, etc. If they would just give control back to the power users, there would be no more discussion about it. They could do it easily but they show no signs of going back.
If you're referring specifically to the various DEs on Linux desktop, I agree, there are a ridiculous amount of options, none of them doing "modern" super well. Personally, I'm on Xfce and that is because it doesn't do "modern" on purpose. DEs were feature complete in 1995, everything else has been useless fluff.
yes let's put it into perspective.
Yes, the Linux kernel was developed ~25 years ago. Linux is not an operating system, it is a kernel. GNU/Linux distributions have several desktops, depending on what part of the world you are from, etc...
Linux hasn't hit the desktop market because you can not just go to a store and buy one.
The problem is availability: the average consumer does not want to buy something and then have to alter or modify the product as soon as they get home.
To most people computers are entertainment devices. They just want it to work. They just want to buy one.
The reason that Microsoft, Apple, and even Google (Chromebooks, Android) have the market share they do is that they realized the formula of being a hardware company first, software second.
Yes, Microsoft has a cartel on hardware, as they got all the vendor companies to make hardware for them. It's like McDonald's being a real estate company and not a burger-flipping company. They take their money off the top from the hardware vendors. They have created an ecosystem that is an endless cycle of upgrades. The cost of ownership of a Microsoft ecosystem is quite expensive over the long term. They have developed a system of hooking users with the home/personal lines, which has forced businesses to purchase their products.
I should also point out that, Android (Linux kernel) has the largest market share at the time of writing this. Personal computers are changing to things that fit into pockets.
The average consumer is not going to want a desktop or a laptop anymore, as their use case is just entertainment devices.
If GNU/Linux wants to compete for the smaller professional/gamer/power-user desktop/laptop markets, there needs to be a hardware company to champion it (like Dell) that will put it front and center at the Wal-Marts/Best Buys/Fry's of the world, so that we can just go buy one.
Remember, the superior product is a matter of perspective. Which is the superior product: the one that is near-perfect, barely breaks, and is fixed quickly if it does, or the one that has flaws, multiple ways to break, crash, etc., and only lasts a few years?
It all depends. To the salesman, the one that breaks, of course, so we can sell more of them. If it does not break, I can only sell it once.
So... long rant short: until we can just drive to the store, buy a Linux PC off the shelf, and start using it, there will be poor market share, and we'll be fighting FUD at the workplace.
> Linux hasn't hit the desktop market because you can not just go to a store and buy one.
Remember the 90s, when you could do that; at least the smaller shops used to have some Linux CDs. What happened, and why is no one seemingly interested in selling it? What I mean is, the smaller hardware shops that are still around could sell NVMe disks preloaded with Linux distros; they could also sell enclosures so that users would be able to run it via USB 3 as a portable operating system. Let's face it, dual booting is dead; those with an interest in Linux who can't escape the Windows trap could maybe run it on portable storage.
I do remember buying the install media back in the day; you used to be able to get RedHat, Suse, and FreeBSD (if you were a Unix purist) too. The problem then was the same problem as right now: you can't buy a pre-loaded Linux PC at the store.
Imagine if RedHat back in 96-97 could have got HP, Dell, Compaq, etc... to sell a pre-loaded product. How many more people would have been exposed to Linux? Good or bad, exposure is exposure.
Linux, as in Android, already won the market. Desktop OSes... are all bad atm imo, for different reasons. Desktop Linux is missing many things to be a real choice and can't win until that changes:
* a standard application format, working across all the distros - with isolation, permissions and stuff (see Android's .apk files, snap, flatpak, AppImage etc.)
* fault-proof system updates. atm it's possible to break something via a regular update and it'll cost you dearly. maybe win-style restore points/backups.
* a unified desktop experience, not the current shitshow where every single app can draw its very own shaped buttons
Agreed. Giving ISVs one format to deliver their apps to all the distributions will be a big help. To the distros as well - you don't lose users to Ubuntu because Spotify is only distributed as a deb.
For those unfamiliar with "snaps", here's a quick primer:
Google is actively working on their own kernel for their consumer facing operating systems. https://en.m.wikipedia.org/wiki/Google_Fuchsia So that "win" is going to be relatively short-lived.
There's more than one possible reason for this move, but I can't help thinking that escaping the GPL 2 with its (hardware and software) patent grab might be the biggest single reason.
Did you mean GPL3? If not, would appreciate a link on GPL2 patent issues.
GPL2. Did you Google?
Google "gpl 2 patent clause"
First result, for me:
Patent clauses in software licences - software patents wiki (en.swpat.org)
Jump to GNU GPL v2 - (See: GPLv2 and patents). See section 6 and section 7. Patent attorney Dan Ravicher argues that GPLv2 includes an implicit ...
Thanks. Had not seen patent language in GPLv2, but it was implied, https://copyleft.org/guide/comprehensive-gpl-guidech7.html
> The GPLv2, despite being silent with respect to patents, actually confers on its licensees more rights to a licensor’s patents than those licenses that purport to address the issue. This is the case because patent law, under the doctrine of implied license, gives to each distributee of a patented article a license from the distributor to practice any patent claims owned or held by the distributor that cover the distributed article. The implied license also extends to any patent claims owned or held by the distributor that cover “reasonably contemplated uses” of the patented article.
But there's no COM on Linux. How can I magically port my software across if I do not have access to all the wonders of COM, eg. MS XML and have to rewrite my software from scratch?
At least with Microsoft you have support for APIs for a long, long time; e.g. MFC is old but it is still supported.
With Linux you are on your own.
So "get your software on Linux" is easy to say but in the real world it isn't so easy.
Use winelib if you want a quick and dirty port
That's very dismissive.
Why? I was providing a solution. There's nothing wrong with using winelib if you want to port Windows software to linux.
There isn't anything wrong with using Wine, but tossing it out there as the cure-all for Windows-based applications is reckless. I haven't played with Wine in quite some time, but it has a track record of "working" rather than working.
> Operating systems are hard, drop Windows and sell your software on Linux. We all know it's inevitable in the long run.
Soo... how do I run Linux on a new Dell XPS or a Surface Book without updating the kernel to the newest 4.x branch? Because last I checked, Linux forced me to update to a new version (including userspace, since new kernels weren't compatible with old userspace) to run on new hardware as well.
Do you have a link to info about that kernel/userspace incompatibility? I thought Linus' mantra was "we don't break userspace".
Nonsense. It's perfectly plain that the message-passing microkernel architecture and superior userland design of HURD will lead to its inevitable supremacy.
I don't think anyone believed it was anything but a business decision so this is not really shocking.
Operating systems are hard, drop Windows and sell your software on Linux. We all know it's inevitable in the long run.
I doubt it, this isn't exactly revolutionary. All he did was modify a DLL to return a 1 for the "is_os_supported" function instead of what it used to.
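In spirit the patch is just a find-and-replace on the DLL's bytes. A generic sketch of that operation - the pattern and replacement below are made-up illustrations (a hypothetical `xor eax, eax; ret` becoming `mov al, 1; ret`); the real file, offsets, and bytes are in Zeffy's repo:

```python
from pathlib import Path

def patch_bytes(path: str, pattern: bytes, replacement: bytes) -> bool:
    """Replace the first occurrence of `pattern` in the file with
    `replacement` (same length), returning True if a patch was made."""
    assert len(pattern) == len(replacement)
    data = Path(path).read_bytes()
    idx = data.find(pattern)
    if idx < 0:
        return False
    Path(path).write_bytes(data[:idx] + replacement + data[idx + len(pattern):])
    return True

# Hypothetical usage - NOT the actual bytes from wuaueng.dll:
# patch_bytes("wuaueng.dll", b"\x31\xc0\xc3", b"\xb0\x01\xc3")
```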
The question is: why would you want to put yourself through that level of hell? I don't mean working for Microsoft, that would be pretty amazing! What I mean is, why would anyone want to become a specialist in Windows Update and winsxs technologies?
Seriously, it's kludge after kludge. Windows 7 uses the WSUS protocol (which is a hairball set of web services) to figure out which updates it needs to apply - which it does by recursively querying what base packages are on the system, and which goes some way to explaining the incredibly slow updates people see... It gets its latest package list, but after Microsoft realised how many security patches they were releasing, they discovered their CAB file format needed to be rearchitected because it couldn't hold enough files... hence WSUSSCN2.CAB must now be downloaded, along with something like a third or fourth attempt at getting the Windows Update agent written correctly.
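That recursive "what's installed, what does it unlock" evaluation can be sketched as a fixed-point computation - a toy model of the applicability walk, not the actual WSUS protocol or its data formats:

```python
def applicable_updates(installed: set[str], prerequisites: dict[str, set[str]]) -> set[str]:
    """Return the updates whose prerequisite chains are fully satisfied,
    iterating until a fixed point - each pass may unlock further updates,
    which is one reason the scan can take so long."""
    applicable: set[str] = set()
    changed = True
    while changed:
        changed = False
        for update, prereqs in prerequisites.items():
            if update in applicable or update in installed:
                continue
            if prereqs <= installed | applicable:
                applicable.add(update)
                changed = True
    return applicable

print(sorted(applicable_updates(
    {"KB-base"},
    {"KB-1": {"KB-base"}, "KB-2": {"KB-1"}, "KB-3": {"KB-missing"}},
)))  # ['KB-1', 'KB-2']
```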
But it gets worse, because somehow it must check what it has downloaded against a gigantic (over 1GB) opaque ESE edb file, which it synchronises with the SoftwareDistribution cache in the Windows directory. Here it looks up the Component Based Servicing registry, along with a COMPONENTS hive that only the Windows Update service loads and that you generally can't see when you open Regedit. Except that there is a set of keys in HKLM\Software\Microsoft\Windows\CurrentVersion\Component Based Servicing - consisting of an ApplicabilityCache, a list of packages, a package index, a set of package detection keys, and a set of component detection keys.
Once you begin to decipher the current and applicable state, you must work out how it relates to the package index list in the registry, which in turn somehow relates to the packages keys, which have their own interesting binary values that Microsoft set...
So once you've worked that out, you need to decipher the manifest files in the Windows Side-by-Side system. These are signed with CAT files, and usually have Microsoft Update metadata files that go along with the actual payload files. Somehow these relate to a set of session XML files, which are meant to help you troubleshoot when things go wrong and package states go awry. The trouble is the XML format isn't documented, except in a few tantalising blogs which aren't in any way complete, and some of which seem to be Microsoft developers reverse engineering the file format themselves...
WinSxS itself holds every file ever installed by Windows Update in the %systemroot%\winsxs folder, which is a bunch of folders with NTFS hardlinks back to the C:\Windows\system32 files. Microsoft originally wanted to see the state of an installation, so they decided that when the links to newer files were updated they would keep the old files around - that way they could know the system state and presumably allow rolling forwards and backwards to a snapshot in time. They reasoned this was OK because they released the dism.exe tool to remove hotfixes and updates from previous service packs. Unfortunately that switch never got used, because DISM got released in Windows 7 SP1 and somebody at Microsoft decided to go with rolling updates and no more service packs. Consequently there are often 8 or 9 GB of unneeded and unused files on most Windows 7 systems (depending on the age of the system and how frequently it has been updated)... and evidently someone at Microsoft realised this, because about 4 or 5 months ago they released an update for, of all things, the Windows Disk Cleanup Wizard to remove these old packages. You must run it, then reboot, and depending on the number of packages it removes it has been known to leave users stuck at "100% of updates have been installed" for 15 to 45 minutes whilst it cranks through the cleanup process. Of course most end users think their PC has crashed and Windows is "stuck", so they reboot it halfway through, to varying results.
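The hardlink trick means the winsxs folders mostly don't cost extra disk: the same on-disk file appears under two names. A quick way to see the mechanism (a POSIX sketch with made-up filenames; NTFS hardlinks behave analogously):

```python
import os
import tempfile

# Create a file and a hard link to it, then show both names share one inode -
# this is how winsxs\... and system32\... can both "contain" the same file.
d = tempfile.mkdtemp()
original = os.path.join(d, "library.dll")
link = os.path.join(d, "library_v2.dll")
with open(original, "wb") as f:
    f.write(b"payload")
os.link(original, link)            # second directory entry, same data blocks
print(os.stat(original).st_nlink)  # 2 - two names reference the file
print(os.stat(original).st_ino == os.stat(link).st_ino)  # True
```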
There are three different tools to check for corruption - the sfc utility, the dism utility, and a variety of downloadable Windows diagnostics, which are a bunch of PowerShell or VBScript scripts that attempt to slowly fix the plethora of issues that can prevent Microsoft Update from working.
No, kernel reverse engineering is fun, but poking around Windows Update is not. I wouldn't recommend getting hired to untangle it...
Disk cleanup wizard WinSXS addon for W7 SP1 was released in October 2013...
Oh, I stand corrected then!
Wow, I got all that out and look at the typos!
Thank you for taking the time, typos & all. I was unaware Disk Cleanup now has the ability to clean that ginormous mess up! Am off to attempt to 'not crash' an old bloated laptop.
When Mark Russinovich did something like this, he got a job offer from Microsoft. Maybe Zeffy has a future on the Windows team?