For the better part of my adult life, I've been dealing with the myth that if certain settings could speed up Windows, Microsoft would have made them the default for the operating system. The pundits who perpetuate this myth have their reasons for doing so, but that doesn't make it true.
The difference is harder to notice today than it was when I started my career. There are things I can do to make Windows 7 run better on my 4-core, 3.1 GHz AMD64 box with 8 GB of RAM and a 100 GB SSD, but I won't notice the cumulative effect of a few 5% improvements on that box. Not the way I did on 50 MHz 80486-based PCs in 1997.
Microsoft's philosophy for 22 years, from Windows 1.0 in 1985 to Windows Vista in 2007, was to write the software, and if it took a few years for the hardware to catch up, so be it. Windows 7 changed that: for the first time, the actual requirements for running a new version of Windows went down. And with Windows 8, it looks like CPU requirements will hold steady and memory usage will actually go down.
David Farquhar is a computer security professional, entrepreneur, and author. He started his career as a part-time computer technician in 1994, worked his way up to system administrator by 1997, and has specialized in vulnerability management since 2013. He invests in real estate on the side and his hobbies include O gauge trains, baseball cards, and retro computers and video games. A University of Missouri graduate, he holds CISSP and Security+ certifications. He lives in St. Louis with his family.