Last Updated on November 22, 2018 by Dave Farquhar
PC Magazine is advocating a bring-your-own-laptop, bring-your-own-software approach to business, likening it to mechanics who bring their own tools.
The trouble is that while mechanical tools in a toolbox operate autonomously and don’t interfere with one another, software residing on a computer does.
I’m sure some people are tired of me bringing up the 1990s, but it’s relevant. At the start of my career, I was expected to support three office suites, four operating systems and two hardware platforms. There weren’t enough of us to go around to keep things working.
The worst example was when a department of about seven employees soaked up the totality of 1.5 IT employees’ billable hours over the course of one fiscal year. Their support costs exceeded $200,000, and since the bulk of those calls came from three users, it’s likely that the computer support costs for those three employees exceeded their salaries.
You may think I’m kidding, but when we came to work, it wasn’t a question of whether we’d spend the day visiting one of them; the question was which of them, and whether it would be more than one. One of the techs spent so much after-hours time fixing their machines that on at least one occasion he ended up sleeping in the sick room.
It got so far out of hand that we talked them into an experiment. We gave those employees identical Micron PCs running Windows 2000 and our standard suite that every other department was using. Their designers got both a PC and a brand-new Macintosh. Those Macs were identically configured as well. Productivity soared and downtime plummeted to nearly zero. And the department made money for the first time in its history.
I called it an experiment, but it wasn’t much of one, really. I noticed very early in my career that departments with oddball computers had more problems than departments that replaced all their machines with identical PCs at once. This started as a cost-saving measure because most makers would cut my employers a deal if they bought in bulk, but the benefits of standardization were so immediate and so immense that anyone could see them. I noticed the pattern in my first job at the University of Missouri, and I brought the practice with me when I moved to St. Louis, where I saw it work on a large scale on a network with 1,000 PCs on it.
Lest you think this was a 1990s problem, at my last job we had multiple types of HP machines running XP and Windows 7, with combinations of Office 2007 and 2010. Although the problems weren’t as bad as in the 1990s, we had a great deal of trouble with document inconsistency. The formatting would get messed up, embedded files would fail to open for some users, and other nagging problems cropped up. There were enough tech-savvy people in our department that we could deal with the problem; I doubt the non-technical departments could have.
Consistency makes it much easier to find and deal with problems. You build a machine using the current year’s hardware, load all the company’s standard software on it, run some rudimentary tests to make sure everything behaves correctly, then fix any problems before making your image and deploying it. In the unlikely event that a problem surfaces afterward, you fix it once and the same solution works everywhere else. In these austere times when everyone’s pushing for people to do more with less, this is the way you do more with less on the desktop. It works.
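The fix-it-once principle falls out of the math: on a standard image, every machine either matches the baseline or shows you exactly what drifted. Here’s a minimal sketch of that idea in Python; the manifest entries and version strings are hypothetical examples, not any real company’s build list.

```python
# Sketch: compare a machine's software inventory against the standard
# image's manifest. All product names and versions below are made up.

STANDARD_IMAGE = {
    "os": "Windows 2000",
    "office_suite": "Office 2000",
    "antivirus": "1.4.2",
}

def find_drift(inventory: dict) -> dict:
    """Return {item: (expected, found)} wherever a machine differs
    from the standard image. An empty dict means the machine matches."""
    return {
        key: (expected, inventory.get(key))
        for key, expected in STANDARD_IMAGE.items()
        if inventory.get(key) != expected
    }

# A machine built from the image reports no drift; an oddball machine
# tells the tech exactly what to fix.
standard_pc = {"os": "Windows 2000", "office_suite": "Office 2000", "antivirus": "1.4.2"}
oddball_pc = {"os": "Windows 2000", "office_suite": "Office 97", "antivirus": "1.4.2"}

print(find_drift(standard_pc))  # {}
print(find_drift(oddball_pc))   # {'office_suite': ('Office 2000', 'Office 97')}
```

With a hundred oddball configurations, that comparison is meaningless because there is no baseline to compare against; standardization is what makes the check possible at all.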
Inconsistency can quickly cause support costs to spiral much higher than the cost of the machines. I know, because I watched it happen and I fixed it.
So while letting people bring in whatever they want may sound like a great way to increase productivity and save money, we went down that road 15 years ago. I was there and saw it happen, and it was a disaster. A very expensive, very painful disaster. It’s a form of what we now call Shadow IT.
It’s a problem from a security standpoint too. You knew I’d go there. Under this scenario, nothing stops people from bringing in their outdated, no-longer-updateable, vulnerable copies of software like Microsoft Office 2000–some people prefer that version, you know–and Adobe Creative Suite 2. It’s trivially easy to embed a reverse telnet applet in a document that exploits old Microsoft and Adobe software–a sixth grader could do it–and then all it takes is for someone to send it, entice the recipient to open it, and boom, they’re on your network. Imagine a former colleague who knows that Bob prefers Office 2000 and that he’s now working at a competitor. Send the poisoned document to Bob, Bob opens it, and boom, you have a competitor sitting on your network. The way you fix it is with vulnerability management and patch management.
In this era when all it takes is above-average computer skills and below-average morals to bust through security measures, it’s time to get stricter about what’s loaded on computers. Not less strict.
David Farquhar is a computer security professional, entrepreneur, and author. He started his career as a part-time computer technician in 1994, worked his way up to system administrator by 1997, and has specialized in vulnerability management since 2013. He invests in real estate on the side and his hobbies include O gauge trains, baseball cards, and retro computers and video games. A University of Missouri graduate, he holds CISSP and Security+ certifications. He lives in St. Louis with his family.