The update is already installed on this system

I had an update on my system in a partially installed state. Our vulnerability scanner determined one file, MSO.dll, was still out of date. It recommended a patch to apply. Running it gave me an error message. Here’s what to do when Windows says the update is already installed on this system and refuses to let you do anything but click OK.

Because hey, from a security analyst’s point of view, this is anything but OK. I get questions about patches in a partially deployed state all the time, so I figured I’d write about it.

Read more

Benefits of defragmenting your computer hard drive

It’s been a long time since I’ve seen someone explain the benefits of defragmenting your computer hard drive. I do see a lot of misconceptions out there. I explained defragmenting in my 1999 book, so I’ll explain it again.

Part of the misconception is that things have changed. The tools have changed, yes. But the need hasn’t.

Read more

Share a Windows 10 printer by UNC

Windows 10 uses homegroups, but if you have systems that don’t understand homegroups and you want to share a Windows 10 printer by UNC (the old-school way to share a network printer), it’s not obvious how to go about it.

I couldn’t find a way from the GUI, but it’s still possible to share the printer from a command line.
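For illustration, here’s roughly what the command-line approach looks like. The printer name, share name, and machine name below are placeholders, not the ones from my setup:

```shell
REM On the Windows 10 machine that owns the printer (elevated prompt).
REM Printer and share names are placeholders.
powershell -Command "Set-Printer -Name 'HP LaserJet' -Shared $true -ShareName 'LaserJet'"

REM On a client machine, connect to the share by its UNC path:
rundll32 printui.dll,PrintUIEntry /in /n \\DESKTOP-PC\LaserJet
```

Once the share exists, any machine that can reach `\\DESKTOP-PC\LaserJet` can print to it, homegroups or not.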

Read more

Microsoft looks back at MS08-067

The most infamous Microsoft patch of all time, in security circles at least, is MS08-067. As the name suggests, it was the 67th security update that Microsoft released in 2008. Less obviously, it fixed a huge problem in a file called netapi32.dll. Of course, 2008 was a long time ago in computing circles, but not far enough. I still hear stories about production servers that are missing MS08-067.

Last week, Microsoft took a look back at MS08-067, sharing some of its own war stories, including how they uncovered the vulnerability, developed a fix, and deployed it quickly. It’s unclear who besides Microsoft knew about the problem at the time, but one must assume others were aware of it and using it. They certainly were after the fall of 2008.

Read more

So just how dangerous is an old, out of date operating system anyway?

Glaurung brought up a good point in a comment yesterday. If you never go online and/or you’re really careful, do you really need to update your OS to something new?

In my professional opinion, it depends. Didn’t you know that would be my answer?

Read more

Using antivirus to deliver a virus

A coworker tipped me off the other day to how it’s possible to use a certain major-brand antivirus to infect a computer. “I didn’t have admin rights,” I overheard him explaining, “So I got them with [redacted] antivirus.”

My head spun around violently. “You did what?”

“Google ‘confused deputy persistence,’” he deadpanned. “It’s the first result.” Then he went back to explaining the problem at hand.

Read more

The contractor who built systems via P2P

Today I was helping one of my coworkers study for the Security+ exam, and one of his study questions reminded me of a story.

I wrote a few days ago about spending some time in an unhealthy IT shop. One of my cohorts supported one of the departments that decided to outsource its IT to a contractor, rather than use the internal IT department. The hand-off didn’t exactly go as it should.

Read more

Happy Patch Tuesday, September 2011

Microsoft has five updates and Adobe has two for us on this fine Patch Tuesday, in addition to a patch Mozilla pushed out for Firefox last week.

Don’t get too complacent if you run something other than Windows. If you run Microsoft Office on a Mac, or Adobe Reader or Acrobat on a Mac, or Adobe Reader on Unix or Linux, you’re vulnerable. The vulnerabilities in those affected products are more serious than the vulnerabilities for Windows. So keep that in mind. Don’t be smug about security. It’ll bite you.

Read more

The best way to optimize your firewall: Use hardware

Let’s get back to talking about utility replacements. We last talked about antivirus programs, but what about the other component of what’s commonly now called a “security suite,” the firewall?

The answer is: don’t use firewall software if at all possible. Use a separate device instead, and that goes for every man, woman and child with a cable or DSL connection.

There are several good reasons for this. First, there’s the fundamental problem with running your security on the same system you’re trying to protect. If your firewall software goes haywire and crashes, you run the risk of being unprotected. It’s much safer to rely on an external device that doesn’t have an Intel or AMD processor in it and isn’t running Windows. So when someone tries to send a Windows exploit or virus to it, it bounces off because the device just doesn’t understand it.

The second reason is price. A plain no-frills cable/DSL router/firewall costs about $20 at Newegg today. The unit I generally recommend is the Linksys WRT54G, which sells for about $50 new or as little as $25 used and adds wireless capability. That’s about the same as the retail price of a software firewall anyway, and it gives you better protection without robbing your system of performance.

A cheaper alternative, which was what I used to do when these devices cost $200, was to take an obsolete PC, put in a couple of cheap network cards, and run Freesco on it. It will run on any PC with a 386 processor or better (I recommend a Pentium with PCI slots for ease of setup). A 100 MHz Pentium is more than powerful enough and if you don’t already have an obsolete PC to run it on, you probably won’t have to ask around very long before finding one for a very low price or free. Today I prefer a Linksys-type box though, since they take less space, consume less electricity, generate less heat and noise, and take less time to set up.
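To give you a feel for what a box like that does, here’s a minimal two-NIC NAT firewall sketch in the spirit of a Freesco machine. The interface names are assumptions, and this uses iptables rather than Freesco’s own setup:

```shell
# Minimal two-NIC NAT/firewall sketch (eth0 = WAN side, eth1 = LAN side).
echo 1 > /proc/sys/net/ipv4/ip_forward                 # let the box route packets
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE   # hide the LAN behind one address
iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT          # allow outbound traffic from the LAN
iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A FORWARD -i eth0 -o eth1 -j DROP            # drop unsolicited inbound traffic
```

The last rule is the whole point: anything from the outside that isn’t a reply to something you asked for just gets dropped.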

Performance is the third reason. Two years ago I was working at a large broadband ISP that will remain nameless. It provides a “high speed security suite” as part of the subscription price. The system requirements for this suite are ridiculous–the suite itself needs anywhere from 128 to 192 megabytes of RAM all to itself to function. Basically, if you have a PC with 256 megs of RAM (which is what a fair number of PCs out there still have), loading this security suite on it will bring it to its knees. But if your firewall is running on a separate device, 256 megs of RAM is a comfortable amount of memory to run Windows XP or 2000 and basic applications.

Reliability is the fourth reason. Every high-speed security suite I’ve ever dealt with, be it a freebie provided by your ISP, or an off-the-shelf suite, hooks itself into winsock.dll. Three of the last four computer problems I’ve fixed have been related to this problem, and the symptoms are difficult to diagnose unless you’ve seen the problem before. Basically the computer loses any and all ability to do any networking, but when you call tech support, enough things work that tech support will probably tell you to reload your operating system. Unfortunately, the WinSockFix utility doesn’t seem to be well-known at ISPs.
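For what it’s worth, on Windows XP SP2 and later there’s a built-in way to reset a mangled Winsock, which covers the same ground as the WinSockFix utility on newer systems:

```shell
REM Resets the Winsock catalog to a clean state; reboot afterward.
REM (On pre-SP2 systems you still need a tool like WinSockFix.)
netsh winsock reset
```

It’s a one-liner, which makes it all the stranger that so few ISP tech support scripts mention it.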

If messing around with your Winsock isn’t bad enough, the security suite my former employer provided was overly paranoid about piracy. If you did any number of things, including but not limited to trying to install it on a second PC without getting a second key from the ISP, it would disable itself and not necessarily warn the user that it had left the PC unprotected. It was my job, when I was working there, to go through all of the disabled accounts by hand. It wasn’t an automated process. So if the security suite decided to go jump off a cliff sometime on Friday after I’d pulled the current report, it would be sometime on Monday before I would even be aware of the problem. Given that it usually takes about 20 minutes for some exploit to find an unprotected Windows box sitting on the Internet, that 48-72 hour window that you could be sitting unprotected is anything but ideal.

Things may have changed since I left that employer in November 2005, but if it’s my PC, I’m not willing to risk it. I’d much rather spend $20-$50 on a cable/DSL router to give myself firewall protection that I know I can just set up once and then ignore for a few years and won’t cause my PC to constantly fall behind on the upgrade treadmill.

And finally, the fifth reason to use a hardware firewall is apathy. Software firewalls tend to throw a lot of popups at the user, warning the user that this or that is trying to access the Internet, or come in, or whatever. Most users are likely to do one of two things: either allow everything or deny everything. The result is either a PC on which nothing works, or whose firewall is full of so many holes there might as well not be one. It’s much better to have a hardware firewall that just does its job. If you’re worried about unauthorized applications hitting the Internet, that’s the job of antivirus and antispyware software, not the firewall.

If I had my own Linux distribution

I found an interesting editorial called If I had my own Linux Distro. He’s got some good ideas, but I wish he’d known what he was talking about on some others.

He says it should be based on FreeBSD because it boots faster than Linux. I thought everyone knew that Unix boot time has very little to do with the kernel? A kernel will boot more slowly if it’s trying to detect too much hardware, but the big factor in boot time is init, not the kernel. BSD’s init is much faster than SysV-style init. Linux distros that use BSD-style inits (Slackware, and optionally, Debian, and, as far as I understand, Gentoo) boot much faster than systems that use a traditional System V-style init. I recently converted a Debian box to use runit, and the decrease in boot time and increase in available memory at boot was noticeable. Unfortunately now the system doesn’t shut down properly. But it proves the concept.
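Part of why runit is so fast is how little there is to it. A service is just a directory containing an executable script named run; the supervisor starts them all in parallel instead of plodding through numbered rc scripts. A sketch, with a made-up service name and a local directory standing in for /etc/sv:

```shell
# A minimal runit service definition ("mydaemon" is hypothetical).
mkdir -p sv/mydaemon
cat > sv/mydaemon/run <<'EOF'
#!/bin/sh
exec 2>&1                            # merge stderr into the supervised log
exec /usr/sbin/mydaemon --foreground # must not daemonize; runit tracks the PID
EOF
chmod +x sv/mydaemon/run
# On a real system you'd symlink sv/mydaemon into /etc/service,
# and runsvdir picks it up within a few seconds.
```

Compare that to a SysV init script with its start/stop/restart case statement and PID-file bookkeeping, and the boot-time difference stops being mysterious.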

He talks about installing every possible library to eliminate dependency problems. Better idea: Scrap RPM and use apt (like Debian and its derivatives) or a ports-style system like Gentoo. The only time I’ve seen dependency issues crop up in Debian was on a system that had an out of date glibc installed, in which case you solve the issue by either keeping the distribution up to date, or updating glibc prior to installing the package that fails. These problems are exceedingly rare, by the way. In systems like Gentoo, they don’t happen because the installation script downloads and compiles everything necessary.
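The day-to-day experience on an apt-based system looks something like this (package name chosen just for the example):

```shell
# apt resolves and fetches dependencies itself; no hunting down RPMs.
apt-get update
apt-get install abiword     # pulls in whatever libraries it needs in one step
apt-cache depends abiword   # inspect what it depends on
apt-get -f install          # repairs a broken dependency state, should one occur
```

That last command is about as bad as dependency trouble gets on Debian, which is the point.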

Debian’s and Gentoo’s solution is far more elegant than his proposal: Installing everything possible isn’t going to solve your issue when glibc is the problem. Blindly replacing glibc was a problem in the past. The problems that caused that are hopefully solved now, but they’re beyond the control of any single distribution, and given the choice between having a new install stomp on glibc and break something old or an error message, I’ll take the error message. Especially since I can clear the issue with an apt-get install glibc. (Then when an old application breaks, it’s my fault, not the operating system’s.)

In all fairness, dependency issues crop up in Windows all the time: When people talk about DLL Hell, they’re talking about dependency problems. It’s a different name for the same problem. On Macintoshes, the equivalent problem was extensions conflicts. For some reason, people don’t hold Linux to the same standard they hold Windows and Macs to. People complain, but when was the last time you heard someone say Windows or Mac OS wasn’t ready for the desktop, or the server room, or the enterprise, or your widowed great aunt?

He also talks about not worrying about bloat. I take issue with that. When it’s possible to make a graphical Linux distribution that fits on a handful of floppies, there’s no reason not to make a system smooth and fast. That means you do a lot of things. Compile for an advanced architecture and use the -O3 option. Use an advanced compiler like GCC 3.2 or Intel’s ICC 7.0 while you’re at it. Prelink the binaries. Use a fast-booting init and a high-performance system logger. Mount filesystems with the highest-performing options by default. Partition off /var and /tmp so those directories don’t fragment the rest of your filesystem. Linux can outperform other operating systems on like hardware, so it should.

But when you do those things, then it necessarily follows that people are going to want to run your distribution on marginal hardware, and you can’t count on marginal hardware having a 20-gig hard drive. It’s possible to give people the basic utilities, XFree86, a reasonably slick window manager or environment, and the apps everyone wants (word processing, e-mail, personal finance, a web browser, instant messaging, a media player, a graphics viewer, a few card games, and–I’ll say it–file sharing) in a few hundred megabytes. So why not give it to them?

I guess all of this brings up the nicest thing about Linux. All the source code to anything desirable and all the tools are out there, so a person with vision can take them and build the ultimate distribution with it.

Yes, the idea is tempting.
