CRT burn in is a phenomenon where an image becomes permanently etched into a monitor’s phosphors. This causes the outline of the image to remain visible, even when the monitor is off. It’s a problem that we frequently find on vintage CRT monitors today.
Burn in is most frequently associated with CRTs, though it can happen with other display types. Newer technologies such as LCD and LED usually include mitigations, so burn in rarely happens on modern displays.
What caused CRT burn in
CRT burn in resulted from displaying the same image on a screen for too long without changing it. In the early days of television, when a station went off the air, it would usually broadcast a test pattern. That way, if you turned on your TV and the station wasn’t on the air, you at least knew your TV worked. The problem was, if you left your TV on overnight, that pattern could burn into the screen.
The design of tubes improved over the years, and that, combined with stations staying on the air longer, made CRT burn in less common on televisions. The problem reappeared when computers came into use. When you find a vintage CRT computer monitor that saw heavy use, burn in can give you a good clue about how it spent the bulk of its life. While interesting from a historical perspective, it’s usually distracting if you want to actually use the monitor. It’s not the ideal kind of provenance, certainly.
Preventing CRT burn in
Computer and video game system manufacturers were aware of the problem of CRT burn in. Atari, in particular, was concerned about damaging your television set, and took measures to prevent it. They saw getting a reputation for breaking your TV as an existential threat. Most first-party Atari 2600 games kept the screen moving, and would change the colors on the screen every couple of minutes. When you see an Atari 2600 game running in demo mode and choosing weird and unattractive screen colors, that’s why. It’s just trying to protect your TV. Atari home computers would cycle the colors on the screen if you went more than a few minutes without touching the keyboard.
Not every computer manufacturer thought of this, though. Computer magazines would advise you to turn off your monitor or TV if you stepped away from your computer for more than a few minutes, but that didn’t help people who missed that issue or didn’t happen to read that far. But word about CRT burn in did travel. When I went to college in the early 1990s, computer literacy wasn’t something we could take for granted, but a surprising number of people knew about screen burn. The introductory computer classes most students took their first year covered it, and no one seemed to have trouble remembering the concept.
The rise of screen savers
Screen savers appeared in the 1980s to try to prevent this problem. And for a time, especially in the early 90s, they were hugely popular. While effective, a lot of computer technicians didn’t like them because they could cause other problems. Many of them weren’t especially well written, and some were laced with spyware. I advised people to use screen blankers, which displayed a blank screen, rather than elaborate screen savers. This avoided the security problems and gave the CRT phosphors a chance to actually rest. Constantly displaying an image still wore the phosphors out and degraded them, resulting in a dimmer picture at the very least. Here’s my anti-screen saver treatise from 2004.
Refresh rate and CRT burn in
The other thing that made CRT burn in less common was increased refresh rate. The refresh rate is how quickly the monitor redraws the whole screen. In the 1980s and early 90s, monitors refreshed at the same rate as televisions: 60 Hz in regions that use NTSC video, and 50 Hz in regions that use PAL.
But refreshing at a higher rate reduces eyestrain. So by the mid 90s, pricier computer monitors could optionally run at rates higher than 50 or 60 Hz. Refreshing the screen at higher rates had the side effect of reducing CRT burn in.
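To put those numbers in perspective, the time the monitor spends on each full redraw is just the reciprocal of the refresh rate. A quick sketch (my own illustration, not from any monitor specification):

```python
# One full redraw of the screen takes 1/rate seconds,
# so a higher refresh rate means less dwell time per frame.
def refresh_interval_ms(rate_hz: float) -> float:
    """Milliseconds between full-screen redraws at a given refresh rate."""
    return 1000.0 / rate_hz

for rate in (50, 60, 72, 85):
    print(f"{rate} Hz -> {refresh_interval_ms(rate):.1f} ms per redraw")
```

At 50 Hz the screen is redrawn every 20 ms; at 85 Hz, roughly every 11.8 ms.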
Why CRT burn in happened if everyone knew about it
So, if everyone knew about CRT burn in by the early 90s, why do so many surviving CRTs have it? Knowing how to prevent it and actually preventing it everywhere are two different things. People generally did a good job of taking care of the monitors they personally owned, and of the monitors on their computers at work. It was the monitors in common areas that tended to be neglected and become susceptible to burn in. Monitors on servers come to mind. A monitor on a computer set up on a desk in a corner of the lounge for anyone to use might get screen burn. A monitor in a waiting area that was on all day displaying status messages might, too.
This was a small percentage of the monitors in use at any given time, but it does seem to be an unusually high percentage of the CRT monitors that survive today. It seems like 10 percent of the CRT monitors that vintage computing YouTubers check out on camera after acquiring them have screen burn. As someone who worked in the computer industry in the late 90s, I can tell you I didn’t see screen burn on 1 in 10 monitors. It was more like 5 in 100.
When a monitor’s display went dim from heavy use, we could turn up the brightness and/or contrast to get a bit more use out of it. But once that no longer worked, we discarded the monitor. A spare monitor with screen burn could sit around for years with little use, so it was less likely to wear out. I think that contributes to their high survival rate.
CRT burn in fix
CRT burn in is permanent, so the only fix is to replace the tube in the display. Replacing tubes was once a common occurrence. In the 1983 comedy film Mr. Mom, a television repair woman replaces a tube in a TV that Michael Keaton’s character had kicked and broken. The idea of being able to call someone to come out to your house and replace a tube in whatever random 13-inch TV you happen to own is absurd today. But in 1983, it was no more unusual than calling an electrician and expecting the electrician to be able to replace a circuit breaker. Everyone knew it was an expensive and avoidable repair, but nothing about the scene seemed unusual.
Today, finding someone who can fix a CRT is difficult, and replacement parts are scarce. There was a shop near me that would repair CRTs, but it closed sometime around 2017. It may surprise you that it was still in business that recently. It held a liquidation sale when it closed, and its parts inventory scattered to the winds. And sadly, few niche businesses like it that were still operating at the beginning of 2020 survived to the end of the year.
Not all CRTs are interchangeable, but it is sometimes possible to swap a tube from one monitor or TV to another one of the same size. I’m not qualified to do those repairs or advise you on attempting such a swap. The best I can suggest is that you find a local retro computing enthusiasts group on social media. Most major cities have at least one such group. If there’s someone in your area who does such repairs, groups like that are probably the best place to find them.
Screen burn on newer display types
Screen burn doesn’t only affect CRTs. At the job I worked from 2005 to 2009, we had several large plasma displays in our helpdesk area running CA Unicenter, a status monitoring system. When all was well, the system icons on the display were green. If a system was partially down, its icon turned yellow. And if a system was completely down, it turned red. If we were having a good day, the colors wouldn’t change. So those expensive plasma displays got burn in. We did find that if we powered a display off and left it for a few hours, the screen could heal. So eventually we added an extra display so we could rotate the screens around and keep one of them off at all times.
LCD and LED displays are less susceptible to burn in, but displaying the same image for long periods can cause individual pixels to malfunction. Different devices have different ways of handling this. Most phones or tablets will shift the display a single pixel in a random direction periodically while in use. This protects the display without being too distracting. Phones or tablets will also blank the screen after a period of inactivity. This protects the screen and increases battery life. Computer operating systems still come with screen savers, and typically have one enabled by default. Modern operating systems will frequently display a screen saver for a period of time, then blank the screen after a further period of inactivity. This increases the useful life of the display and saves power, which is an important consideration for a laptop running on batteries.
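The pixel-shift idea can be sketched in a few lines. This is a hypothetical illustration of the technique, not any device’s actual firmware: on a timer, nudge the frame’s drawing origin one pixel in a random direction, clamped to a small window, so static elements never sit on exactly the same pixels indefinitely.

```python
import random

# Hypothetical sketch of periodic pixel shifting. The whole frame is
# drawn at an (x, y) offset that moves one pixel in a random direction
# on each tick, clamped so the image never drifts far from center.
DIRECTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def next_offset(x: int, y: int, max_shift: int = 2) -> tuple[int, int]:
    """Return the next frame offset, shifted one pixel at random."""
    dx, dy = random.choice(DIRECTIONS)
    nx = max(-max_shift, min(max_shift, x + dx))
    ny = max(-max_shift, min(max_shift, y + dy))
    return nx, ny

# Simulate a few shift ticks starting from a centered frame.
x, y = 0, 0
for _ in range(5):
    x, y = next_offset(x, y)
```

Real implementations live in the display driver and time the shifts to be imperceptible; the clamping is what keeps the movement from being distracting.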