In the early days of home computing, the two most-used types of displays were television sets and monochrome monitors. I’ll talk about TV sets first.
In the 1980s, TVs were cheaper than monitors. All you needed to do was find the cheapest TV set you could, connect your computer's built-in TV OUT port to an RF switch box attached to the TV's antenna input, tune the TV to channel 3 or 4, slide the switch on the box over to the computer input, and you were good to go.
Standard resolutions on 8-bit home computers were low, so characters and graphics could be easily seen. The Commodore VIC-20's standard resolution was 176×184, the TRS-80 Color Computer's was 256×192 and the Atari 8-bit line's was 320×192. Any of these or other 8-bit computer resolutions showed nice big characters on-screen.
Using a TV was also the cheapest way to compute in color. By the early '80s, color televisions were selling cheap, so most people had no real need for a true computer monitor.
It was only when you went into the PC realm that you actually needed a computer monitor. The Commodore 64 displayed 40 columns of text, which was easily readable on a TV set. An IBM PC, on the other hand, displayed 80 columns, and that was exceedingly difficult to read on a television: at 640 pixels across, you were past the horizontal detail an NTSC television signal could resolve. At that point, you needed a real monitor.
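The column math is easy to sketch. Assuming 8-pixel-wide character cells and roughly 330 resolvable horizontal lines for a composite NTSC color picture (both ballpark assumptions, not figures from the machines' manuals):

```python
# Rough sketch: why 80 columns broke down on an NTSC TV.
# Assumptions (ballpark): 8-pixel-wide character cells, and roughly
# 330 resolvable horizontal lines for a composite NTSC color picture.
CHAR_WIDTH_PX = 8
NTSC_HORIZONTAL_RES = 330

def fits_on_tv(columns: int) -> bool:
    """Return True if a text mode's pixel width fits the TV's resolving power."""
    return columns * CHAR_WIDTH_PX <= NTSC_HORIZONTAL_RES

print(fits_on_tv(40))  # Commodore 64: 40 * 8 = 320 pixels -> readable
print(fits_on_tv(80))  # IBM PC: 80 * 8 = 640 pixels -> blurs together
```

Under those assumptions, 40 columns just squeaks under the limit while 80 columns is nearly double it, which matches what people saw on their TV sets.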
Monochrome, contrary to popular belief, does not mean "only black and white". When referring to a monitor, it literally means "one displayed color": white, gray, amber or green. Most of the early monitors displayed amber or green, with green being dominant, hence the "green screen" monitor.
True black-and-white two-color monitors were most famously found on early Apple Macintosh computers; they had little 9-inch screens that literally could only display black and white. And yes, it was a true "and" because it did display both black and white at the same time. No, it was not grayscale, as the early Macs did not do grays. Any "shade" they showed was faked with patterns of small dots or lines, a technique known as dithering.
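That dot-pattern trick can be sketched in a few lines. This is a minimal ordered-dithering example (not the Mac's actual routine) using a 2×2 Bayer threshold matrix to map 0-255 gray values onto pure black-and-white pixels:

```python
# Minimal ordered-dithering sketch: faking gray on a 1-bit display.
# The 2x2 Bayer matrix staggers thresholds so mid-grays become dot patterns.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither(gray):
    """Map a 2D list of 0-255 gray values to 0 (black) or 255 (white)."""
    out = []
    for y, row in enumerate(gray):
        out_row = []
        for x, value in enumerate(row):
            # Scale the matrix entry into the 0-255 range as a threshold.
            threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) * 255 / 4
            out_row.append(255 if value > threshold else 0)
        out.append(out_row)
    return out

# A flat 50% gray comes out as an alternating checkerboard of dots,
# which the eye blends back into "gray" at a distance.
print(dither([[128, 128], [128, 128]]))  # -> [[255, 0], [0, 255]]
```

Pure black stays black and pure white stays white; only the in-between values get turned into dot patterns, which is exactly the effect those old 1-bit screens relied on.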
Can you emulate the old monochrome experience?
Not exactly, because modern OSes don't offer a true one-color mode. You can, however, easily set your display's saturation to zero to recreate a grayscale experience, which is close enough to monochrome.
Although display control software is different depending on what video card you have, here’s how to recreate a grayscale experience using the Catalyst Control Center from AMD (formerly ATI):
In the Catalyst Control Center, expand the menu for your monitor. Depending on the physical connection it will either be under "My Digital Flat-Panels" or "My VGA Displays":
Click the appropriate setting, then just drag the "Saturation" setting to zero:
You’ll instantly see everything go to grayscale mode when you do this.
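What that slider is doing can be sketched with Python's standard colorsys module: convert each pixel to hue/saturation/value, force the saturation to zero, and convert back. This is a sketch of the principle, not AMD's actual implementation:

```python
import colorsys

def desaturate(r, g, b):
    """Zero a pixel's saturation, mimicking the saturation slider at 0."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    # With saturation forced to zero, hue no longer matters:
    # the result is a neutral gray at the same brightness (value).
    r2, g2, b2 = colorsys.hsv_to_rgb(h, 0.0, v)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

# A strong red collapses to a gray matching its brightest channel.
print(desaturate(200, 40, 40))  # -> (200, 200, 200)
```

With saturation at zero, every pixel ends up with equal red, green and blue channels, which is all "grayscale" means at the signal level.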
Some of you might actually appreciate the ability to "go grayscale" periodically. It works best when typing up documents and emails, when you want fewer distractions.