History of Computers - ASCII

ASCII, an acronym for American Standard Code for Information Interchange, is a way of encoding text. ASCII assigns a binary code to each of 128 characters, so every character can be represented in 7 bits.

Overview

ASCII began as a project to standardize codes for teleprinters, machines that would automatically type the message that was being sent to them. The ASCII list contains 128 characters, broken down as follows:

Decimal codes   Description
0-31            Non-printable control characters, such as Tab and Return[1]
32-64           Space, the digits, and printable symbols; generally the numbers and the symbols above the numbers on a keyboard[1]
65-90           The uppercase alphabet, A to Z[1]
91-96           A few more symbols[1]
97-122          The lowercase alphabet, a to z[1]
123-126         The rest of the symbols that appear on a standard keyboard[1]
127             Delete[1]

The whole set is available at AsciiTable.com.[1]

Although the codes only run up to 127, the numbering starts at 0, which brings the total to 128 characters.
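The printable part of the table can be reproduced directly from the codes. The following short Python sketch (illustrative, not part of the original article) prints the printable range in rows of 16, which makes the groupings above easy to see:

    # Print the printable ASCII range (32-126) in rows of 16.
    for row_start in range(32, 127, 16):
        row = [chr(code) for code in range(row_start, min(row_start + 16, 127))]
        print(f"{row_start:3}-{min(row_start + 15, 126):3}:", " ".join(row))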


Though the layout may seem random, much thought went into deciding where each symbol goes. For example, it is easy to encode a number in ASCII: the code for a digit is the binary prefix "011" followed by the digit's value in four bits. Also, when you lay out the ASCII table in rows of 16, above many characters sits the symbol you would get by pressing Shift plus that key. For example, the code for "#" is 16 less than the code for "3" (so it appears directly above it in a table with rows of 16), and "#" is Shift+3 on a keyboard.
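Both patterns are easy to verify. The following Python sketch (illustrative, not part of the original article) checks the "011" prefix rule for the digits and the 16-code offset between "3" and "#":

    # A digit's 7-bit ASCII code is the prefix 011 followed by the
    # digit's value in four bits: '5' -> 011 0101 (decimal 53).
    for digit in "0123456789":
        code = ord(digit)
        assert code == 0b0110000 | int(digit)  # 011 prefix + 4-bit value
        print(digit, format(code, "07b"))

    # A shifted symbol sits exactly 16 codes below its key:
    # '#' (35) is Shift+'3' (51), and 51 - 16 == 35.
    assert ord("3") - 16 == ord("#")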

Originally, the committee considered using 6, 7, or 8 bits.[2] Six bits would have been the cheapest, but it would have required a "Shift" code to reach all the symbols, and if a "Shift" was not transmitted properly, most of the message after it could become unreadable. Eight bits was seriously considered because it would work well with machines that used quartets (4 bits) and bytes, and it would allow many more characters to be assigned values. In the end, however, seven bits was chosen because it was the minimum number of bits that allowed an acceptable number of characters.[2] Using 8 bits would have made it noticeably more costly to send messages: if a person sent five 140-character messages (pretty short messages) a day using 8-bit encoding, then over a year they would have transmitted over a quarter of a million more bits than with 7-bit encoding. Machines that worked in 8 bits, the committee reasoned, could use the eighth bit as an error-checking bit (parity bit) or simply leave it at 0.[2]
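The cost arithmetic is simple to check, and the parity idea is simple to sketch. The Python below is illustrative (the five-messages-a-day figure comes from the example above; the even-parity helper is one common way such a bit could be computed, not a detail from the source):

    # One extra bit per character, using the example's message volume.
    extra_bits = 5 * 140 * 1 * 365
    print(extra_bits)  # 255500 -- over a quarter of a million bits a year

    # The spare eighth bit could instead carry an even-parity check:
    def with_even_parity(code7: int) -> int:
        parity = bin(code7).count("1") % 2  # 1 if the 7-bit code has an odd number of ones
        return (parity << 7) | code7

    print(format(with_even_parity(ord("C")), "08b"))  # 'C' = 1000011 -> 11000011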

Significance

Since computers can only store data as 1s and 0s, and binary directly represents only numbers, there needed to be a "key" to convert a letter into a number and then into binary. The ASCII table was the first such key to be widely adopted for computers. Though it is no longer as dominant as it once was, at the end of 2007 it was still the most common encoding for text files on the internet. More importantly, ASCII had a large impact on Unicode, the encoding standard in common use today. Unicode has a higher bit depth (it uses more bits to store each character), so it can represent many characters that ASCII cannot, including characters from Chinese, Ancient Persian, and many other languages.[3] Yet the standard English characters and symbols are still in Unicode, laid out in almost exactly the same way as in ASCII.
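This compatibility can be seen directly. A short Python snippet (illustrative, not from the original article; UTF-8 is used here as the common Unicode encoding) shows that ASCII text keeps the same codes:

    # The first 128 Unicode code points are identical to ASCII,
    # so plain ASCII text encodes to the same bytes either way.
    text = "Hello, ASCII!"
    assert text.encode("utf-8") == text.encode("ascii")
    print([ord(c) for c in "ABC"])  # [65, 66, 67] -- the ASCII codes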

With the standardization that ASCII provided, all teleprinters could communicate with each other without compatibility problems. This made communication much more reliable: because the encoding was standardized, there was a near guarantee that a message would not arrive as gibberish.


Links

http://AsciiTable.com

http://www.pcmag.com/encyclopedia_term/0,2542,t=ASCII&i=38012,00.asp

History of Computers - Binary Arithmetic

References

  1. http://AsciiTable.com
  2. Charles E. Mackenzie (1980). Coded Character Sets, History and Development.
  3. http://www.unicode.org/faq/utf_bom.html#utf16-6

How Computers Work by Ron White

http://edition.cnn.com/TECH/computing/9907/06/1963.idg/index.html

http://www.wps.com/projects/codes/index.html