Definition: standards - character codes


The character code built into the computer determines how each letter, digit or special character ($, %, #, etc.) is represented in binary code. Fortunately, there are only two methods in wide use: EBCDIC and ASCII. IBM's mainframes and midrange systems use EBCDIC. ASCII is used for everything else, including PCs and Macs.
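
For example, the letter "A" comes out as a different byte on each kind of system. The following is a minimal sketch using Python's built-in codecs; the article does not name a specific EBCDIC variant, so IBM's common cp037 (EBCDIC US/Canada) code page is assumed here:

      # Encode a few characters in both character codes and show the bits.
      for ch in "A$+":
          ascii_byte = ch.encode("ascii")[0]    # ASCII value of the character
          ebcdic_byte = ch.encode("cp037")[0]   # cp037 = one common EBCDIC page
          print(f"{ch!r}  ASCII {ascii_byte:08b}  EBCDIC {ebcdic_byte:08b}")

Running this prints, for "A", ASCII 01000001 and EBCDIC 11000001, which matches the sample table below.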

ASCII is a 7-bit code placed into an 8-bit storage unit. The seven bits provide the basic set of 128 ASCII characters, and the eighth bit adds room for another 128 symbols, which vary from font to font and system to system. For example, the DOS character set uses the upper 128 for line-drawing and foreign-language characters, while the ANSI character set uses them for foreign languages and math and publishing symbols (copyright, trademark, etc.). On the Mac, the upper 128 characters can be custom drawn.
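
To see how the upper 128 differ, the same byte can be decoded with different character sets. A minimal sketch, again in Python, assuming code page 437 for the DOS set and Windows code page 1252 as a stand-in for ANSI:

      # The same byte value above 127 means different things in different sets.
      b = bytes([0xB3])
      print(b.decode("cp437"))               # '│'  DOS line-drawing character
      print(b.decode("cp1252"))              # '³'  ANSI superscript three
      print(bytes([0xA9]).decode("cp1252"))  # '©'  ANSI copyright symbol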

When applications and data are moved from one type of computer to the other, converting between ASCII and EBCDIC is just a small part of the data conversion process. It is done in conjunction with converting file formats from the old system to the new one. The following is a sample of ASCII and EBCDIC code; a short conversion sketch follows the table. See ASCII chart, hex chart, EBCDIC chart and standards.

      Character   ASCII     EBCDIC

      space       00100000  01000000
      period      00101110  01001011
      < sign      00111100  01001100
      + sign      00101011  01001110
      $ sign      00100100  01011011
      A           01000001  11000001
      B           01000010  11000010
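
A minimal sketch of the conversion step itself, assuming cp037 EBCDIC source data and hypothetical file names; characters with no ASCII equivalent are replaced with "?":

      # Read EBCDIC bytes, decode them to text, re-encode as ASCII.
      with open("mainframe.dat", "rb") as old_file:        # hypothetical input
          text = old_file.read().decode("cp037")
      with open("converted.dat", "wb") as new_file:        # hypothetical output
          new_file.write(text.encode("ascii", errors="replace"))

In a real migration, this byte-for-byte step would be wrapped in the larger job of converting the file formats themselves.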