ASCII
(American Standard Code for Information Interchange) Pronounced "ask-ee," it is the built-in binary code for representing characters in all computers except IBM mainframes, which use the EBCDIC coding system. ASCII was originally developed for communications and uses only seven bits per character, providing 128 combinations that include the upper- and lowercase letters, the numeric digits, and special symbols such as $ and %. The first 32 characters are set aside for communications and printer control (see ASCII chart).
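The mapping between characters and their 7-bit codes can be inspected in most languages; a brief Python sketch:

```python
# ASCII assigns each character a number from 0 to 127.
# Python's ord() and chr() expose that mapping directly.
for ch in ("A", "a", "0", "$", "%"):
    print(ch, "->", ord(ch))

# The first 32 codes (0-31) are control characters, not printable
# symbols; for example, code 10 is the line feed (LF).
assert chr(10) == "\n"
assert max(ord(c) for c in "Hello, World!") < 128  # pure ASCII text
```

The same values appear in any ASCII chart; `ord("A")` is 65, and the lowercase letters start 32 positions later at 97.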
A Byte Holds ASCII and Then Some
Since the common storage unit in a computer is an 8-bit byte (256 character combinations) and ASCII uses only the first 128 values (0-127), the second set of 128 values (128-255) is technically not ASCII, but is typically used for foreign-language and math symbols. In early PCs running DOS, these values also included elementary graphics symbols. On the Mac, the additional values can be defined by the user.
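The distinction between the ASCII half of the byte and the upper 128 values can be demonstrated directly; a small sketch in Python, where the `latin-1` code page stands in for one of the many legacy interpretations of the upper range:

```python
# Byte values 0-127 are ASCII; 128-255 are not.
data = bytes([65, 200])          # 65 = "A" (ASCII), 200 is outside ASCII

assert data[:1].decode("ascii") == "A"

try:
    data.decode("ascii")         # the byte 200 has no ASCII meaning
except UnicodeDecodeError:
    print("values above 127 are not ASCII")

# Legacy character sets each define the upper range their own way;
# Latin-1, for example, maps byte 200 to the accented letter È.
print(data.decode("latin-1"))
```

Because each system historically defined codes 128-255 differently, the same file could display different symbols on a DOS PC and a Mac, which is one of the problems Unicode was created to solve.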
ASCII vs. Hex
In technical applications typically used by developers, you may have a choice between entering data in ASCII or "hex" for editing or searching. ASCII is entered by typing regular text, but because there are not enough keys on the keyboard to enter 256 distinct characters, the hexadecimal (hex) numbering system is used. Hex is entered by typing only the digits 0 to 9 and the letters A to F, and it provides a precise way of defining any of the 256 possible values in a byte, whether they are control codes (0-31) or the upper 128 (128-255). See hex chart, ASCII file and Unicode.
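The two entry modes describe the same bytes; a short Python sketch of converting between them, of the kind a hex editor performs internally:

```python
# The same data viewed two ways: as ASCII text and as hex digits.
text = "Hi!"
hex_form = text.encode("ascii").hex()
print(hex_form)                  # "486921": H=48, i=69, !=21 (hex)

# Hex input converts back to the identical bytes.
assert bytes.fromhex("486921").decode("ascii") == "Hi!"

# Hex can also express bytes with no printable ASCII form,
# such as the control codes 0-31.
ctrl = bytes.fromhex("001f")
print(list(ctrl))                # byte values 0 and 31
```

Each byte becomes exactly two hex digits, which is why hex editors show data in pairs such as `48 69 21`.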