
letters and numbers for computer programmers

Computers, as many people know, at root work with numbers in the binary (base 2) system. This is because the on-off nature of digital circuits maps very easily to a series of 1s and 0s.

How, then, do we get letters (and other symbols), and fractions, and very large numbers, out of the thing? We need ways of encoding them in binary.

For example, we might all agree that 01000001 represents 'A'. It wasn't too hard back in the days when we only let Americans use computers (I kid, I kid) and so had a relatively small number of letters, numerals, and punctuation marks to account for; but as we started to deal with accented characters, and typographical symbols like ©, and then Chinese and Japanese and Arabic and...well, it got complicated, and it's something that programmers often get wrong. Gobbledygook all too often shows up in web pages and other documents.
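To make this concrete, here is a minimal sketch in Python (my choice of language for illustration, not anything from the original post) showing both the agreed-upon mapping for 'A' and how decoding bytes with the wrong encoding produces exactly the gobbledygook described above:

```python
text = "café ©"

# Encoding turns text into bytes; non-ASCII characters like é and ©
# become multi-byte sequences in UTF-8.
utf8_bytes = text.encode("utf-8")

# Decoding with the matching encoding round-trips cleanly...
print(utf8_bytes.decode("utf-8"))    # café ©

# ...but decoding the same UTF-8 bytes as Latin-1 yields mojibake,
# the classic gobbledygook seen in broken web pages.
print(utf8_bytes.decode("latin-1"))  # cafÃ© Â©

# And the binary agreement mentioned above: 'A' is 65, i.e. 01000001.
print(format(ord("A"), "08b"))       # 01000001
```

The mangled output is not random: each UTF-8 byte of a multi-byte character is being misread as a separate Latin-1 character.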

And on the math side, rounding errors continue to be a problem in many applications.
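A quick Python demonstration of why those rounding errors happen: 0.1 has no exact representation in binary floating point, so arithmetic on it carries a tiny error. (The use of the standard-library `decimal` module as a fix is my illustrative suggestion, not from the original post.)

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so the
# familiar sum comes out slightly off.
print(0.1 + 0.2)          # 0.30000000000000004
print(0.1 + 0.2 == 0.3)   # False

# Exact-decimal arithmetic, e.g. for money, avoids the trap.
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

The error is tiny here, but in code that accumulates many such operations it can grow into the visible bugs mentioned above.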

So here are two useful guides that everyone who writes software ought to read:

