The Big Picture
As with many histories, it is customary to start from the very origin or inspiration; this website, however, begins with the first true computers (machines we would, to some extent, recognize as computers today), not simple calculating machines. There are also two broad types of computers: analog and digital. Digital computers dominate today, but in the early years of computing, analog was usually the way to go. So, what's the difference? An analog computer represents information with continuously varying physical quantities, such as voltages or gear positions, while a digital computer converts information into binary code (the 0s and 1s). The difference may seem subtle; however, encoding information in binary lets a computer store, copy, and process it exactly, without the gradual degradation that analog signals suffer. That is one of the many reasons why computers have kept getting smaller over the course of computer history.
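To make the idea concrete, here is a minimal modern sketch in Python (purely illustrative; no historical machine worked this way) showing how a digital computer can turn text into binary code and back without losing anything:

    # Encode the text "HI" as binary using the ASCII standard,
    # then decode the bits to recover the original text exactly.
    text = "HI"
    bits = " ".join(format(byte, "08b") for byte in text.encode("ascii"))
    print(bits)  # 01001000 01001001

    decoded = bytes(int(b, 2) for b in bits.split()).decode("ascii")
    print(decoded)  # HI

Because those 0s and 1s can be copied onto ever-smaller hardware and still be read back perfectly, digital storage scales in a way analog representations cannot.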
1820-1900
Between 1820 and 1900, one can find the first mass-produced calculators, Charles Babbage's Analytical Engine, and the Felt and Tarrant Comptometer.
1900-1940
From 1900 to 1940, one can find the IBM schoolhouse, the famous code breakers at Bletchley Park, and George Stibitz's ideas for a relay-based computer.
1940-1970
Within this time period, one can find giant vacuum-tube computers like ENIAC, computer information standards like ASCII, and Intel's first microprocessor.
1970-2000
When looking at this time period, one can find the Apple II, the IBM PC, and the IBM Simon.