Exploring a vast array of topics related to computation, Computing: A Historical and Technical Perspective covers the historical and technical foundations of computing, from antiquity to the modern day. The book starts with the earliest references to counting by humans, introduces various number systems, and discusses mathematics in early civilizations. It then guides readers through the latest advances in computer science, such as the design and analysis of computer algorithms.
Through historical accounts, brief technical explanations, and examples, the book answers a host of questions, including:
- Why do humans count differently from the way current electronic computers do?
- Why are there 24 hours in a day and 60 minutes in an hour?
- Who invented numbers, when were they invented, and why are there different kinds?
- How far back do secret writing and cryptography date in ancient civilizations?
Innumerable individuals from many cultures have contributed their talents and creativity to shape what has become our mathematical and computing heritage. By bringing together the historical and technical aspects of computing, this book enables readers to gain a deep appreciation of the field's long evolution over thousands of years. Suitable as a supplement in undergraduate courses, it also provides a self-contained historical reference for anyone interested in this important and evolving field.
Table of Contents
Chapter 1: The Dawn of Counting
Chapter 2: Representation of Numbers
Chapter 3: Rational and Irrational Numbers
Chapter 4: Prime Numbers
Chapter 5: Euclid’s Elements
Chapter 6: Diophantus of Alexandria and Arithmetica
Chapter 7: Secret Writing in Ancient Civilization
Chapter 8: The Abacus
Chapter 9: Book of Calculation by Fibonacci
Chapter 10: Decimal Fractions and Logarithms
Chapter 11: Calculating Machines
Chapter 12: Solutions to Algebraic Equations
Chapter 13: Real and Complex Numbers
Chapter 14: Cardinality
Chapter 15: Boolean Algebras and Applications
Chapter 16: Computability and Its Limitations
Chapter 17: Cryptography from the Medieval to the Modern Ages
Chapter 18: Electronic Computers
Chapter 19: Numerical Methods
Chapter 20: Modular Arithmetic
Chapter 21: Cybernetics and Information Theory
Chapter 22: Error Detecting and Correcting Codes
Chapter 23: Automata and Formal Languages
Chapter 24: Artificial Intelligence
Chapter 25: Programming Languages
Chapter 26: Algorithms and Computational Complexity
Chapter 27: The Design of Computer Algorithms
Chapter 28: Parallel and Distributed Computing
Chapter 29: Computer Networks
Chapter 30: Public-Key Cryptography
Chapter 31: Quantum Computing