Ancient and Medieval Period
Aristotle (384–322 BC):
Developed formal logic, which laid the groundwork for computational logic used in computer programming and software development.
Al-Khwarizmi (c. 780–850 AD):
A Persian mathematician who wrote influential treatises on Hindu-Arabic numerals and algebra, and is often credited as one of the founders of algebra. His works were crucial to the development of algorithms in computer science; indeed, the word “algorithm” derives from the Latinized form of his name.
Renaissance to 19th Century
Gottfried Wilhelm Leibniz (1646–1716):
Documented and championed the binary number system, which is now the basis of all digital computers.
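The core idea of the binary system is that any integer can be written using only the digits 0 and 1. A minimal sketch (the function name is illustrative, not from the source):

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary representation
    by repeatedly taking the remainder modulo 2."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # least-significant bit first
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))  # → 1101
```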
Charles Babbage (1791–1871):
Designed the Analytical Engine, the first design for a general-purpose mechanical computer. He is often called the “father of the computer.”
Ada Lovelace (1815–1852):
Worked with Babbage on the Analytical Engine; her published notes describe what is widely regarded as the first computer program, making her the first computer programmer.
20th Century
Alan Turing (1912–1954):
Formulated the Turing machine, a mathematical model of computation, and proposed the Turing test, laying the fundamental groundwork for modern computer science and artificial intelligence.
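A Turing machine reads and writes symbols on a tape according to a transition table. The sketch below is a simplified illustration of the idea, not Turing's original formalism; the table format and helper names are assumptions for this example:

```python
def run_turing_machine(tape, transitions, state="start", accept="halt",
                       blank="_", max_steps=1000):
    """Simulate a simple one-tape Turing machine.

    `transitions` maps (state, symbol) -> (new_state, symbol_to_write, move),
    where move is "L" or "R". Returns the final tape contents with
    surrounding blanks stripped.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: flip every bit, halting at the first blank cell.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine("1011", flip))  # → 0100
```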
John von Neumann (1903–1957):
Developed the von Neumann architecture, the foundation of most modern computer architectures.
Claude Shannon (1916–2001):
Founded information theory and showed that Boolean algebra could describe digital switching circuits, establishing the mathematical basis of digital communication and computing.
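The central quantity of Shannon's information theory is entropy, H(X) = -Σ p·log₂(p): the average number of bits needed per symbol from a source with a given probability distribution. A small sketch:

```python
import math

def entropy(probs):
    """Shannon entropy in bits for a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))          # → 1.0 (a fair coin carries one bit)
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0 (four equal outcomes need two bits)
```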
Grace Hopper (1906–1992):
Developed the first compiler for a computer programming language (the A-0 system) and popularized the concept of machine-independent programming languages, work that led to the development of COBOL.
Late 20th Century to Present
Donald Knuth (b. 1938):
His multi-volume work “The Art of Computer Programming” is a foundational reference in the field of algorithm analysis.
Tim Berners-Lee (b. 1955):
Invented the World Wide Web, fundamentally changing how people use the internet.
Linus Torvalds (b. 1969):
Created Linux, an open-source operating system kernel that has become foundational software in servers, desktops, and embedded systems worldwide.
Andrew Ng (b. 1976):
Co-founded Coursera and Google Brain; his research and teaching in deep learning and artificial intelligence have shaped the modern AI field.