How large are modern computers?
Large as measured by the number of elementary building blocks:
- # of transistors in hardware
- # of lines of code in software
- # of interconnected systems
- # of asynchronous interactions
# of transistors in a chip:
- Oracle (Sun) SPARC M7: 10 x 10^9 (2015, 32 cores)
- Intel 10-core Xeon Haswell-E5: 5.56 x 10^9 (2014)
- AMD Fiji GPU: 8.9 x 10^9
- Apple A7: >10^9
- Apple A9: 2 x 10^9
- Xilinx Virtex-UltraScale XCVU440: 20 x 10^9 (2014)
- Samsung 128 Gb DRAM: 137 x 10^9
Why do people always say Moore's Law will be dead in 10 years? (Think about driving on a foggy, unknown road: you only get to see the 10 feet ahead.)
- The more ``complex'' the design, the fewer transistors per chip. From the most complex to the least: ASIC (application-specific integrated circuit), general-purpose processor, GPU/DSP, FPGA, and memory/storage.
- Designers tend to use the increasingly available transistors in a brute-force way: adding or replicating regular structures. Think of cache, which often accounts for the majority of the transistors on a chip, and of multi-core processors and GPUs, which replicate cores and processing units, respectively.
- The design time is bounded by time to market. Most semiconductor products have a short shelf life: they become obsolete in a few years, thanks to Moore's Law.
- While the price of a transistor drops exponentially according to Moore's Law, the price of a human designer, measured in $/hour, remains relatively stable. The US federal minimum wage has been relatively stable over its history, with a small dip, perhaps due to globalization. Therefore, from capital's perspective, transistors are increasingly cheap while designers are not, so it makes sense to use transistors in a brute-force way that does not require much designer time: adding cache and replicating cores.
- Designers have bounded rationality. Given the same time to market, they can use more transistors only in simpler designs, such as memory and FPGA.
- The slowdown of Moore's Law and the desire for energy efficiency (especially in battery-powered systems), on the other hand, are incentivizing ``smarter'' ways of using transistors: application-specific integrated circuits (ASICs) built into the chip to implement common tasks efficiently. For example, speech coding, audio coding, and face detection are all done by dedicated hardware IP modules in the application processor of today's smartphones. As a result, modern processors are becoming heterogeneous systems-on-a-chip.
The economics of Moore's Law
- Moore's Law implies that the cost per transistor drops exponentially from one technology node to the next (because a chip of roughly the same cost carries exponentially more transistors).
- Rock's Law says that the cost of the fabrication plant increases exponentially from one technology node to the next.
- Rock's Law dictated the emergence of the foundry model in the semiconductor industry: many fabless companies (and their markets) amortize the growing cost of fabrication plants.
- Rock's Law implies a bigger market is needed for each new process technology node (and its fabrication plant).
- Globalization grew the markets for the semiconductor industry, helping sustain Moore's Law.
- The fundamental reason for a slowing Moore's Law is economic: since the market cannot grow spatially anymore (via globalization and population growth), it has to grow temporally (via a longer lifetime of fabrication plants and semiconductor products). A toy numeric model of these two laws follows this list.
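To make the interplay concrete, below is a minimal sketch in Python. Every number in it (starting transistor count, fab cost, chips sold per node, and the doubling periods) is an illustrative assumption, not industry data:

```python
# Toy model of the two laws above. All numbers are made up for illustration.
# Moore's Law: transistors per chip double roughly every 2 years (one node).
# Rock's Law: fab cost doubles roughly every 4 years, i.e., sqrt(2) per node.

transistors_per_chip = 1e9   # assumed starting point
fab_cost = 1e9               # assumed plant cost, in dollars
chips_sold = 1e7             # assumed chips over which the plant is amortized

for node in range(5):        # five successive technology nodes
    cost_per_transistor = fab_cost / (chips_sold * transistors_per_chip)
    print(f"node {node}: ${cost_per_transistor:.2e}/transistor, "
          f"fab cost ${fab_cost:.2e} (the market must cover at least this)")
    transistors_per_chip *= 2       # Moore's Law: 2x per node
    fab_cost *= 2 ** 0.5            # Rock's Law: 2x per 4 years
```

Under these assumptions, the cost per transistor falls at every node while the fab cost, and hence the market needed to amortize it, keeps rising; once the market stops growing spatially, each plant must earn its cost back over a longer lifetime.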
Difference between Hardware and Software in Complexity
Hardware has embraced testing and verification earlier and better than software, thanks to the high cost of fabricating and revising hardware.
- Hardware complexity is bounded by fabrication; software complexity is almost unbounded
- Replicating/distributing software is almost free
- Revising software is easier; the Internet makes distributing revisions easier; the cloud makes it possible to hide software revisions from end users.
Source lines of code:
- Recent releases of Windows: about 50 M
- Recent Linux kernel: 20 M
- Recent Linux distribution (Debian 7): 420 M
- A premium car: 100 M (70 to 100 ECUs)
How many lines of code can we write per day?
- Depends on the complexity of the code: OS code >> compiler >> applications
- Quote from The Mythical Man-Month (MMM) by Turing Award winner Frederick Brooks: ``effort = constant x (# of instructions)^1.5'' (Chapter 8, ``Calling the Shot''). A numeric check of this exponent follows this list.
- Independent of the programming language used
- ~10 lines per man-day according to MMM, four decades ago
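A quick check of what the 1.5 exponent implies; the constant cancels in any ratio, so its value does not matter:

```python
# Brooks' rule of thumb: effort = c * n**1.5 for a program of n instructions.
# The superlinear exponent means bigger programs cost disproportionately more.

def effort(n_instructions, c=1.0):
    """Estimated effort, in arbitrary units, to write n_instructions."""
    return c * n_instructions ** 1.5

# Ratios are independent of the constant c (and of the language, per MMM).
print(effort(2000) / effort(1000))   # 2**1.5 ~ 2.83: 2x the code, ~3x the effort
print(effort(10000) / effort(1000))  # 10**1.5 ~ 31.6: 10x the code, ~32x the effort
```

This superlinearity is one way to read the low lines-per-day figure: the larger the program, the fewer finished lines a person-day buys.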
Maurice Wilkes, a computer pioneer and Turing Award winner, famously said, ``I well remember when this realization first came on me with full force...that a good part of the remainder of my life was going to be spent in finding errors in my own programs.''
Mathematical notes about ``large''
Larger can be simpler
- Central limit theorem: the average of a large number of independent, identically distributed random variables tends to have a ``simple'' (approximately normal) distribution. (A simulation sketch follows this list.)
- Law of large numbers: the average of a large number of repeated experiments should approach the expected value.
- Massive MIMO, a key technology for 5G wireless, argues for a large number of antennas on cellular base stations. A key motivation is that with a large number of antennas, a ``simpler'' algorithm achieves the capacity.
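A minimal simulation of the first two points, using Uniform(0, 1) draws (an assumed choice; any i.i.d. distribution with finite variance behaves the same way):

```python
import random
import statistics

def sample_mean(n):
    """Average of n i.i.d. Uniform(0, 1) draws; expected value is 0.5."""
    return sum(random.random() for _ in range(n)) / n

# Law of large numbers: the sample mean approaches the expected value 0.5.
for n in (10, 1000, 100000):
    print(n, sample_mean(n))

# Central limit theorem: across many repetitions, the sample means cluster
# around 0.5, roughly normally, with standard deviation sqrt(1/12) / sqrt(n).
means = [sample_mean(1000) for _ in range(2000)]
print(statistics.mean(means), statistics.stdev(means))  # ~0.5, ~0.0091
```

The ``simplicity'' is in the limit: whatever the underlying distribution, the average is described by just a mean and a shrinking variance.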
Different sizes of infinities
How large is infinite? A set can have an infinite number of members, but not all infinite sets are equal in size. The size of an infinite set is measured by its cardinality. Georg Cantor proved that the set of natural numbers has a lower cardinality than the set of real numbers. He also advanced the famous continuum hypothesis, which postulates that there is ``no set whose cardinality is strictly between that of the natural numbers and that of the real numbers.''
Georg Cantor used the diagonal argument to prove that no one-to-one correspondence can be established between the set of natural numbers and the set of real numbers. The same approach is behind Alan Turing's proof that there are problems that are NOT computable, which shows that the answer to the Entscheidungsproblem, posed by David Hilbert in 1928, is no. It also points out a limitation of modern computers, including hardware, programming languages, and software, and of human languages: in a much simplified sense, they are limited because a one-to-one correspondence can be established between the set of all possible computers, the set of all programs, the set of English texts, and the set of natural numbers.
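A finite illustration of the diagonal construction (a sketch of the idea, not a proof; the rows below are a made-up truncation of a claimed complete enumeration of infinite binary sequences):

```python
# Cantor's diagonal argument: given any enumeration of binary sequences,
# flipping the diagonal yields a sequence that differs from the k-th
# sequence at position k, so it appears nowhere in the enumeration.

enumeration = [          # pretend these are the first 5 rows of an
    [0, 1, 0, 1, 0],     # enumeration claimed to contain every sequence
    [1, 1, 1, 0, 0],
    [0, 0, 0, 0, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 1, 0, 1],
]

# Flip the k-th bit of the k-th sequence.
diagonal = [1 - enumeration[k][k] for k in range(len(enumeration))]
print(diagonal)  # [1, 0, 1, 0, 0]

for k, row in enumerate(enumeration):
    assert diagonal[k] != row[k]  # differs from row k at position k
```

The same self-reference trick, applied to an enumeration of all programs, is what yields Turing's uncomputable problems.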
Is there a limit to how large a system we can build?
Pessimism vs. Optimism as explained in The Beginning of Infinity: Explanations That Transform the World by David Deutsch.
- Pessimism says there is a hard limit to our understanding of the world and, as a result, to the complexity of the systems we can build.
- Optimism says there is no such limit, and that humanity has already discovered powerful methods toward infinite progress:
- The scientific method for finding a better explanation.
- Liberal democracy for getting rid of bad leaders/policies.
Note about population
As we attempt to understand the end of Moore's Law, the growth of operating systems, and the power of open source, we need to pay attention to human factors, especially population. The number of developers matters for a software project; the number of paying customers matters for a product, e.g., processors; the number of users matters for user interface design (the more users, the easier the user interface must be).
Readings related to our discussion of population