Tuesday, March 22, 2011

The story of Sissa and Moore


NP-complete problems
(Chapter 8, by S. Dasgupta, C.H. Papadimitriou, and U.V. Vazirani)

Source: Algorithms (2007 McGraw-Hill Higher Education) by
Sanjoy Dasgupta, University of California - San Diego
Christos Papadimitriou, University of California at Berkeley
Umesh Vazirani, University of California at Berkeley

Note: This book evolved over the past ten years from a set of lecture notes developed while teaching the undergraduate Algorithms course at Berkeley and U.C. San Diego.

- http://cseweb.ucsd.edu/~dasgupta/book/index.html
- http://highered.mcgraw-hill.com/sites/0073523402/


The story of Sissa and Moore

Ref.: www.cs.berkeley.edu/~vazirani/algorithms/chap8.pdf

"According to the legend, the game of chess was invented by the Brahmin Sissa to amuse and teach his king. Asked by the grateful monarch what he wanted in return, the wise man requested that the king place one grain of rice in the rst square of the chessboard, two in the second, four in the third, and so on, doubling the amount of rice up to the 64th square. The king agreed on the spot, and as a result he was the rst person to learn the valuable-albeit humbling-lesson of exponential growth. Sissa's request amounted to
264-1 = 18,446,744,073,709,551,615 grains of rice, enough rice to pave all of India several times over!
All over nature, from colonies of bacteria to cells in a fetus, we see systems that grow exponentially, for a while. In 1798, the British philosopher T. Robert Malthus published an essay in which he predicted that the exponential growth (he called it "geometric growth") of the human population would soon deplete linearly growing resources, an argument that influenced Charles Darwin deeply. Malthus knew the fundamental fact that an exponential sooner or later takes over any polynomial.

In 1965, computer chip pioneer Gordon E. Moore noticed that transistor density in chips had doubled every year in the early 1960s, and he predicted that this trend would continue. This prediction, moderated to a doubling every 18 months and extended to computer speed, is known as Moore's law. It has held remarkably well for 40 years. And these are the two root causes of the explosion of information technology in the past decades: Moore's law and efficient algorithms.

It would appear that Moore's law provides a disincentive for developing polynomial algorithms. After all, if an algorithm is exponential, why not wait it out until Moore's law makes it feasible? But in reality the exact opposite happens: Moore's law is a huge incentive for developing efficient algorithms, because such algorithms are needed in order to take advantage of the exponential increase in computer speed.

Here is why. If, for example, an O(2^n) algorithm for Boolean satisfiability (SAT) were given an hour to run, it would have solved instances with 25 variables back in 1975, 31 variables on the faster computers available in 1985, 38 variables in 1995, and about 45 variables with today's machines. Quite a bit of progress, except that each extra variable requires a year and a half's wait, while the appetite of applications (many of which are, ironically, related to computer design) grows much faster. In contrast, the size of the instances solved by an O(n) or O(n log n) algorithm would be multiplied by a factor of about 100 each decade. In the case of an O(n^2) algorithm, the instance size solvable in a fixed time would be multiplied by about 10 each decade. Even an O(n^6) algorithm, polynomial yet unappetizing, would more than double the size of the instances solved each decade. When it comes to the growth of the size of problems we can attack with an algorithm, we have a reversal: exponential algorithms make polynomially slow progress, while polynomial algorithms advance exponentially fast! For Moore's law to be reflected in the world we need efficient algorithms.

As Sissa and Malthus knew very well, exponential expansion cannot be sustained indefinitely in our finite world. Bacterial colonies run out of food; chips hit the atomic scale. Moore's law will stop doubling the speed of our computers within a decade or two. And then progress will depend on algorithmic ingenuity, or otherwise perhaps on novel ideas such as quantum computation, explored in Chapter 10.
"

New: S. Dasgupta, Two faces of active learning, Theoretical Computer Science, 412(19): 1767-1781, 2011
- http://cseweb.ucsd.edu/~dasgupta/papers/twoface.pdf


Algorithmic Thinking
"Algorithms are the engines of a great majority of systems, natural and artificial alike. This course introduces algorithmic thinking as a discipline for reasoning about systems, taming their complexities, and elucidating their properties. Algorithmic techniques, along with their correctness and efficiency, will be taught through reasoning about systems of interactions, such as markets, that are ubiquitous in our highly connected world."
(Luay Nakhleh, Course: Algorithmic Thinking, COMP 182, www.clear.rice.edu/comp182/)

Laboratory activities are essential to learning. Teachers should combine lecture classes with computer laboratory classes more effectively, so that students work through the complete cycle of solving a problem with the computer: the problem --> the mathematical model --> the algorithm --> the program --> the computer --> the solutions --> checking the result (see the sketch after the references below). There is an equivalence between Algorithm, Program, and Computer System: the operations carried out by the algorithm in a virtual space (cyberspace) are performed by the Computer System when it executes the program corresponding to the algorithm, and all the operations simulated by the algorithm are triggered by launching the execution of that program (Vlada, 2004): Abordarea modernă a conceptului de algoritm [The modern approach to the concept of algorithm], CNIV-2004, Noi tehnologii de E-Learning, Conferinţa Naţională de Învăţământ Virtual, Software Educaţional, Ediţia a II-a, [online], Facultatea de Matematică şi Informatică, Universitatea din Bucureşti, Editura Universităţii din Bucureşti,
http://fmi.unibuc.ro/ro/cniv_2004/
Ref.: http://fmi.unibuc.ro/ro/pdf/2004/cniv/gandirealgoritmica.pdf
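As a minimal illustration of the cycle above (the example problem is ours, chosen only for illustration, not taken from the cited paper): the problem is to sum the integers from 1 to n; the mathematical model is the identity 1 + 2 + ... + n = n(n+1)/2; the algorithm accumulates the terms in a loop; the program below runs on the computer and its solution is checked against the model:

    # Problem: sum the integers 1..n.
    # Mathematical model: 1 + 2 + ... + n = n*(n+1)/2.

    def sum_first_n(n):
        """Algorithm: accumulate the terms 1, 2, ..., n in a loop."""
        total = 0
        for k in range(1, n + 1):
            total += k
        return total

    # Program -> computer -> solution -> check the result against the model.
    n = 100
    solution = sum_first_n(n)
    assert solution == n * (n + 1) // 2
    print(solution)  # 5050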
