REBLOG: For half a century computer performance has roughly doubled every two years, but the laws of physics impose insurmountable limits on how long that growth can continue.
In April 1965, a young researcher named Gordon Moore wrote a short article for the now-defunct Electronics Magazine pointing out that each year, the number of transistors that could be economically crammed onto an integrated circuit roughly doubled. Moore predicted that this trend of cost-effective miniaturization would continue for quite some time.
Three years later, in 1968, Moore co-founded Intel Corporation with Robert Noyce. Today, Intel is the largest producer of semiconductor computer chips in the world, and Moore is a multi-billionaire. All this can be traced back to the semiconductor industry’s vigorous effort to realize Moore’s prediction, which is now known as “Moore’s Law.”
There are several variations of Moore’s Law—for instance, some formulations measure hard disk storage, while others concern power consumption or the size and density of components on a computer chip. Yet whatever their metric, nearly all versions still chart exponential growth, which translates into a doubling in computer performance every 18 to 24 months. This runaway profusion of powerful, cheap computation has transformed every sector of modern society—and has sparked utopian speculations about futures where our growing technological prowess creates intelligent machines, conquers death, and bestows near-omniscient awareness. Thus, efforts to understand the limitations of this accelerating phenomenon outline not only the boundaries of computational progress, but also the prospects for some of humanity’s timeless dreams.
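To make the compounding concrete, here is a minimal Python sketch, not from the original article, that computes the overall performance multiplier implied by the 18-to-24-month doubling range quoted above; the function name and the 10-year horizon are illustrative choices of my own:

```python
# Illustrative sketch of the compound growth implied by Moore's Law.
# Assumption: performance doubles once every `doubling_months` months,
# using the 18-to-24-month range quoted in the article.

def growth_factor(years: float, doubling_months: float) -> float:
    """Overall performance multiplier after `years` of steady doubling."""
    return 2.0 ** (years * 12.0 / doubling_months)

for doubling_months in (18, 24):
    factor = growth_factor(10, doubling_months)
    print(f"Doubling every {doubling_months} months -> {factor:.0f}x after 10 years")

# Output:
# Doubling every 18 months -> 102x after 10 years
# Doubling every 24 months -> 32x after 10 years
```

Even within that quoted range, a six-month difference in the doubling period changes the decade-long outcome by more than a factor of three, which is why the exact pace of Moore's Law matters so much to these speculations.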
Read the entire article at Seed Magazine