
Saturday, January 15, 2011

Most Popular Laptops: History, Key Features, Price List



A computer is a machine that manipulates data according to a list of instructions.
The first devices that resemble modern computers date to the mid-20th century (around 1940–1945), although the computer concept and various machines similar to computers existed earlier. Early electronic computers were the size of a large room and consumed as much power as several hundred modern personal computers. Modern computers are based on tiny integrated circuits and are millions to billions of times more capable while occupying a fraction of the space. Today, simple computers may be made small enough to fit into a wristwatch and be powered from a watch battery. Personal computers, in various forms, are icons of the Information Age and are what most people think of as "a computer"; however, the most common form of computer in use today is the embedded computer. Embedded computers are small, simple devices used to control other devices; they may be found in anything from fighter aircraft and industrial robots to digital cameras and children's toys.
The ability to store and execute lists of instructions called programs makes computers extremely versatile and distinguishes them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers with capability and complexity ranging from that of a personal digital assistant to a supercomputer are all able to perform the same computational tasks given enough time and storage capacity.
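As a rough illustration of the stored-program idea described above, here is a minimal sketch (hypothetical, not part of the original text): a single tiny interpreter runs whichever list of instructions it is handed, so the same machine can perform many different tasks simply by being given a different program.

```python
# Minimal sketch of the stored-program idea (illustrative only):
# the same interpreter executes whatever list of instructions it is given.

def run(program, x):
    """Apply a stored list of instructions to an input value."""
    ops = {
        "double": lambda v: v * 2,
        "increment": lambda v: v + 1,
        "square": lambda v: v * v,
    }
    for instruction in program:
        x = ops[instruction](x)
    return x

# Two different "programs" running on the same machine.
print(run(["double", "increment"], 5))   # 11
print(run(["square", "double"], 5))      # 50
```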
 
The era in which computing power doubles every two years is drawing to a close, according to the man behind Moore's Law.

By Jonathan Richards

For decades it has been the benchmark by which advancements in computing are measured. Now Moore's Law, the maxim which states that computers double in speed roughly every two years, has come under threat from none other than the man who coined it.

Gordon Moore, the retired co-founder of Intel, wrote an influential paper in 1965 called 'Cramming More Components onto Integrated Circuits', in which he theorised that the number of transistors on a computer chip would double at a constant rate. Silicon Valley has kept up with his widely accepted maxim for more than 40 years, to the point where a new generation of chips, which Intel will begin to produce next year, will have transistors so tiny that four million of them could fit on the head of a pin.

In an interview yesterday, however, Mr Moore said that by about 2020 his law would come up against a rather intractable stumbling block: the laws of physics. "Another decade, a decade and a half, I think we'll hit something fairly fundamental," Mr Moore said at Intel's twice-yearly technology conference. Then Moore's Law will be no more.

Mr Moore was speaking as Intel gave its first demonstration of a new family of processors, to be introduced in November, which contain circuitry 45 nanometres (billionths of a metre) wide. The 'Penryn' processors, 15 of which will be introduced this year, with another 20 to follow in the first quarter of 2008, will be so advanced that a single chip will contain as many as 820 million transistors.

Computer experts said today that a failure to live up to Moore's Law would not limit the ultimate speed at which computers could run. Instead, the technology used to manufacture chips would shift. The current method of silicon-based manufacturing is known as "bulk CMOS", which is essentially a 'top-down' approach, where the maker starts with a piece of silicon and 'etches out' the parts that aren't needed. "The technology which will replace this is a bottom-up approach, where chips will be assembled using individual atoms or molecules, a type of nanotechnology," said Jim Tully, chief of research for semiconductors at the analyst firm Gartner. "It's not standardised yet - people are still experimenting - but you might refer to this new breed of chips as 'molecular devices'."

Anthony Finkelstein, head of computer science at University College London, said, however, that a more pressing problem in the meantime was to write programs that take full advantage of existing technologies. "It's all very well having multicore chips in desktop machines, but if the software does not take advantage of them, you gain no benefit. We are hitting the software barrier before we hit the physical barrier," he said.

Mr Moore, who is 78, pioneered the design of the integrated circuit and went on to co-found Intel in 1968, where he served as chief executive between 1975 and 1987.
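To make the doubling arithmetic concrete, here is a small back-of-the-envelope sketch. The only figure taken from the article is the roughly 820 million transistors cited for a 2007 'Penryn' chip; the two-year doubling period and the projection years are assumptions for illustration, not claims from the source.

```python
# Back-of-the-envelope Moore's Law projection (illustrative assumptions):
# start from the ~820 million transistors cited for a 2007 chip and
# double the count every two years.

def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count by doubling every `doubling_period` years."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

for year in (2007, 2011, 2015, 2020):
    count = projected_transistors(820e6, 2007, year)
    print(f"{year}: ~{count / 1e9:.1f} billion transistors")
```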




History


The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organized as a public utility." Almost all the modern-day characteristics of cloud computing (elastic provisioning, delivery as a utility, online access, the illusion of infinite supply), the comparison to the electricity industry, and the use of public, private, government and community forms were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility.


The term "cloud" itself borrows from telephony: telecommunications companies, which until the 1990s primarily offered dedicated point-to-point data circuits, began offering Virtual Private Network (VPN) services with comparable quality of service at a much lower cost. By switching traffic to balance utilization as they saw fit, they were able to use their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between what was the responsibility of the provider and what was the responsibility of the user. Cloud computing extends this boundary to cover servers as well as the network infrastructure. The first scholarly use of the term "cloud computing" was in a 1997 lecture by Ramnath Chellappa.


Amazon played a key role in the development of cloud computing by modernizing its data centers after the dot-com bubble; like most computer networks, they were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture produced significant internal efficiency gains, whereby small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon began a new product development effort to provide cloud computing to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006.


In 2007, Google, IBM and a number of universities embarked on a large-scale cloud computing research project. In early 2008, Eucalyptus became the first open-source, AWS-API-compatible platform for deploying private clouds. Around the same time, OpenNebula, enhanced in the European Commission-funded RESERVOIR project, became the first open-source software for deploying private and hybrid clouds and for the federation of clouds. By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them" and observed that "organisations are switching from company-owned hardware and software assets to per-use service-based models", so that the "projected shift to cloud computing ... will result in dramatic growth in IT products in some areas and significant reductions in other areas."