History of Cloud Computing

Cloud computing is a computing-infrastructure and software model for enabling ubiquitous access to shared pools of configurable resources (such as computer networks, servers, storage, applications and services) that can be rapidly provisioned with minimal management effort, often over the Internet. Cloud computing allows users and enterprises with various computing capabilities to store and process data either in a privately owned cloud or on a third-party server located in a data center, making data access more efficient and reliable. Cloud computing relies on the sharing of resources to achieve coherence and economies of scale, similar to a utility.



The idea of an "intergalactic computer network" was introduced in the sixties by J.C.R. Licklider, who was responsible for enabling the development of ARPANET (Advanced Research Projects Agency Network) in 1969.

His vision was for everyone on the globe to be interconnected and accessing programs and data at any site, from anywhere, explained Margaret Lewis, product marketing director at AMD. "It is a vision that sounds a lot like what we are calling cloud computing."

Other experts attribute the cloud concept to computer scientist John McCarthy, who proposed the idea of computation being delivered as a public utility, similar to the service bureaus, which date back to the sixties.

Since the sixties, cloud computing has developed along a number of lines, with Web 2.0 being the most recent evolution. However, since the internet only started to offer significant bandwidth in the nineties, cloud computing for the masses has been something of a late developer.

One of the first milestones in cloud computing history was the arrival of Salesforce.com in 1999, which pioneered the concept of delivering enterprise applications via a simple website. The services firm paved the way for both specialist and mainstream software firms to deliver applications over the internet.

The next development was Amazon Web Services in 2002, which provided a suite of cloud-based services including storage, computation and even human intelligence through the Amazon Mechanical Turk.

Then in 2006, Amazon launched its Elastic Compute Cloud (EC2) as a commercial web service that allowed small companies and individuals to rent computers on which to run their own applications.

"Amazon EC2/S3 was the first widely accessible cloud computing infrastructure service," said Jeremy Allaire, CEO of Brightcove, which provides its SaaS online video platform to UK TV stations and newspapers.

Another big milestone came in 2009, as Web 2.0 hit its stride, and Google and others started to offer browser-based enterprise applications, through services such as Google Apps.

"The most important contribution to cloud computing has been the emergence of "killer apps" from leading technology giants such as Microsoft and Google. When these companies deliver services in a way that is reliable and easy to consume, the knock-on effect to the industry as a whole is a wider general acceptance of online services," said Dan Germain, chief technology officer at IT service provider Cobweb Solutions.

Other key factors that have enabled cloud computing to evolve include the maturing of virtualisation technology, the development of universal high-speed bandwidth, and universal software interoperability standards, said UK cloud computing pioneer Jamie Turner.

Turner added, "As cloud computing extends its reach beyond a handful of early-adopter Google Docs users, we can only begin to imagine its scope and reach. Pretty much anything can be delivered from the cloud."

ORIGIN OF THE TERM CLOUD COMPUTING
The origin of the term cloud computing is unclear. The expression cloud is commonly used in science to describe a large agglomeration of objects that visually appear from a distance as a cloud, and it describes any set of things whose details are not inspected further in a given context. Another explanation is that old programs for drawing network schematics surrounded the icons for servers with a circle, and a cluster of servers in a network diagram had several overlapping circles, which resembled a cloud.

By analogy with this usage, the word cloud was adopted as a metaphor for the Internet: a standardized cloud-like shape was used to denote a network on telephony schematics and later to depict the Internet in computer network diagrams. With this simplification, the implication is that the specifics of how the endpoints of a network are connected are not relevant to understanding the diagram. The cloud symbol was used to represent networks of computing equipment in the original ARPANET as early as 1977, and in CSNET by 1981, both predecessors to the Internet itself.

References to cloud computing in its modern sense appeared as early as 1996, with the earliest known mention in a Compaq internal document.

The popularization of the term can be traced to 2006 when Amazon.com introduced the Elastic Compute Cloud.

THE DEVELOPMENT OF CLOUD COMPUTING 
During the mid-1970s, time-sharing was popularly known as RJE (Remote Job Entry); this terminology was mostly associated with large vendors such as IBM and DEC. IBM developed the VM Operating System (first released in 1972) to provide time-sharing services via virtual machines.

In the 1990s, telecommunications companies, who previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service, but at a lower cost. By switching traffic as they saw fit to balance server use, they could use overall network bandwidth more effectively. They began to use the cloud symbol to denote the demarcation point between what the provider was responsible for and what users were responsible for. Cloud computing extends this boundary to cover all servers as well as the network infrastructure.

As computers became more prevalent, scientists and technologists explored ways to make large-scale computing power available to more users through time-sharing. They experimented with algorithms to optimize the infrastructure, platform, and applications to prioritize CPUs and increase efficiency for end users.

Cloud computing in its modern form has existed since the early 2000s. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds. In the same year, efforts were focused on providing quality-of-service guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project, resulting in a real-time cloud environment.

By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them" and observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to computing... will result in dramatic growth in IT products in some areas and significant reductions in other areas." Microsoft announced Azure in late 2008; the platform was released as Windows Azure in early 2010.

In July 2010, Rackspace Hosting and NASA jointly launched an open-source cloud-software initiative known as OpenStack. The OpenStack project intended to help organizations offer cloud-computing services running on standard hardware. The early code came from NASA's Nebula platform as well as from Rackspace's Cloud Files platform.

On March 1, 2011, IBM announced the IBM SmartCloud framework to support Smarter Planet. Cloud computing is a critical piece among the various components of the Smarter Computing foundation.

On June 7, 2012, Oracle announced the Oracle Cloud. While aspects of the Oracle Cloud were still in development at the time, the offering was poised to be the first to provide users with access to an integrated set of IT solutions, including the Applications (SaaS), Platform (PaaS), and Infrastructure (IaaS) layers.
