Twenty years ago, the work of Tim Berners-Lee gave birth to the WWW and
ushered in the internet revolution. Now, in 2009, we stand on the threshold
of another major disruption in computing, as computing begins to move to a
worldwide network of clouds, the Intercloud.
At Cloudscale we're preparing for the public launch, later in the year, of
the first Intercloud service, a platform for the world's realtime apps. In
response to a number of requests for information, here is a brief overview of
what we have been developing.
The Cloudscale platform can handle all types of realtime data, and is as easy
to use as a spreadsheet. Complex analytics can be run continuously, with
automatic scaling and fault tolerance to ensure realtime responsiveness. The
platform offers users seamless integration from standard desktop or mobile
clients to realtime intercloud apps.
As a self-... (more)
Cloudcel on Ulitzer
Back in 1985, the world was pre-web, data volumes were small, and no one was
grappling with information overload. Relational databases and the shiny new
SQL query language were just about perfect for this era. At work, 100% of the
data required by employees was internal business data, the data was highly
structured, and was organized in simple tables. Users would pull data from
the database when they realized they needed it.
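The pull model described above can be sketched with a few lines of Python and SQLite. The table and column names here are purely illustrative, not from any real system:

```python
import sqlite3

# Hypothetical example: a small relational table of orders, queried on demand.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "EMEA", 120.0), (2, "APAC", 75.5), (3, "EMEA", 200.0)],
)

# The user "pulls" an answer from the database only when they need it.
total_by_region = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(total_by_region)  # [('APAC', 75.5), ('EMEA', 320.0)]
```

For small volumes of highly structured internal data, this query-on-demand style was, as the paragraph notes, just about perfect.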
Fast forward to 2010. Today, everyone is grappling constantly with
information overload, both in their work and in their social life. Most ... (more)
Virtualization Track at Cloud Expo
SQL was the first-generation Big Data tool, and MapReduce/Hadoop was the
second-generation tool. Unfortunately, neither of these tools has the
characteristics required to break into the mainstream of data analytics,
where there are now over 100 million business professionals (non-programmers)
grappling with exponentially growing data volumes that they simply can't
handle. However, a new third generation of Big Data tools is now emerging that
offers the scalability, parallelism, performance and data flexibility of tools
like Hadoop, but, unlik... (more)
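To make the second-generation model concrete, here is a minimal sketch of the MapReduce pattern behind Hadoop, in plain Python rather than on a real cluster:

```python
from collections import defaultdict
from itertools import chain

# Minimal sketch of the MapReduce model: map emits (key, value) pairs,
# a shuffle groups them by key, and reduce combines each group.

def map_phase(document):
    # Emit one ("word", 1) pair per word in the document.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group all values by key, as the framework does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Combine each key's values into a single result.
    return {key: sum(values) for key, values in grouped.items()}

documents = ["big data big tools", "data everywhere"]
pairs = chain.from_iterable(map_phase(doc) for doc in documents)
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 2, 'data': 2, 'tools': 1, 'everywhere': 1}
```

The strength of the model is that the map and reduce steps parallelize across machines; its weakness, for the business professionals discussed above, is that every analysis still has to be written as a program.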
Data is growing exponentially everywhere - in business, web, finance,
government, science, and in the world of sensors and smart grids.
Speaking earlier this week at OSBC, Tim O'Reilly said, "The future will be
all about who has most data, and who is able to extract meaning from it and
deliver it in real time". He noted that the IT industry is now in the
process of being reinvented around the idea of realtime analysis of "Big
Data" in the cloud, as a must-have adjunct to the much more limited kinds of
data processing and analytics that can be performed on desktop PCs or mobile
devices.
Two weeks ago I wrote about "The Need for Speed" in cloud computing, and
asked "Who is going to build the low-latency cloud for enterprise
customers?". Today Werner Vogels and his team at Amazon announced their
Cluster Compute Instances offering.
This is an important step toward the kind of realtime, high-performance
cloud that customers such as Cloudscale require to deliver the
next generation of cloud services. In our case, it means we now have three
distinct alternatives for deployment of our massively parallel realtime data
warehouse architecture: standard public... (more)