The Cloud Horizon

In an opinion article for Information Age this week, I talked about the paradigm shift that the industry is currently going through.

Whilst many commentators believe that the battles being fought right now will define the entirety of that future, there are also those in computer science who think that we’ve only just started to break ground on the new frontiers of computing. If the latter holds true, then as at so many points in the history of computing, the giants of today may well become the footnotes of the future as more and more disruptive technologies emerge in the coming years. Putting aside for a moment that more radical viewpoint, there are a number of clearly emerging trends in the on-demand space which we think will be substantially influential in the near to mid term.

A race to the bottom

The cloud industry as of 2014 is dominated by Amazon and Google, with Microsoft and IBM investing heavily to try to keep up. At this end of the market the big players are engaged in a race to the bottom over pricing, and that is a race very few can run to the end. Most of the smaller players trying to compete in this space will end up being swallowed, and only those with the deepest pockets will be left standing at the end. Amazon and Google have vast revenue streams which they can leverage, whilst Microsoft are betting everything on this last throw of the dice as they see their historical core revenue streams being slowly eroded. For IBM too, this has a smell of desperation as their hardware business implodes and the high-end consultancy market they rely on becomes increasingly fragmented. Most of the ‘non-native’ cloud offerings (the dominant model in the telco and ISP industry: essentially hosted virtualisation platforms built around tier 1 vendor hardware and VMware) are also ultimately doomed, because the economics of massive scalability are inherently opposed to the financial requirements of proprietary licensing.

Although many commentators see this competition and the inevitable consolidation of the market as the endgame for the cloud industry, I don’t believe that being the biggest is necessarily all there is to the still-emerging world of on-demand computing; this is just the beginning of the story for the cloud revolution.

It’s not all about price

For many organisations, the question of price is far from the most important issue in their transition to on-demand computing. As we engage with customers at Data News Blog, the conversations we’re having time and time again are about emerging operational and organisational problems. If you know you need 1,000 VMs to solve all your problems, then you’re already well served by Amazon or Google, but that is rarely the actual solution to any real-world problem.

The kinds of problems we hear about from customers involve crunching complex data sets, storing and accessing huge volumes of data that are growing exponentially year on year, and coping with workloads whose nature and size are uncertain into the future.

Different types of cloud required

Based on this, we see an emerging market for a new type of provider in the cloud computing space: companies that are much more deeply engaged with their customers in particular vertical markets, and that share an understanding of the business problems those customers are facing.

Clouds designed for the broadcast media sector will have very different characteristics to those designed for academic research, and both will differ from the requirements of local and national government. These more specialised clouds will offer targeted software configurations, designed for the particular vertical market, and may also have very specific hardware characteristics. This is already starting with the deployment of GPU-based hardware, and the trend will continue into ARM-based platforms and more specialised hardware such as FPGAs.

The specialisation will also extend to the network layer, with different requirements for inter-connectivity and routing, and for latency and throughput. As data volumes continue to grow exponentially, being close to the storage will be a key requirement. In the case of the as yet unclear demands of the internet of things, having computation available close to the data creators, to the aggregation points, or to the repository into which all the data flows will also matter. This naturally leads to a requirement for highly localised and regional cloud providers, a trend which will also be driven by general concerns about data security. In many situations there are strong reasons why using multi-national cloud providers is simply not an option on security or regulatory grounds, and this trend is set to continue with the ongoing emergence of information on surveillance programs and the disintegration of the Safe Harbour agreement.

More and more applications will become naturalised to the distributed environment, becoming massively parallelised through the use of eventual consistency and the ability to work around partial failure states. This will lead to cloud brokerage emerging as the standard abstraction layer, with workloads automatically and dynamically allocated across many different physical cloud platforms depending on customer-definable characteristics. Cost will undoubtedly be one of these, but performance will also be key, and is very dependent on the type of workload. Federation like this depends on interoperability, and open standards like those around OpenStack will be the key to participating in these emerging markets.
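As a rough illustration of how such a brokerage layer might work, the minimal sketch below allocates a workload across candidate providers by first filtering on hard constraints (region, latency) and then minimising a customer-definable blend of cost and performance. All provider names, prices and latencies here are hypothetical assumptions for the purposes of illustration, not real market data or any actual broker’s API.

```python
from dataclasses import dataclass

@dataclass
class CloudProvider:
    # Hypothetical provider characteristics, purely illustrative.
    name: str
    price_per_vm_hour: float   # cost characteristic
    p99_latency_ms: float      # performance characteristic
    region: str                # locality/regulatory characteristic

@dataclass
class Workload:
    name: str
    vm_hours: int
    max_latency_ms: float      # hard, customer-definable constraint
    allowed_regions: set       # e.g. a data-sovereignty requirement
    cost_weight: float         # 0.0 = performance only, 1.0 = cost only

def score(provider: CloudProvider, workload: Workload) -> float:
    """Lower is better: a weighted blend of total cost and latency."""
    return (workload.cost_weight * provider.price_per_vm_hour * workload.vm_hours
            + (1 - workload.cost_weight) * provider.p99_latency_ms)

def broker(workload: Workload, providers: list) -> CloudProvider:
    """Filter on hard constraints, then pick the best-scoring provider."""
    eligible = [p for p in providers
                if p.region in workload.allowed_regions
                and p.p99_latency_ms <= workload.max_latency_ms]
    if not eligible:
        raise ValueError("no provider satisfies the workload constraints")
    return min(eligible, key=lambda p: score(p, workload))

providers = [
    CloudProvider("hyperscaler-us", 0.05, 40.0, "US"),
    CloudProvider("regional-uk", 0.08, 8.0, "UK"),
    CloudProvider("vertical-media-uk", 0.12, 5.0, "UK"),
]

# A latency-sensitive, UK-only workload lands on a regional provider,
# even though the hyperscaler is cheaper per VM-hour.
job = Workload("render-farm", vm_hours=500, max_latency_ms=20.0,
               allowed_regions={"UK"}, cost_weight=0.3)
print(broker(job, providers).name)  # -> regional-uk
```

The interesting design point is that the customer expresses constraints and weightings rather than choosing a provider directly; the broker is free to re-run the allocation dynamically as prices and workload characteristics change.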

A new type of relationship

When we first started thinking about the ideas behind Data News Blog, the intention was never to compete directly with the mass market players in the retail cloud space. Instead, we’ve always believed that there are emerging opportunities for a new kind of service provider and new kinds of collaborative relationships with customers. These kinds of relationships cross the traditional boundaries of service provision and consulting, and are based on mutual trust and an ambition to push the boundaries of both the traditional customer/supplier relationship and the technology itself in order to deliver solutions to complex problems.

To operate in these new spaces will require a very particular set of tools, people and approaches, and we think that the envelope of colocation provides the basic building blocks to do this. Colocation has always been about sharing, at its lowest level the sharing of space and power, but carrier-neutral colocation data centres are also by their nature highly connected hot spots in the fabric of the internet, and so they are the ideal jumping-off point for the direction that we’re taking with Data News Blog. Whilst we certainly don’t know all the answers to the wide variety of technology-related problem spaces our customers talk to us about, our approach has been to assemble the tools we think will be needed to tackle these emerging challenges. The combination of multi-site, geographically specific colocation spaces with our own high-bandwidth, low-latency Metropolitan Area Networks, connected into the full range of telco networks and paired with a massively scalable on-demand storage and compute platform and a team of highly skilled engineers, seems as good a place to start as any.
