Marten Bütow
25 March 2019
Cloud

Colocation as sidekick for the Public Cloud

The IT manager has just announced an epoch-making new IT strategy that will catapult the company into the digital future. “Public Cloud only” made the faces of those present light up as if by magic. There’s a very real spirit of optimism. Then somebody in the back row puts up their hand: “What will we do with our Teradata?” The temperature in the room cools down rapidly.

Legacy systems and the Public Cloud strategy

Companies with Public-Cloud-First or Public-Cloud-Only strategies are quite common today. The cloud not only looks attractive but is in many cases a genuine alternative to traditional IT purchasing and delivery models, with tangible benefits – flexible costs, fast scalability, and relief from infrastructure management. You know all that. But when a company has been in the market for a few years (or even decades), it’s very likely that somewhere in a corner of its in-house data center an old, traditional (legacy) system is still happily churning out data – and usually for business-critical purposes, too. It’s one of those “better don’t touch” situations. I know what I’m talking about: there are a considerable number of such systems in our data centers, mostly in caged areas.

And what now? As early as 2011, Gartner put forward the 5-R strategy for the migration of applications to the cloud. A little later, Stephen Orban of AWS refined and expanded this approach to 6 Rs. He states that in addition to Rehost, Replatform, Repurchase, and Refactor/Rearchitect, the options Retire and Retain also exist – that is, switch off, or preserve in the existing state.

Switch off or retain?

For a business-critical system, the question of switching off or retaining is easily answered. In all seriousness, nobody wants to risk retiring their central data warehouse (though that would certainly be an interesting experience). So the Exadata, Teradata, or iSeries remains – if it is too expensive to rebuild or migrate it. The perfectly reasonable overall solution in the real world is: everything that can be migrated in a commercially and technically sensible way goes straight to the cloud. Everything else stays just where it is, for the time being.

The new space for legacy systems: in the cloud data center

However, if as a company you intend a radical and very fundamental switch toward the Public Cloud (and the in-house data center has possibly been leased out or sold), then it is high time to find a suitable place for the legacy systems. And that place is unlikely to be under the desk of the colleague whose question triggered the discussion in the first place.

Because the legacy systems frequently have to cooperate with the new cloud systems as part of business processes, the ideal solution is simply to put the legacy systems into the same data center from which the cloud is delivered. The reward: unbeatable latencies. And so the rather dated concept of colocation is given a new meaning – as a complementary building block in a Public Cloud strategy.
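The latency argument can be checked empirically. Here is a minimal sketch that compares median TCP connect times to two endpoints – the hostnames and port below are hypothetical placeholders standing in for a colocated legacy system and a remote in-house one, not real services:

```python
import socket
import statistics
import time

def tcp_connect_latency_ms(host: str, port: int, attempts: int = 5) -> float:
    """Return the median TCP connect time to host:port in milliseconds."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        # Opening and closing a connection approximates network round-trip cost.
        with socket.create_connection((host, port), timeout=2):
            pass
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples)

if __name__ == "__main__":
    # Hypothetical endpoints for illustration only.
    endpoints = [
        ("colocated legacy system", "teradata.colo.example.com", 1025),
        ("remote in-house system", "teradata.onprem.example.com", 1025),
    ]
    for label, host, port in endpoints:
        try:
            print(f"{label}: {tcp_connect_latency_ms(host, port):.1f} ms")
        except OSError as err:
            print(f"{label}: unreachable ({err})")
```

A system colocated in the cloud provider’s data center would typically show sub-millisecond connect times, while a link back to a distant in-house site adds the round-trip of every hop in between.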

In this way, the company moving into the cloud also gains a crucial cloud advantage for its legacy systems: employees are relieved of the relatively unprofitable task of infrastructure management, and by moving into a Tier 3 data center the company benefits from the likely higher IT management and security standards of the cloud provider (this level, at least, should be expected from a Public Cloud provider).

Getting rid of outdated methods

True – a complete cloud migration would be more spectacular. But with a combined Public Cloud/colocation approach you get rid of at least one outdated practice: the provision of electricity, air conditioning, access security, and redundancy is no longer a parameter that differentiates you from competitors. So why waste your own capacity on it?

Public Cloud plus colocation may not be a glamorous concept, but it does provide tangible added value, a holistic perspective on the complete IT inventory and – let’s not forget – simplicity. It’s not rocket science, but it’s pragmatic. Worth a thought, right?

 
