As a large, global enterprise with multiple business units, General Electric (GE) is at the forefront of the digital transformation taking place across all industries. Cloud computing is helping enterprises like GE ease this transition while optimizing their infrastructure at the same time.
We talked to GE Executive Thomas Martin, the Application Transformation Leader for the corporation and part of GE Digital, about the company’s current transition to the cloud. Here’s what he had to say:
Ryan Schradin (RS): Why is GE moving to the cloud? What drivers are making the cloud essential for the company now and into the future?
Thomas Martin (TM): By developing software operations in each of our core business lines, we are on track to generate $15B of our revenue from software by 2020, making us one of the top ten software companies in the world.
Many of the analytical models that drive customer outcomes rely on the physics-based engineering designs and product-related data that come from within our enterprise IT systems. To optimize access to this information, we are rapidly working through decades of application portfolio bloat and complexity. At the start of this journey we had over 9,000 applications, more than 300 ERPs running the business, and countless physical datacenters.
To move forward, we needed to simultaneously simplify the application portfolio by eliminating and consolidating many legacy systems, while moving away from bespoke, stand-alone applications. The resulting integrated ecosystem enables the data to remain connected across our Digital Thread of business processes.
This transformation has also required us to rethink what differentiates us in the marketplace. We are not going to help grow our business units through our ability to rack and provision physical infrastructure. That's why we're utilizing cloud service providers for our basic compute, storage, and networking infrastructure. Self-service capacity offerings, delivered as software, provide a whole new level of experience for our developers, so we're partnering with cloud providers whose entire business is to enable those capabilities.
RS: How has the move to the cloud benefited GE? Are there examples of cost savings or improved efficiency as a result of cloud implementations?
TM: As of today, we've moved more than 2,000 workloads into the public cloud and have eliminated over 2,500 applications as part of our transformation efforts. Conservatively, we are seeing an average cost reduction of 45% compared to our traditional in-house hosting solutions.
We continue to decrease the number of applications we're running in our datacenters, with a goal of moving a total of 9,000 workloads to the public cloud and reducing the number of datacenters we operate to four. These remaining internal centers will contain only our most sensitive data, with everything else going into the cloud.
One of the biggest impacts has been cultural. We've moved more responsibility, access, and capability down to the application teams, giving them control of their own infrastructure. We are also moving security closer to the development teams, making sure all the controls are in place. As a result, application deployments have gone from days to minutes, with a significant reduction in manual touch points.
RS: Are there any case studies within GE that illustrate the power of the cloud?
TM: GE’s Oil & Gas business has been a fast adopter of cloud and to date has migrated over 300 applications. One of these applications is a configurator that the sales team uses at customer sites to sell products.
We used to spend $62,000 annually to run this application in our physical datacenter, and it brought in about $600,000 in orders. Changes to the application took approximately 20 days to complete through traditional release cycles. After transforming the application to a modern architectural design and implementing continuous integration/continuous delivery for code deployment, it now costs $6,000 to run in the cloud, and code updates are pushed in two minutes. We also improved the performance, availability, and mobility of the application, helping drive additional sales.
RS: You’ve said previously that GE is looking to embrace a hybrid approach to the cloud by utilizing services from multiple cloud providers. Why is this a necessary step for GE today? Does this create any challenges?
TM: There are multiple factors, but the first and largest is that each provider brings its own nuances. Each has aspects that are differentiators in and of themselves.
For example, for a lot of our codebase that can run in AWS Lambda, we are now starting to build serverless applications. We see Microsoft as having an advantage in the .NET space to do similar things as it matures its offerings.
For our pure Infrastructure as a Service (IaaS) deployments, we want the ability to move resources fluidly across providers. One of the things we're experimenting with as we ramp up in Azure is hybrid architectures, with some of an application's assets in AWS and some in Azure.
We’re also working with Oracle’s Cloud Platform. There we are looking to run our ERPs (Enterprise Resource Planning systems) in the cloud. Foundational systems like ERPs, when coupled with predictive analytics, drive our Digital Thread. As systems of record, these ERPs are part of the larger ecosystem of applications within the enterprise.
One of the challenges we’re going to face as we deploy an ERP into Oracle’s Cloud Platform is managing that overall ecosystem. We need to ensure that our other applications, hosted in AWS and Azure, all tie together and can continue to operate. We have to ensure a cohesive experience across the providers.
One of the ways we’re going to be able to do that, and manage it, is through Event Driven Automation: the GE Bot Army. That’s the only way we’re going to be able to tackle this level of cross-cloud complexity. I don’t think it’s feasible without them.
RS: What are these Bots? And how do they work?
TM: We’ve evolved our concept of the Bot Army quite a bit. Originally, Bots were single purpose, simplistic scripts that automated a specific function in a specific cloud. Today, we have bots running around the network to enforce policy.
Not only do they identify non-compliance (regarding cost, security, or best practices), they also take automated action to bring our cloud resources back into compliance. We’ve employed a cloud-agnostic platform from DivvyCloud to build Bots in a unified way, which lets us apply consistent policy automation across different cloud deployments (AWS, Azure, and internal VMware, for example).
As these Bots take action, it is essential that we have the right user groups and security in place to prevent unintentional impact and malicious threats.
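The pattern Martin describes, detecting a policy violation and then taking automated action to remediate it, can be sketched in a provider-agnostic way. The sketch below is purely illustrative: the `Resource`, `Policy`, and `enforce` names are hypothetical and do not represent GE's or DivvyCloud's actual APIs.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

# Hypothetical, provider-agnostic view of a cloud resource.
@dataclass
class Resource:
    resource_id: str
    provider: str          # e.g. "aws", "azure", "vmware"
    tags: dict = field(default_factory=dict)
    encrypted: bool = True

# A policy pairs a compliance check with an automated remediation.
@dataclass
class Policy:
    name: str
    check: Callable[[Resource], bool]      # True means compliant
    remediate: Callable[[Resource], None]  # bring resource back into compliance

def enforce(policies: List[Policy], inventory: List[Resource]) -> List[Tuple[str, str]]:
    """Run every policy against every resource and remediate violations.

    Returns (policy name, resource id) pairs that were remediated, which a
    real bot would feed into an audit log for the security teams mentioned above.
    """
    actions = []
    for resource in inventory:
        for policy in policies:
            if not policy.check(resource):
                policy.remediate(resource)
                actions.append((policy.name, resource.resource_id))
    return actions

# Example policy: storage must be encrypted, regardless of provider.
encryption_policy = Policy(
    name="require-encryption",
    check=lambda r: r.encrypted,
    remediate=lambda r: setattr(r, "encrypted", True),
)

inventory = [
    Resource("vol-1", "aws", encrypted=False),
    Resource("disk-2", "azure", encrypted=True),
]
remediated = enforce([encryption_policy], inventory)
print(remediated)  # [('require-encryption', 'vol-1')]
```

Because the check and the fix travel together as one policy object, the same Bot logic can run against any provider's inventory, which is the essence of the cloud-agnostic approach described above.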
RS: What are the benefits you’ve experienced from the deployment of these Bots?
TM: Compliance is a big one that speaks for itself. Another big one is the optimization of capacity by making sure servers are sized correctly and data is stored only as long as needed.
Being able to sense variations globally and proactively diagnose issues is essential, so that we can automatically shift the network or load without the end user ever noticing.
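The capacity-optimization benefit Martin mentions, making sure servers are sized correctly, usually amounts to comparing observed utilization against an instance's capacity and recommending a step up or down. A minimal sketch of that idea follows; the size ladder and the 20%/80% thresholds are made-up illustrations, not GE's actual logic.

```python
# Hypothetical size ladder; real cloud providers offer many more instance types.
SIZES = ["small", "medium", "large", "xlarge"]

def rightsize(current: str, avg_cpu_pct: float) -> str:
    """Recommend an instance size from average CPU utilization.

    Thresholds are illustrative: under 20% average CPU suggests the next
    size down, over 80% suggests the next size up, otherwise keep as-is.
    """
    i = SIZES.index(current)
    if avg_cpu_pct < 20 and i > 0:
        return SIZES[i - 1]
    if avg_cpu_pct > 80 and i < len(SIZES) - 1:
        return SIZES[i + 1]
    return current

print(rightsize("large", 12.0))   # "medium" (underutilized)
print(rightsize("medium", 95.0))  # "large"  (overloaded)
print(rightsize("small", 50.0))   # "small"  (correctly sized)
```

A Bot running this kind of rule across the fleet, paired with retention rules that delete data once it is no longer needed, is one plausible mechanism behind the capacity savings described here.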
RS: GE’s cloud initiatives seem to be paying dividends, and the bots that you’re implementing appear to be effective in helping to manage your hybrid cloud environment. What’s next for the company as it relates to the cloud?
TM: This is just the tip of a big iceberg. We are driving a better user experience, and productivity is a big piece of that. As we continue to innovate in the area of the Industrial Internet of Things (IoT) and build the factory of the future, there will be continuous integration and deployment of cloud applications. We will continue working on innovative ways to optimize our infrastructure accordingly.