FOSE definitely saved one of the most interesting sessions for last – I sat in on a panel discussion today on cloud computing, which provided some interesting industry and agency insight on clouds and how they should be used within the federal government.
“Transforming Government Technology with Cloud Computing” was moderated by Dan Mintz, former CIO for the US Department of Transportation, and featured three panelists with a great deal of cloud computing and general IT experience between them:
– Linda Cureton, CIO, Goddard Space Flight Center and Director of the Information Technology and Communications Directorate, NASA
– Tim May, Senior Vice President, Corporate Development, Apptis
– David Rubal, Regional Manager, Federal Unified Communications, Cisco

Dan Mintz started off by tying the high attendance (it was one of the better-attended sessions I went to at FOSE this year) back to Federal CIO Vivek Kundra’s technology pillars, particularly “Finding the Innovative Path.”
He laid out the goals of the panel, which were to discuss what cloud computing is/isn’t and how to implement it in the federal government – he finished off his introduction by saying that cloud computing is another example of “what’s old is new,” a theme that would be repeated throughout the session.
First up was Linda Cureton, and she echoed Dan’s comment about cloud computing being an old technology – referred to in its first iteration as “time sharing.” Linda then provided her explanation of cloud computing, with the caveat that the platform is still being shaped – she defined it as services delivered via IP and associated with implied or explicit service levels.
She brought up that cloud offerings take many shapes, including virtualization, software as a service (SaaS), infrastructure and even Web 2.0. She looks at the cloud as simply the delivery of “old services” over the Internet, which at scale amounts to nothing more than a big management problem. Linda also stated that with the consumerization of IT, anyone can be a data center manager – all they do is get their services over the Internet, with no infrastructure required.
Linda went on to state that federal CIOs should embrace the cloud as an opportunity to advise agency heads on how new technologies can help federal workers be more productive and provide better services and information to citizens. She listed some ways that NASA has embraced new technology, not necessarily the cloud, through Facebook, YouTube and iTunes to encourage conversation and transparency.

Apptis’ Tim May spoke next, and said flat out that at this time last year, the cloud wasn’t even in consideration for government IT. He defines the cloud as, quite simply, IT assets not owned by the user, with a pay-per-use or metered approach and on-demand availability, scalability and elasticity.
While the framework for the current cloud platform has been around for 30 years, it’s nothing like those older technologies – timesharing, facilities management and ASPs being three specific examples used. He also broke cloud offerings down from “XaaS,” as in “whatever-as-a-service,” to:
– Infrastructure-as-a-Service (IaaS) – examples are Amazon EC2, GoGrid, Mosso;
– Platform-as-a-Service (PaaS) – examples are Google App Engine, Force.com and Microsoft Azure; and
– Software-as-a-Service (SaaS) – examples are Google Search, Google Apps and Yahoo Maps
Tim also provided an overview of what public and private mean with respect to clouds – public clouds are open for general use, whether it’s for businesses, agencies or individuals, while private clouds are exclusively used within the boundaries of an enterprise. He then posed the question “Is a private cloud actually cloud computing?”
I’d say by his own definition, private clouds can absolutely be cloud computing. Let’s say that private clouds are based on shared resources within an enterprise. Those resources/assets are not “owned by the user” – some line of business, for example – merely “rented out” for some period of time with an associated service level.
He also brought up the concept of a hybrid cloud – where organizations use internal IT infrastructure only until there is a surge in usage of an application. When a surge occurs, the application would extend into the cloud, leveraging the additional processing power only while it’s needed.
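The burst-then-retract logic Tim described can be sketched in a few lines of Python. This is purely illustrative – the capacity figure and function names are my own hypothetical stand-ins, not anything presented on the panel:

```python
# Illustrative hybrid-cloud "bursting" decision: serve requests from
# in-house infrastructure until demand exceeds its capacity, then
# spill only the overflow to a public cloud while the surge lasts.

LOCAL_CAPACITY = 100  # requests/sec the in-house servers can absorb (hypothetical)

def route(request_rate):
    """Return (local_share, cloud_share) for the current load."""
    local = min(request_rate, LOCAL_CAPACITY)
    cloud = max(request_rate - LOCAL_CAPACITY, 0)
    return local, cloud

# Normal load stays entirely in-house...
assert route(80) == (80, 0)
# ...while a surge spills only the excess into the cloud,
# and the cloud share drops back to zero once the surge ends.
assert route(250) == (100, 150)
```

The point of the pattern is the last line: the extra processing power is paid for only during the surge, which is what makes the hybrid model attractive for spiky workloads.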
This segued into Apptis’ own experience with the cloud, where they had an in-house financial application that took over 15 hours to run – and needed to be run twice a month. They decided to move the app into the cloud – and it promptly took 22 hours to run the first time. Tim then gave his first lesson in cloud computing: “It’s not easy to take a non-cloud application onto the cloud.” The application now runs in 5 hours and costs Apptis $30, albeit almost a full year after they started the project.
The final official speaker was David Rubal from Cisco. David varied slightly from Linda and Tim, as he said that if you use a service provider, you ARE using the cloud already. He also explained that governments in Japan, Korea and China have already embraced the cloud and have turned it into a range of citizen services – but the big difference is that broadband is ubiquitous in Asia and is commoditized much like electricity is in the US.
Agencies need to ask themselves what they are really trying to solve with the cloud. The answer, according to Dave, is information-sharing: intra-agency, inter-agency and to citizens/private partners. This constant need will help drive new technology adoption in the federal government, especially the cloud. Going back to yesterday’s IPv6 session, David also brought up the fact that IPv6 is integral to cloud computing – “EOIP” or “everything over IP,” as he phrased it.
The new protocol is key to vast, expansive networks, which is what cloud computing demands and provides, so IPv6 adoption is vital. Dan then opened the floor up for questions, with the first asking why hybrid cloud models cannot be used today. David believes that they can and should be used today – but only for certain, non-secure activities in the government. Linda seconded David’s answer and said that cloud computing should be used right now only where appropriate and does not need to be an agency-wide practice yet.
The next question asked why we should make the switch to this new technology when cloud services obviously aren’t up to security standards yet. Linda immediately answered, saying that security is somewhat of a fallacy – if hackers want to get into data that is stored online, they will. Agencies need to start making the switch and secure as they go. Dan took it a step further and asked who would have a more secure network, Google or the Department of Transportation.
Tim took the last question, which was to provide some lessons for taking an application into the cloud and how concerned someone should be about moving from cloud to cloud or out of the cloud. His first lesson was to take the application out of the cloud when you’re not using it, as most services will still charge you. Tim’s second lesson was to keep the database and the application in the cloud together to beat the issue of latency. And his third was to make sure that you have tools to monitor the cloud – although he acknowledged that the gap between monitoring internal IT assets and cloud resources is fairly large right now. Gee, wouldn’t it be nice if someone developed an IT management appliance for the cloud and for internal systems?
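Tim’s first and third lessons boil down to watching your metered spend. A toy sketch of that idea in Python – every name and number here is hypothetical, not any real provider’s API:

```python
# Toy pay-per-use watchdog: compare metered usage against a budget
# and flag overruns -- the kind of monitoring gap Tim described.

def check_budget(metered_hours, rate_per_hour, monthly_budget):
    """Return (spend, over_budget) for a metered cloud resource."""
    spend = metered_hours * rate_per_hour
    return spend, spend > monthly_budget

# An idle app left running in the cloud quietly blows the budget.
spend, over = check_budget(metered_hours=120, rate_per_hour=0.10,
                           monthly_budget=10.0)
assert round(spend, 2) == 12.0 and over
```

Trivial as it is, this is the check that tells you it’s time to apply lesson one and pull the application out of the cloud until the next run.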