The Cloud: Is NASA's Data lost in space?
Written by Campbell Williams, group strategy and marketing director, Six Degrees Group
NASA’s own auditor recently rated its cloud computing deployments very poorly, in a report that raises some interesting questions about the use of the cloud at the space agency. I’d encourage you to read the NASA report itself if you have time, as it’s genuinely interesting; it can be found at http://oig.nasa.gov/audits/reports/FY13/IG-13-021.pdf
I won’t repeat the content of the article and report, but will summarise: of the five cloud provider contracts NASA has in place, none addresses the business and IT security risks of public cloud and none meets “best practices for data security”; moreover, much of the information was moved onto the public cloud by various parts of NASA without the knowledge or consent of the CIO’s office. This throws up a few points.
A bit of history – NASA and Cloud
NASA’s history with cloud computing is interesting. Through their Nebula private cloud project (see the report for more information), they developed significant expertise in building large scale-out compute environments. In 2010, they partnered with Rackspace to develop OpenStack, an open source software stack for building clouds (a de facto competitor to the likes of VMware and Microsoft in the proprietary space and Xen, KVM and CloudStack in the open source space).
This was a logical move for Rackspace, leveraging their storage expertise. It is a less obvious play for NASA (there are no clouds in space) and one can only assume that Rackspace has no plans for building rockets. This history is useful mainly to make the point that NASA is no Johnny-come-lately in cloud; rather, they are one of the pioneers. So they really ought to know better.
Single v multi-tenant or public v private
It would be all too easy for me to make this about public cloud versus private cloud. But NASA’s own research makes it clear that private cloud was the more expensive option. As we’ve argued many times, we prefer the distinction of single-tenant versus multi-tenant. If a customer builds their own cloud, the hardware is obsolete immediately whereas a multi-tenant provider is required, by market forces, to maintain bang up-to-date hardware and software specs and to refresh their infrastructure to remain competitive.
However, we would strongly argue that for deployments such as this, a multi-tenanted virtual private cloud, with customised contracts and bespoke SLAs, would have been a far better fit than off-the-shelf, one-size-fits-all, public cloud technology.
Does NIST mean “Not if strategic technology”?
The rigid, by-committee NIST definition is a personal bugbear. As we have stated many times, users serving themselves IS NOT NECESSARILY A GOOD THING and it certainly shouldn’t be a mandated part of a cloud service. The NASA experience demonstrates this better than I possibly could. The US government itself, through NIST, has encouraged exactly the sort of behaviour that the NASA auditor slams – namely departments within NASA spinning up VMs and shifting sensitive data onto them with no thought to governance, compliance, the law, intellectual property protection, anything.
The most damning part of the NASA audit was, for me, the table that outlined the contractual status of the five cloud deals (presumably from five different providers) they had in place. NONE had defined roles and responsibilities. NONE had service level reporting metrics. NONE had data retention and destruction policies. NONE had data privacy requirements. You really must read the report – it’s a brilliant “what not to do guide”.
By NASA’s own admission, there are only two types of cloud contract: negotiated (like all managed hosting providers offer on a fixed term) and predefined, non-negotiable contracts. In their own words:
“Under a predefined contract, the contract terms are prescribed by the cloud provider. As such, these contracts typically do not impose requirements on the provider beyond meeting a base level of service and availability. Nor do they address Federal IT security, privacy, data production, or retention and destruction requirements. Furthermore, the provider is often empowered to modify the contract unilaterally without notifying the customer.”
By definition, ALL self-service public clouds fall into this category. After all, the service provider isn’t going to allow you to write your own contract terms and SLA, so it’s invariably lowest common denominator. Yet it’s these same “users must serve themselves or it’s not real cloud” environments that our governments seem so enamoured with, at the cost of data protection, sovereignty, security and common sense.
Reducing cost is usually a bad driver in isolation
Another personal bête noire is the obsession with reducing costs as an absolute motivator. Our own government has delivered terrible results with this aim in mind and is guilty of encouraging the same procurement practices in G-Cloud. It’s inevitable that at the scale NASA is talking about, a multi-tenant cloud would be more cost effective than building and maintaining their own. But at what cost, besides the dollar amount?
Value – that hard-to-define blend of quality and price – should always be the aim, with defined outcomes preceding it. Rarely is one supplier both the best and the cheapest. But it is a beautiful thing when you deliver a new technology project that adds value, enhances efficiency, improves competitiveness – in short, makes your organisation better – AND it also comes in at a lower price point. But if all you do is change how you deploy tech in order to reduce cost, and in the process the business breaks, you break the law and you get sacked, do you still care about the cost saving?
A wiser man than me, Oscar Wilde, once opined that a cynic is one “who knows the price of everything and the value of nothing”. It’s important not to be cynical with IT procurement; it’s too important to be viewed merely as a cost centre to be slashed if possible.
It’s not about technology, it’s about supplier management
This is such an under-rated and under-stated area when people talk about cloud. NASA’s litany of mistakes has nothing to do with public clouds being less secure than private/virtual private ones (they are). It’s not even about cloud. It’s barely even about technology. It’s about contracts, expectations and good governance. If you don’t know how to manage a cloud supplier and sort out the contractuals, you shouldn’t be let loose on sensitive data. Indeed, they weren’t allowed – they broke their own rules. It’s not about cloud. IT in a cloud-based world is as much about managing suppliers and SLAs as it is about keeping the tin working.
Good news for wannabe Doctor Evils
NASA may be concerned with their intellectual property falling into the wrong hands – good news if you always wanted a space shuttle or a base on the moon. But anybody outside wealthy rogue governments will have nothing to fear. No patient records, criminal records, tax records or other sensitive information about members of the public have been exposed here, to our knowledge.
However, the object lesson is clear. The UK government has swallowed the NIST definition hook, line and sinker too, so the risk is there. Happily, our civil service, with the CESG security regulations – the likes of IL2, IL3 and so on – is well on top of things. But caveat emptor – if people are encouraged to serve themselves in a cloud world, then public cloud platforms could give us the new “CD-ROM left on train” or “documents left in park bin” headlines.
Focus on your core competencies
It’s particularly interesting that it’s NASA involved. You just can’t get an organisation better qualified than NASA to stand up a cloud project – AT A TECHNICAL LEVEL – but they still got it badly wrong. They should, perhaps, focus on aeronautics and space exploration and let those who wake up every day keeping data safe get on with doing that. One of my most vivid childhood memories is being a small boy back in 1981 watching the first orbital flight of Columbia and I’ve loved NASA and spacecraft ever since. I’d love it if we had a side project at 6DG on “satellite computing” but I suspect we’ll remain on terra firma. Back on Planet Earth, it’s important that people stick to what they’re good at.
About the author
Campbell is a 15 year industry veteran who has held various sales, marketing and business development management roles in a variety of manufacturing, carrier and reseller companies. He is a well-known figure in the industry and he has travelled the world as a presenter and subject-matter expert in a variety of technology fields, including twice addressing global United Nations conferences.
At 6DG, Campbell is responsible for channel strategy, product strategy, go-to-market messaging, proposition and solution development, public relations and branding matters.
NTT DATA Services, Remodelling Supply Chains for Resilience
Joey Dean, the man with the coolest name ever, is Managing Director in the healthcare consulting practice for NTT DATA and is focused on delivering workplace transformation and enabling the future workforce for healthcare providers. Dean also leads client innovation programmes to enhance service delivery and business outcomes for clients.
The pandemic has shifted priorities and created opportunities to do things differently, and companies are now looking to build more resilient supply chains, none needed more urgently than those within the healthcare system. Dean shares with us how he feels they can get there.
A Multi-Vendor Sourcing Approach
“Healthcare systems cannot afford delays in the supply chain when there are lives at stake. Healthcare procurement teams are looking at multi-vendor sourcing strategies, stockpiling more inventory, and ways to use data and AI to have a predictive view into the future and drive greater efficiency.
“The priority should be to shore up procurement channels and re-evaluate inventory management norms, i.e. stockpiling for assurance. Health systems should take the opportunity to renegotiate with their current vendors and broaden the supplier channel. Through those efforts, work with suppliers that have greater geographic diversity and transparency around manufacturing data, process, and continuity plans,” says Dean.
But here ensues the never-ending battle of domestic vs global supply chains. As I see it, domestic sourcing limits the high-risk exposure related to offshore sourcing – Canada’s issue with importing the vaccine is a good example of that. So, of course, I had to ask, for lifesaving products, is building domestic capabilities an option that is being considered?
“Domestic supply chains are sparse or have a high dependence on overseas centres for parts and raw materials. There are measures being discussed from a legislative perspective to drive more domestic sourcing, and there will need to be a concerted effort by Western countries through a mix of investments and financial incentives,” Dean explains.
Wielding Big Tech for Better Outcomes
So, that’s a long way off. In the meantime, leveraging technology is another way to mitigate the risks that lie within global supply chains while decreasing costs and improving quality. Dean expands on the potential of blockchain and AI in the industry.
“Blockchain is particularly interesting in creating more transparency and visibility across all supply chain activities. Organisations can create a decentralised record of all transactions to track assets from production to delivery or use by the end user. This increased supply chain transparency gives both buyers and suppliers more visibility to resolve disputes and build more trusting relationships. Another benefit is that data validation becomes more efficient, freeing time for the delivery of goods and services to reduce cost and improve quality.
“Artificial Intelligence and Machine Learning (AI/ML) is another area where there’s incredible value: processing massive amounts of data, aggregating and normalising it to produce proactive recommendations that improve the speed and cost-efficiency of the supply chain.”
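To make the idea of “proactive recommendations” concrete, here is a deliberately minimal sketch, entirely hypothetical since the interview names no specific tooling or data: flag items whose recent demand runs well ahead of their trailing average, so buyers can act before a stockout. The function name, thresholds and sample figures are all illustrative assumptions.

```python
# Hypothetical sketch: flag items whose recent demand is surging
# relative to a trailing baseline, as a trivial stand-in for the
# kind of AI/ML-driven recommendation described above.
from statistics import mean

def surge_recommendations(demand_history, recent_weeks=2, surge_factor=1.5):
    """demand_history: {item: [weekly demand, oldest first]}.
    Returns {item: ratio} for items whose recent average demand
    exceeds the trailing baseline by surge_factor or more."""
    flagged = {}
    for item, series in demand_history.items():
        if len(series) <= recent_weeks:
            continue  # not enough history to form a baseline
        baseline = mean(series[:-recent_weeks])
        recent = mean(series[-recent_weeks:])
        if baseline > 0 and recent / baseline >= surge_factor:
            flagged[item] = round(recent / baseline, 2)
    return flagged

# Illustrative (made-up) weekly demand figures.
history = {
    "n95_masks": [100, 110, 105, 240, 260],  # demand surging
    "gloves": [500, 510, 490, 505, 500],     # demand steady
}
print(surge_recommendations(history))
```

A production system would of course use proper time-series models rather than a moving average, but the shape is the same: compare expected demand to observed demand and surface the outliers.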
Evolving Procurement Models
From asking more of suppliers to beefing up stocks, Dean believes procurement models should be remodelled to favour resilience, mitigate risk and ensure the needs of the customer are kept in view.
“The bottom line is that healthcare systems are expecting more from their suppliers. While transactional approaches focused solely on price have been the norm, collaborative relationships, where the buyer and supplier establish mutual objectives and outcomes, drive a trusting and transparent relationship. Healthcare systems are also looking to multi-vendor strategies to mitigate risk, so it is imperative for suppliers to stand out and embrace evolving procurement models.
“Healthcare systems are looking at partners that can establish domestic centres for supplies to mitigate the risks of having ‘all of their eggs’ in overseas locations. Suppliers should look to perform a strategic evaluation review that includes a distribution network analysis and distribution footprint review to understand cost, service, flexibility, and risks. Included in that strategy should be a ‘voice of the customer’ assessment to understand current pain points and needs of customers.”
“Healthcare supply chain leaders are re-evaluating the Just In Time (JIT) model with supplies delivered on a regular basis. The approach does not require an investment in infrastructure but leaves organisations open to risk of disruption. Having domestic centres and warehousing from suppliers gives healthcare systems the ability to have inventory on hand without having to invest in their own infrastructure. Also, in the spirit of transparency, having predictive views into inventory levels can help enable better decision making from both sides.”
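The trade-off Dean describes, holding inventory against disruption rather than relying on pure JIT delivery, is usually expressed as a reorder point with safety stock. As a hedged illustration (the formula is the textbook version, not anything NTT DATA prescribes, and the numbers are invented):

```python
# Sketch of the classic reorder-point calculation a buyer holding
# in-house inventory (rather than relying on pure JIT) might use.
# Reorder point = expected demand over the lead time + safety stock.
from math import sqrt

def reorder_point(avg_daily_demand, demand_std_dev, lead_time_days, z=1.65):
    """z=1.65 targets roughly a 95% service level, assuming
    normally distributed daily demand."""
    safety_stock = z * demand_std_dev * sqrt(lead_time_days)
    return avg_daily_demand * lead_time_days + safety_stock

# Example: 40 units/day on average, std dev of 10, 9-day supplier lead time.
rop = reorder_point(avg_daily_demand=40, demand_std_dev=10, lead_time_days=9)
print(round(rop, 1))  # lead-time demand plus a safety-stock buffer
```

Raising `z` (a higher service level) or facing longer, more variable lead times pushes the reorder point up, which is exactly the "just-in-case" stockpiling pressure the pandemic created.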
But, again, I had to ask, what about the risks and associated costs that come with higher inventory levels, such as expired product if there isn’t fast enough turnover, tying up cash flow, warehousing and inventory management costs?
“In the current supply chain environment, it is advisable for buyers to carry an in-house inventory on a just-in-time basis, while suppliers take a just-in-case approach, preserving capacity for surges, retaining safety stock, and building rapid replenishment channels for restock. But the risk of expired product is very real. This could be curbed with better data intelligence and improved technology that could forecast surges and predictively automate future supply needs. In this way, ordering would be more data-driven and rationalised to align with anticipated surges. Further adoption of data and intelligence will be crucial for modernised buying in the new normal.”
These are tough tasks, so I asked Dean to speak to some of the challenges. Luckily, he’s a patient guy with a lot to say.
On managing stakeholders and ensuring alignment on priorities and objectives, Dean says, “In order for stakeholders to stay aligned on priorities, they’ll need more transparency and collaborative win-win business relationships in which both healthcare systems and medical device manufacturers are equally committed to each other’s success. On the healthcare side, they need to understand where parts and products are manufactured to perform more predictive data and analytics for forecasting and planning efforts. And the manufacturers should offer more data transparency, which will result in better planning and forecasting to navigate the ebbs and flows and enable better decision-making by healthcare systems.”
Due to the sensitive nature of the information being requested, the effort to increase visibility is typically met with a lot of reluctance and push back. Dean essentially puts the onus back on suppliers to get with the times. “Traditionally, the relationships between buyers and suppliers are transactional, based only on the transaction between the two parties: what is the supplier providing, at what cost, and for what length of time. The relationship begins and ends there. The tide is shifting, and buyers expect more from their suppliers, especially given what the pandemic exposed around the fragility of the supply chain. The suppliers that get ahead of this will not only reap the benefits of improved relationships, but they will be able to take action on insights derived from greater visibility to manage risks more effectively.”
He offers a final tip. “A first step in enabling a supply chain data exchange is to make sure partners and buyers are aware of the conditions throughout the supply chain based on real-time data to enable predictive views into delays and disruptions. With well-understood data sets, both parties can respond more effectively and work together when disruptions occur.”
As for where supply chain is heading, Dean says, “Moving forward, we’ll continue to see a shift toward Robotic Process Automation (RPA), Artificial Intelligence (AI), and advanced analytics to optimise the supply chain. The pandemic, as it has done in many other industries, will accelerate the move to digital, with the benefits of improving efficiency and visibility and reducing error rates. AI can consume enormous amounts of data to drive real-time pattern detection and mitigate risk from global disruptive events.”