- “Data stored overseas should be accessible to US government, judge rules” – Source Reuters
- “Obama administration contends that company with operations in US must comply with warrants for data, even if stored abroad” – Source The Guardian
With the rulings this summer that Microsoft must provide the US government with customer data even if it is stored outside the United States, many organizations and individuals alike are concerned about data sovereignty and privacy. And they should be. However, legal issues like data sovereignty and Safe Harbor are distractions from the real issue.
Let’s start with a definition of Data Sovereignty:
> Definition: Data sovereignty is the concept that information which has been converted and stored in digital form is subject to the laws of the country in which it is located.
– Source TechTarget
If you are at all concerned about data security and privacy, it’s not just legal jurisdictions that you need to be worried about. Consider some of the more high profile security breaches over the past few weeks (let alone the past year) in both cloud services and private data centers:
- “Hundreds of Intimate Celebrity Pictures Leaked Online Following Alleged iCloud Breach” – Source Newsweek
- “Prosecutors: Accused Russian hacker jailed here had 2.1 million stolen credit card numbers when arrested” – Source Fox
- “Data Breach Bulletin: Home Depot Credit Card Breach Could Prove To Be Larger Than Target Breach” – Source Forbes
- “Russian Hackers Amass Over a Billion Internet Passwords” – Source New York Times
The message to me is that it doesn’t matter where the data is: it isn’t safe. In fact, one could argue that while the US DOJ, SEC, or IRS having access to your data is a privacy concern, it is less of a threat than a major security breach like the one at Home Depot.
So what’s the answer?
Obviously this is a complex problem, and large organizations with lots of smart people have been struggling with it for years. I don’t have a simple answer, nor should you expect one. I do know that many of the technology problems we faced in the past have been solved – and even seem quaint now. Remember having to rewind VHS movies before DVDs? Or returning DVDs before Netflix? Since I can’t travel to the future to tell you what the solution will eventually be, let’s look to somebody who has seen the future: namely, Captain Jack Harkness.
He definitely doesn’t want to get caught with his pants down while saving the earth. Notice that he is wearing both suspenders (braces for our British readers) and a belt? So what can we learn from this?
While taking all of the precautions that you can with data center processes is an important part of a security strategy, some additional steps can also be taken. Consider data encryption. Yes, the data may still be accessed by unauthorized parties but the data will be of little use to them if they can’t decrypt it. In a private data center that has been compromised, the data may still be safe.
In public cloud environments, data can be encrypted before it enters the vendor’s cloud. The keys can reside in the client’s data center or in a third-party escrow facility. For the data to be useful to an attacker, a double breach would be necessary: both the cloud data store and the key store would have to be compromised.
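To make the suspenders-and-belt idea concrete, here is a minimal Python sketch of the pattern described above: records are encrypted on-premises before upload, and the key never leaves the client’s data center. The toy XOR keystream is for illustration only, and all names here are my own; a real deployment would use a vetted authenticated cipher such as AES-GCM or the `cryptography` package’s Fernet.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key + nonce + counter.
    Illustrative only -- not a production cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # A fresh random nonce per record; prepended to the ciphertext.
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

key = secrets.token_bytes(32)           # stays in the client's data center
record = b"customer PII before upload"
blob = encrypt(key, record)             # only this ciphertext goes to the cloud
assert decrypt(key, blob) == record     # round-trip works with the local key
```

The point of the sketch is the placement of the pieces, not the cipher: the cloud provider (or anyone who breaches it) only ever sees `blob`, while `key` lives on-premises or in escrow.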
The same holds true for data sovereignty. Who cares if the DOJ has your data if they can’t read it?
Of course all of this assumes that the level of encryption being used is sufficiently strong that it is non-trivial to decrypt it through brute force or other means.
What do you think the future holds for data sovereignty and security?
Microsoft’s newest billion-dollar business units include Office 365 and Azure. There’s lots of marketing, sales, and ROI information about Office 365 and cloud services in general, so I’m not going to bore you with another post about how to save your organization money or accelerate value by adopting Office 365. Instead, I’m going to describe two real-world use cases where I have personally found Office 365 to help. I might even throw in some anecdotal cost-benefit analysis, but my main purpose is to explore some less common uses for Office 365 that you may not have thought of.
The two scenarios are:
- External consultants
- Test and Development
External Consultants
I manage a team of consultants who regularly have to work at client sites, often at very security-conscious organizations. We can’t always use our own laptops in their environments, or if we can, it is typically through guest wireless networks. We’ve encountered situations where the guest wireless prevents us from connecting back to our office through VPN, which makes it difficult to access some of our collaboration services like SharePoint. We moved my team to Office 365 specifically to do things like coauthoring documents in SharePoint from customer sites. This enables some interesting scenarios: we’ve had cases where an offsite consultant was able to review and update documentation while it was being simultaneously authored by another consultant working in our lab.
Test and Development
We do a lot of System Center work. System Center is a complex suite of products that interact with each other as well as with core Windows infrastructure like Active Directory and Exchange. When we are building out a proof of concept for a customer, they typically don’t want us to touch their production AD and Exchange environments – and I don’t blame them. Ultimately, in order to complete the project we would need to somehow build out an Active Directory and Exchange infrastructure dedicated to the proof of concept or pilot. Consider the additional costs in hardware, software, and time required to accomplish this. Lately we’ve started using Office 365 to provide Exchange services instead. It takes minutes to provision and connect to. Examples we’ve used recently include the Exchange connectors for Configuration Manager and Service Manager. Using this approach, in under an hour I was able to get more than a half dozen mobile devices loaded into Configuration Manager for an MDM/UDM proof of concept without touching any production AD or Exchange infrastructure, simply by adding an additional email account to the devices.
We’ve extended this to Azure as well. We have been using Azure to host System Center instances for proof of concept and sandbox deployments. I’m looking forward to combining Azure with Office365 to further accelerate our pilots and proofs of concept deployments.
Yesterday I met with David Ker, one of the founders of RealWat Inc. They currently offer the Ti-Took Nuage browser, which is based on Google’s Chromium (the Chrome open source project). The Nuage browser seeks to improve the browsing experience by adding improved privacy, security, and speed, plus other Web 2.0 cloud-based services such as social bookmarking (more feature details here). While the Ti-Took Nuage browser is interesting, I’m unsure of the long-term mass appeal it will have as other players (including the big browser shops) add similar functionality to their offerings, but for now Ti-Took is blazing a new trail.
Download a copy of the Nuage Browser here.
What got me more excited is a new project they are working on called the Ti-Took Cloud Router. It’s an innovative offering that essentially front-ends an IaaS offering such as Amazon’s EC2. The Ti-Took Cloud Router is targeted at small organizations that want to take advantage of cloud service offerings but still require security and scope of control. Using the Cloud Router essentially creates a virtual private cloud (vPC?) inside a public cloud that encapsulates the services that are important to an individual organization’s business and users. It also allows secure access to a virtual data center from public locations. The key to all of this is their web-based identity management service, which provides unified single sign-on that securely validates users into the vPC and then controls access to other cloud services like email or CRM. We discussed the importance of extended authentication protocols, and they assure me that they are investigating two-factor authentication.
I foresee this type of offering accelerating the adoption of cloud services in the SMB space. I’m looking forward to more announcements from RealWat.
I’ve been looking at public cloud offerings informally for a while, and although I appreciate the case for a full cloud infrastructure, it will be quite some time before large enterprise data centers can realistically retool everything for the cloud. Sure, small startups can very successfully run large parts, if not all, of their IT services in the cloud, but there are still too many barriers for larger organizations with large investments in “in-house” IT resources.
My initial thought was that there could be an opportunity for hybrid clouds that provide organizations with excess capacity on demand. This would be a great way to augment data centers with cloud-bursting opportunities and introduce organizations to cloud offerings with low risk. Of course, to accomplish this, enterprises will need to build infrastructures that are compatible with public clouds.
The problem is that none of the (admittedly small) sample of private cloud infrastructures I’ve looked at use the same APIs as their public cloud counterparts. Until now, that is …
Here’s an interesting announcement from Eucalyptus and Terracotta that essentially provides the management tools and a private cloud infrastructure using the same APIs as Amazon AWS. This is a good start, and I hope and expect to see more offerings that make it easy to build hybrid cloud solutions.
I expect that MS will soon announce on-premises availability of the Azure platform.