
Cloud Needs Context-Aware Provisioning

The Need for More Awareness of Context During Provisioning

The awareness of the importance of context in application delivery and especially in the “new network” is increasing, and that’s a good thing. It’s a necessary evolution in networking as both users and applications become increasingly mobile. But what might not be evident is the need for more awareness of context during the provisioning, i.e. deployment, process.

A desire to shift the burden of management of infrastructure does not mean a desire for ignorance of that infrastructure, nor does it imply acquiescence to a complete lack of control. But today that’s partially what one can expect from cloud computing. While the fear of applications being deployed on “any old piece of hardware anywhere in the known universe” is not entirely a reality, the possibility of having no control over where an application instance might be launched – and thus where corporate data might reside – is one that may prevent some industries and individual organizations from choosing to leverage public cloud computing.

This is another one of those “risks” that tips the scales of risk versus benefit to the “too risky” side, primarily because the legal implications of doing so make organizations nervous.

The legal ramifications of deploying applications – and their data – in random geographic locations around the world differ based on what entity has jurisdiction over the application owner. Or does it? That’s one of the questions that remains to be answered to the satisfaction of many and which, in many cases, has led to a decision to stay away from cloud computing.

According to the DPA, clouds located outside the European Union are per se unlawful, even if the EU Commission has issued an adequacy decision in favor of the foreign country in question (for example, Switzerland, Canada or Argentina).

-- German DPA Issues Legal Opinion on Cloud Computing

Back in January, Paul Miller published a piece on jurisdiction and cloud computing, exploring some of the similar legal juggernauts that exist with cloud computing:

While cloud advocates tend to present 'the cloud' as global, seamless and ubiquitous, the true picture is richer and complicated by laws and notions of territoriality developed long before the birth of today's global network. What issues are raised by today's legislative realities, and what are cloud providers — and their customers — doing in order to adapt?


CONTEXT-AWARE PROVISIONING

To date there are two primary uses for geolocation technology. The first is focused on performance, and uses the client location as the basis for determining which data center location is closest and thus, presumably, will provide the best performance. This is most often used as the basis for content delivery networks like Akamai and Amazon’s CloudFront. The second is to control access to applications or data based on the location from which a request comes. This is used, for example, to comply with U.S. export laws by preventing applications containing certain types of cryptography from being delivered to those specifically prohibited by law from obtaining such software.
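To make that second use concrete, here is a minimal sketch of location-based access control in Python. It assumes the geoip2 library and a local GeoIP country database; the database path and the embargo list are placeholders for illustration only, not an authoritative reading of export law.

```python
# Minimal sketch: deny requests that resolve to an embargoed country before
# serving export-restricted content. The database path and country list are
# illustrative placeholders.
import geoip2.database
from geoip2.errors import AddressNotFoundError

EMBARGOED = {"CU", "IR", "KP", "SY"}  # illustrative ISO 3166-1 country codes

reader = geoip2.database.Reader("/path/to/GeoLite2-Country.mmdb")  # placeholder path

def is_request_allowed(client_ip: str) -> bool:
    """Return False when the client's resolved country is on the embargo list."""
    try:
        country = reader.country(client_ip).country.iso_code
    except AddressNotFoundError:
        return False  # fail closed when the address cannot be located
    return country not in EMBARGOED
```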

There are additional uses, of course, but these are the primary ones today. A third use should be constraining application provisioning based on specified parameters.

While James Urquhart touches on location as part of the criteria for automated acquisition of cloud computing services, what isn’t delved into is the enforcement of location-based restrictions during provisioning. The question is presented more as “do you support deployment in X location” rather than “can you restrict deployment to X location”. It is the latter piece of the equation that needs further exploration and experimentation, specifically in the realm of devops and automated provisioning, because it is this part of the deployment equation that will cause some industries to eschew the use of cloud computing.

Location should be incorporated into every aspect of the provisioning and deployment process. Not only should a piece of hardware – server or network infrastructure – be capable of describing itself in terms of resource capabilities (CPU, RAM, bandwidth), it should also be able to provide its physical location. Provisioning services should further be capable of not only including location restrictions as part of the policies governing the automated provisioning of applications, but enforcing them as well.
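As a rough sketch of what such self-description might look like, consider a simple Python data model. The Resource and Location names and fields below are illustrative assumptions, not drawn from any existing specification.

```python
# Illustrative sketch of a resource that reports physical location alongside
# its capacity attributes; names and fields are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Location:
    country: str   # ISO 3166-1 alpha-2 code, e.g. "DE"
    region: str    # provider region or data-center identifier, e.g. "eu-central"

@dataclass(frozen=True)
class Resource:
    resource_id: str
    cpu_cores: int
    ram_gb: int
    bandwidth_mbps: int
    location: Location  # the attribute this post argues every resource should expose
```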


STANDARDS NEED LOCATION-AWARENESS

Current standards efforts such as the OCCI specification (intended as a means to query cloud computing implementations and their components for information) do not make it easy to query a resource for its location at run-time. The specification does, however, allow the ability to select all resources residing in a specific location – assuming you know what that location is, which nearly ends up in a circular reference. The whole problem revolves around the fact that standards, specifications and APIs have been developed with the belief that location wasn’t important – you shouldn’t have to know – without enough consideration for regulatory compliance and the problems of mixing data, laws, and location. It would be very useful, given the state of cloud computing and its “Wizard of Cloud” attitude toward infrastructure transparency, to provide location as an attribute of every resource – dynamically – and further offer the means by which location can easily be one of the constraints.

Having available some standardized method of retrieving the physical location of a device or system would allow provisioning systems to restrict their pool of available resources based on a match between any location restrictions required by the customer and the locations of the available resources. The reason for making location an attribute of every “kind” of resource is that restrictions on application or data location may extend to data traversal paths. Some industries have very specific requirements regarding not only the storage of data and access by applications, but the transmission of data as well. These types of requirements may include the location of network devices which have access to that data for processing purposes. What seems to many of us to be trivial becomes highly important to courts and lawyers, and thus it behooves network devices and components to also be able to provide their location, from which automated application-specific routing tables could eventually be derived, protecting the interests of organizations that are highly sensitive to the location of their data at all times.
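As a sketch of that matching step, the hypothetical filter below keeps only those resources whose reported location falls within the customer's permitted jurisdictions. It reuses the illustrative Resource and Location types sketched earlier and assumes country-level granularity; a real policy might also match on region or data-center identifiers.

```python
# Hypothetical provisioning filter: restrict the candidate pool to resources
# located in jurisdictions the customer permits. Uses the illustrative
# Resource type defined in the earlier sketch.
from typing import Iterable, List, Set

def eligible_resources(pool: Iterable["Resource"],
                       permitted_countries: Set[str]) -> List["Resource"]:
    """Keep only resources whose reported country is on the permitted list."""
    return [r for r in pool if r.location.country in permitted_countries]

# Usage: an EU-only workload might pass {"DE", "FR", "NL"}; if the returned
# list is empty, provisioning should fail rather than fall back to an
# unrestricted location.
```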

This also implies, of course, that the infrastructure itself is capable of enforcing such policies, which means it must be location-aware and able to collaborate with the infrastructure ecosystem to ensure not just that at-rest location complies with application restrictions, but that traversal location does as well, if applicable. That’s going to require a new kind of network, one based on Infrastructure 2.0 principles of collaboration, connectivity, integration and intelligence.

The inclusion of physical location as part of the attributes of a component, made available to automated provisioning and orchestration systems, could enable these types of policies to be constructed and enforced. It may be that a new attribute descriptor is necessary, something that better describes the intent of the meta-data, such as restriction. A broad restriction descriptor could, in addition to location, contain other desired provisioning-based attributes such as minimum RAM and CPU, network speed, or even – given the rising concerns regarding the depletion of IPv4 addresses – the core network protocol supported, i.e. IPv6, IPv4, or “any”.
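One possible shape for such a restriction descriptor, again as an illustrative Python sketch rather than a proposed standard: the field names are assumptions, and the protocol field would require resources to report which IP versions they support, which the earlier sketch does not include.

```python
# Hypothetical "restriction" descriptor combining location constraints with
# other provisioning attributes (minimum RAM/CPU, network speed, IP protocol).
from dataclasses import dataclass, field
from typing import Set

@dataclass
class Restriction:
    permitted_countries: Set[str] = field(default_factory=set)  # empty = unrestricted
    min_ram_gb: int = 0
    min_cpu_cores: int = 0
    min_network_mbps: int = 0
    # ip_protocol is carried for completeness; enforcing it would require the
    # Resource sketch above to also report which IP versions it supports.
    ip_protocol: str = "any"  # "ipv4", "ipv6", or "any"

    def satisfied_by(self, r: "Resource") -> bool:
        """True when a candidate resource meets every stated constraint."""
        in_permitted_location = (not self.permitted_countries
                                 or r.location.country in self.permitted_countries)
        return (in_permitted_location
                and r.ram_gb >= self.min_ram_gb
                and r.cpu_cores >= self.min_cpu_cores
                and r.bandwidth_mbps >= self.min_network_mbps)
```

A provisioning service could evaluate such a descriptor against every candidate resource before launching an instance, refusing to deploy when nothing in the pool satisfies it.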

If not OCCI, then some other standard – de facto or agreed upon – needs to exist because one thing is certain: something needs to make that information available and some other thing needs to be able to enforce those policies. And that governance over deployment location must occur during the provisioning process, before an application is inadvertently deployed in a location not suited to the organization or application.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.

IoT & Smart Cities Stories
The challenges of aggregating data from consumer-oriented devices, such as wearable technologies and smart thermostats, are fairly well-understood. However, there are a new set of challenges for IoT devices that generate megabytes or gigabytes of data per second. Certainly, the infrastructure will have to change, as those volumes of data will likely overwhelm the available bandwidth for aggregating the data into a central repository. Ochandarena discusses a whole new way to think about your next...
Dynatrace is an application performance management software company with products for the information technology departments and digital business owners of medium and large businesses. Building the Future of Monitoring with Artificial Intelligence. Today we can collect lots and lots of performance data. We build beautiful dashboards and even have fancy query languages to access and transform the data. Still performance data is a secret language only a couple of people understand. The more busine...
DXWorldEXPO LLC announced today that Big Data Federation to Exhibit at the 22nd International CloudEXPO, colocated with DevOpsSUMMIT and DXWorldEXPO, November 12-13, 2018 in New York City. Big Data Federation, Inc. develops and applies artificial intelligence to predict financial and economic events that matter. The company uncovers patterns and precise drivers of performance and outcomes with the aid of machine-learning algorithms, big data, and fundamental analysis. Their products are deployed...
All in Mobile is a place where we continually maximize their impact by fostering understanding, empathy, insights, creativity and joy. They believe that a truly useful and desirable mobile app doesn't need the brightest idea or the most advanced technology. A great product begins with understanding people. It's easy to think that customers will love your app, but can you justify it? They make sure your final app is something that users truly want and need. The only way to do this is by ...
CloudEXPO | DevOpsSUMMIT | DXWorldEXPO are the world's most influential, independent events where Cloud Computing was coined and where technology buyers and vendors meet to experience and discuss the big picture of Digital Transformation and all of the strategies, tactics, and tools they need to realize their goals. Sponsors of DXWorldEXPO | CloudEXPO benefit from unmatched branding, profile building and lead generation opportunities.
Digital Transformation and Disruption, Amazon Style - What You Can Learn. Chris Kocher is a co-founder of Grey Heron, a management and strategic marketing consulting firm. He has 25+ years in both strategic and hands-on operating experience helping executives and investors build revenues and shareholder value. He has consulted with over 130 companies on innovating with new business models, product strategies and monetization. Chris has held management positions at HP and Symantec in addition to ...
Cell networks have the advantage of long-range communications, reaching an estimated 90% of the world. But cell networks such as 2G, 3G and LTE consume lots of power and were designed for connecting people. They are not optimized for low- or battery-powered devices or for IoT applications with infrequently transmitted data. Cell IoT modules that support narrow-band IoT and 4G cell networks will enable cell connectivity, device management, and app enablement for low-power wide-area network IoT. B...
The hierarchical architecture that distributes "compute" within the network specially at the edge can enable new services by harnessing emerging technologies. But Edge-Compute comes at increased cost that needs to be managed and potentially augmented by creative architecture solutions as there will always a catching-up with the capacity demands. Processing power in smartphones has enhanced YoY and there is increasingly spare compute capacity that can be potentially pooled. Uber has successfully ...
SYS-CON Events announced today that CrowdReviews.com has been named “Media Sponsor” of SYS-CON's 22nd International Cloud Expo, which will take place on June 5–7, 2018, at the Javits Center in New York City, NY. CrowdReviews.com is a transparent online platform for determining which products and services are the best based on the opinion of the crowd. The crowd consists of Internet users that have experienced products and services first-hand and have an interest in letting other potential buye...
When talking IoT we often focus on the devices, the sensors, the hardware itself. The new smart appliances, the new smart or self-driving cars (which are amalgamations of many ‘things'). When we are looking at the world of IoT, we should take a step back, look at the big picture. What value are these devices providing. IoT is not about the devices, its about the data consumed and generated. The devices are tools, mechanisms, conduits. This paper discusses the considerations when dealing with the...