By Eric Bannon
May 3, 2014 01:00 PM EDT
I recently found myself intrigued by an article by Jon William Toigo on TechTarget titled "Software-defined infrastructure, or how storage becomes software." In his article, Toigo poses the question: Could a software-defined infrastructure, with software-based controls and policies, be the answer to managing and allocating storage? While I am sure Jon would agree we're all tired of the buzzwords around "software-defined anything," the fact of the matter is that we all use the term anyway for lack of a better description of what's occurring today.
Storage specifically is one of those areas I always say is like organizing your room: everyone has their own way of doing it. You want your dresser to be a certain height. You want your mattress pointing a certain direction and the light from your window hitting your desk at just the right time of day. The truth is, storage is the underpinning of virtualization that everyone wants to architect and manage the way they want to, and no one is going to tell them otherwise... UNTIL - wait for it - ...software can manage this FOR people.
Where is the robot (I wish the Roomba folks would hurry up with this, but I digress) that keeps your room exactly the way you want it to be? The industry constantly tosses this idea around, but no one really seems to know where to find the solution. The truth is that no matter what quirks people have with their storage, there is one goal everyone has in common: ensuring that applications can consume the storage resources they need while preserving priority and business logic, and increasing efficiency without introducing risk. This is the promise of every storage vendor on the planet trying to sell their latest auto-tiering, dedupe, and compression solutions and all the wonderful bling to trick out their man cave.
The problem, however, is that virtualization blurs the lines around storage, and it becomes intractably complex to manage the macro-level supply and demand of resources occurring across the stack. Traditional management tools - storage- and virtualization-related - are running into the limitations of their stats-based, linear approach to managing diverse environments.
As an illustration, let's use a real-life example. Let's assume we have a NetApp environment supporting VMware. The NetApp environment consists of 4 aggregates spread across two filers (a rough sketch in code follows the list below):
- Aggr 1 SATA and Aggr 2 SATA are on Controller A, and each Aggr comprises 2 Volumes in VMware
- Aggr 3 SAS and Aggr 4 SAS are on Controller B, and each Aggr comprises 2 Volumes in VMware
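To make the layout concrete, here is a minimal sketch of that topology as a plain data structure. The class, volume names, and values are illustrative assumptions; this is not drawn from any NetApp or VMware API.

```python
from dataclasses import dataclass, field

@dataclass
class Aggregate:
    name: str
    disk_type: str                 # "SATA" or "SAS"
    controller: str                # filer/controller "A" or "B"
    volumes: list = field(default_factory=list)

# The four aggregates from the example: two per controller, two volumes each
topology = [
    Aggregate("Aggr1", "SATA", "A", ["vol1a", "vol1b"]),
    Aggregate("Aggr2", "SATA", "A", ["vol2a", "vol2b"]),
    Aggregate("Aggr3", "SAS",  "B", ["vol3a", "vol3b"]),
    Aggregate("Aggr4", "SAS",  "B", ["vol4a", "vol4b"]),
]
```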
Aggr1 sees a spike in IOPS, driven by 3 virtual machines hammering its 2 Volumes. This results in Aggr1 utilizing 93% of its available IOPS capacity due to several high consumers, or "Bully" VMs (to use a traditional storage vendor's language). To compound the issue, the high utilization on disk has now manifested itself up the stack and begins to impact the ability of other workloads to access the storage resources they require on Aggr1.
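As a back-of-the-envelope illustration of what "93% utilized with bully VMs" means, here is a hedged sketch. The capacity figure, VM names, and the 25% "bully" threshold are all hypothetical assumptions, not vendor definitions.

```python
# Hypothetical per-VM IOPS demand against Aggr1's total IOPS capacity
AGGR1_IOPS_CAPACITY = 10_000
vm_iops = {"vm-app1": 3_500, "vm-app2": 3_000, "vm-app3": 2_700, "vm-other": 100}

utilization = sum(vm_iops.values()) / AGGR1_IOPS_CAPACITY   # 0.93 in this example
BULLY_SHARE = 0.25    # assumed: any VM consuming >25% of the aggregate is a "bully"

bullies = [vm for vm, iops in vm_iops.items()
           if iops / AGGR1_IOPS_CAPACITY > BULLY_SHARE]
print(f"Aggr1 utilization: {utilization:.0%}, bully VMs: {bullies}")
```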
In this example, a traditional software management system for the storage platform will alert an administrator that Aggr1 has exceeded its tolerable IOPS utilization and that it is time for an administrator to act. Similarly, the virtualization vendor (in this case VMware) will generate alarms related to the virtualized components layered on top of the storage platform.
The administrator must then sift through the charts and graphs in their storage vendor's tool, or their virtualization management system, with the end goal being some sort of resource allocation decision to intelligently allocate storage resources to the applications that need them while avoiding quality of service disruption, at the expense of low-priority applications.
More likely, the administrator needs to access both of these interfaces to try and accomplish this. In this example, the resource decision might be to move the volume to a separate aggregate on Controller A (when in reality this won't do much, due to the performance constraints underneath), move the volume itself to faster disk associated with a separate storage controller, or move the virtual machine to a volume hosted on Controller B.
Now the second part of this equation (and arguably more difficult to get right) is: How does the administrator ensure that the domino they decide to push over doesn't create another resource constraint within the environment? Fundamentally, traditional storage vendor software offerings and virtualization management tools are incapable of understanding the impact and outcome of any prospective resolution because they simply do not analyze the interdependencies of this decision across both (virtualized) compute and storage components. The best case for operations is a head start on the troubleshooting process after quality of service has already been impacted or is in the process of being degraded.
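The kind of check that is missing can be sketched, in very simplified form, as a "what-if" projection: before committing to any of the candidate moves above, project the post-move utilization of both the source and the destination. The capacities, demands, moved load, and 80% ceiling below are hypothetical assumptions for illustration.

```python
# Assumed IOPS capacity and current demand per aggregate (illustrative numbers)
aggr_capacity = {"Aggr1": 10_000, "Aggr2": 10_000, "Aggr3": 18_000, "Aggr4": 18_000}
aggr_demand   = {"Aggr1":  9_300, "Aggr2":  6_500, "Aggr3":  9_000, "Aggr4": 12_000}

CEILING = 0.80   # assumed: never plan a move that pushes any aggregate above 80%

def move_is_safe(src, dst, moved_iops):
    """True only if the move relieves the source without overloading the destination."""
    new_src = (aggr_demand[src] - moved_iops) / aggr_capacity[src]
    new_dst = (aggr_demand[dst] + moved_iops) / aggr_capacity[dst]
    return new_src <= CEILING and new_dst <= CEILING

# Evaluate the candidate destinations for a workload worth roughly 3,500 IOPS
for dst in ("Aggr2", "Aggr3", "Aggr4"):
    verdict = "OK" if move_is_safe("Aggr1", dst, 3_500) else "creates a new constraint"
    print(f"Aggr1 -> {dst}: {verdict}")
```

In this toy run, the sibling SATA aggregate on Controller A fails the check (echoing the "won't do much" caveat above), while only one of the SAS aggregates on Controller B has the headroom to absorb the load.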
Are Things Even at Human Scale Anymore?
In order to truly accomplish a software-defined storage system, there needs to be a new type of management system capable of connecting these two siloed worlds - storage and virtualization - for the purpose of intelligent decision making and resource allocation.
Toigo paints this gap perfectly when he states, "Our storage needs to be managed and allocated by intelligent humans, with software-based controls and policies serving as a more efficient extension of our ability to translate business needs into automation support."
Following this logic, this new system must go above and beyond looking at application issues in isolation to determine how to properly allocate the infrastructure's entire supply of finite storage resources to every virtualized workload and application - at scale. Inevitably, this means looking across all application resource demands concurrently and then determining how to service each application's request for the best cost/benefit to the overall platform by allocating the supply of storage resources intelligently and in the most efficient way. Ideally, this will be done prescriptively - before quality of service is degraded.
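One way to picture "looking across all demands concurrently" is a simple cost/benefit ranking of candidate placements across the whole platform, rather than fixing one alarm at a time. This is a toy, greedy sketch under assumed numbers, not a description of any vendor's algorithm.

```python
# Assumed capacities and current demand, reused from the example above
capacity = {"Aggr1": 10_000, "Aggr2": 10_000, "Aggr3": 18_000, "Aggr4": 18_000}
demand   = {"Aggr1":  9_300, "Aggr2":  6_500, "Aggr3":  9_000, "Aggr4": 12_000}

# Candidate (workload, source, destination, IOPS) placements considered together
candidates = [
    ("vm-app1", "Aggr1", "Aggr3", 3_500),
    ("vm-app2", "Aggr1", "Aggr4", 3_000),
    ("vm-app3", "Aggr1", "Aggr2", 2_700),
]

def score(move):
    """Benefit (headroom recovered at the source) minus cost (resulting load at the destination)."""
    vm, src, dst, iops = move
    relief = iops / capacity[src]
    burden = (demand[dst] + iops) / capacity[dst]
    return relief - burden

best_first = sorted(candidates, key=score, reverse=True)
print([vm for vm, *_ in best_first])   # moves ranked by overall cost/benefit
```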
The second phase of this brave new world will involve incorporating business logic that allows the software-driven control plane to consider business constraints alongside capacity and performance metrics in real time. If tier 1 applications need to have priority for faster disk over low-priority applications, then the system should be set-it-and-forget-it. If tier 3 applications must be confined to bronze or slow storage, then the constraint should carry over dynamically for any workload matching this criterion that is provisioned across the lifecycle of the environment.
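Expressed as code, such a placement policy might look like the following sketch. The tier names and the mapping of tiers to disk classes are assumptions for illustration; a real control plane would source these rules from the business rather than hard-code them.

```python
# Assumed mapping of application tiers to permitted storage classes
PLACEMENT_POLICY = {
    "tier1": {"ssd", "sas"},    # priority applications get faster disk
    "tier2": {"sas", "sata"},
    "tier3": {"sata"},          # low-priority applications stay on bronze storage
}

def allowed_destinations(tier, aggregates):
    """Filter candidate aggregates so every placement, now or later, honors the policy."""
    permitted = PLACEMENT_POLICY[tier]
    return [name for name, disk in aggregates.items() if disk in permitted]

aggregates = {"Aggr1": "sata", "Aggr2": "sata", "Aggr3": "sas", "Aggr4": "sas"}
print(allowed_destinations("tier1", aggregates))   # ['Aggr3', 'Aggr4']
print(allowed_destinations("tier3", aggregates))   # ['Aggr1', 'Aggr2']
```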
If 20% overhead needs to be maintained across tier-1 storage resources, then software should be intelligent enough to keep utilization below this level, instead of notifying administrators after the threshold has been crossed and forcing them to bring the infrastructure back from the brink.
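A minimal sketch of that proactive check, assuming the 20% headroom target, might look like this; the forecast figure is a hypothetical input that would normally come from trending data.

```python
HEADROOM_TARGET = 0.20   # assumed: keep 20% of tier-1 capacity free at all times

def needs_rebalance(forecast_iops, capacity_iops):
    """Flag an aggregate whose projected utilization would eat into the reserve."""
    return forecast_iops / capacity_iops > (1.0 - HEADROOM_TARGET)

# Example: a forecast of 8,400 against 10,000 IOPS breaches the 80% ceiling,
# so the control plane would plan a move now instead of alerting after the fact.
print(needs_rebalance(8_400, 10_000))   # True
```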
The reality is that everyone has their own idea of how to best trick out their room - in this case, their precious storage. Administrators will never truly be comfortable putting their storage architecture on auto-pilot until they can rest assured that their policies are maintained while application performance is assured. Any system developed to tackle this brave new world must be able to meet both of these goals simultaneously - a challenge that Toigo argues is beyond human capacity at scale.