The ‘A-ha’ About Active and Inactive Data

Despite advances in technology, the way we manage the growth in files has not changed much in the last decade or two

How many kinds of files are in use in your organization? Fifty? A hundred? More?

Wrong. There are only two kinds of files: active files and inactive files. (I admit it was a trick question.) Users create a file - a spreadsheet, an image, an invoice, etc. As long as the file is being worked with, it's an active file. Users can share, collaborate, and edit the file to their heart's content, but eventually the file is no longer current, and it becomes inactive.

But in this litigious and increasingly regulated world, there is no such thing as deleting a file. You're stuck with that inactive file, for years, if not forever, like your 30-year-old college dropout kid who still lives in the basement.

Despite advances in technology and a proliferation of alternatives, the way we manage the growth in files has not changed much in the last decade or two. We let the volume of files grow until it consumes all of the available resources, and then we throw money at the problem. Or we throw money at someone else who does essentially the same thing, just with cheaper disks.

Years ago, it was not a bad strategy. The majority of all files were active. The cost of having inactive files in the same pool was small. Today, it's an outdated strategy, and it's a costly strategy, one that few CIOs and budget-minded IT administrators feel particularly good about anymore. That's one positive side effect of the global economic recession: operations of all sizes, including Fortune 1000s and large government agencies, have much less tolerance for deferring a problem by throwing money at it.

It frankly makes no sense to spend thousands to millions of dollars on more and more primary storage to store obsolete or redundant files. It makes even less sense when we consider that runaway data growth is something of a myth when it comes to active files: in most organizations, the volume of active files - the files people create and use to do their jobs day in and day out - remains nearly fixed.

How can this be? Consider the way most workplaces operate. Your company pays its staff to work on different things this week than what they worked on last week or last month. Employees create new files for each new project, and their previous ones become inactive. In a typical organization, it is inactive data, not active data, that drives data growth, while the number of active files is proportional to the number of employees available to create them. For the most part, in three years you'll have about the same number of active files as you have today, assuming you have the same headcount. Most companies add a few employees, let a few employees go, generate more files during the busy season, then fewer files during the lean times. In general, however, the number of users and the number of active files will stay relatively constant. In this post-meltdown economy, no one is hiring like there's no tomorrow. In fact, five-figure layoffs are still a common occurrence; they just don't make the news anymore.

If you've been in business for at least a few years, it's safe to assume two-thirds of your files are inactive. What's more, once saved to the network, 80 percent of your files will never even be looked at again.

Note that "looking at" an old file is not the same as "working on" an old file. An inactive file may be requested for research or reference purposes, but it won't be edited. When it reaches that stage of inactivity, it's in essence a read-only file. This means your inactive file data should be managed as a read-only archive - since the odds bet is read never, and the next best bet is read-only.

With these considerations in mind, purchasing additional primary storage to manage inactive file growth is unjustifiable, especially considering that storage operating costs can be five times the initial acquisition costs.

The growth of inactive files carries an additional administrative cost when one factors in the constant cycle of backups. It's not unusual for an organization performing a standard weekly full backup, plus nightly incremental backups, to have 15, 20, even 25 copies of file data at any time. One of our customers, a multinational banking and financial services firm, was maintaining 28 copies of each file until we came in to provide relief.
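To see how quickly those copies accumulate, here is a quick back-of-the-envelope sketch in Python. The retention figures are assumptions chosen only to illustrate the arithmetic, not numbers drawn from any particular customer:

```python
# Illustrative arithmetic only: how copies of one inactive file pile up under a
# conventional full-backup rotation. Retention figures below are assumptions.

weekly_fulls_retained = 8     # e.g., two months of weekly full backups
monthly_fulls_retained = 12   # e.g., a year of month-end fulls
yearly_fulls_retained = 5     # e.g., five year-end fulls kept for compliance

# An inactive file never changes, so nightly incrementals skip it, but every
# retained full backup still contains its own complete copy of the file.
copies = weekly_fulls_retained + monthly_fulls_retained + yearly_fulls_retained
print(f"Copies of a single inactive file held in backups: {copies}")  # 25
```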

Employing the same old backup tactics with online storage saves money on hardware, but does not address the core problem of file proliferation. Transferring inactive files to less costly archival storage media works well on paper, but not in practice. Both methods are inefficient and risky: physically moving a petabyte of data can take months, and finding a particular file can take an arduously long time (if it's ever found at all; more than 60 percent of the time, it's not). That can spell the end of the business when a federal court judge comes around asking the company to produce a few thousand documents.

The real savings kick in when the typical file backup strategy is scrapped in favor of tiering, which is based on an online archive for inactive files. The key to achieving this is controlling and managing how and when files are moved - in other words, when a file becomes inactive.

The goal is to categorize data, properly manage it, and move the right data to the second tier to reduce costs, address compliance issues, and meet electronic discovery requirements. Granted, the cost, time, and complexity involved in scanning the entire file system and moving data to an archive can be onerous. However, event- and policy-driven software that automates these processes based on age, type, size or access frequency makes this a much smaller chore.
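As a rough illustration of what such policy-driven automation does under the hood, the sketch below walks a primary share and relocates files that have not been touched in six months to a second-tier archive. The paths and the 180-day threshold are hypothetical, and a real product would also weigh file type, size, and access frequency while preserving security and audit metadata:

```python
# A minimal sketch of an age-based tiering policy. Paths and threshold are
# hypothetical assumptions for illustration only.
import os
import shutil
import time

PRIMARY = "/mnt/primary/projects"   # hypothetical primary (tier-1) share
ARCHIVE = "/mnt/archive/projects"   # hypothetical second-tier archive
INACTIVE_AFTER_DAYS = 180           # assumed policy: untouched for 6 months

cutoff = time.time() - INACTIVE_AFTER_DAYS * 86400

for root, _dirs, files in os.walk(PRIMARY):
    for name in files:
        src = os.path.join(root, name)
        # Treat the last-access time as the activity signal for this sketch.
        if os.stat(src).st_atime < cutoff:
            dst = os.path.join(ARCHIVE, os.path.relpath(src, PRIMARY))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)   # relocate the inactive file to tier 2
```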

With this method, it's not unusual to see a two-thirds reduction in the cost of storing and protecting inactive files in the first year alone, and the savings accumulate as the inactive file load grows. One manufacturing customer calculated their first-year savings as 82 percent. Even if the initial costs of this tiered approach are twice as much as storage hardware (which seems unlikely), there would still be a savings as time goes by, since operating costs always exceed acquisition costs.
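A simple comparison, using assumed dollar figures, shows why the math still works out even when the second tier costs more to acquire: operating costs, not acquisition costs, dominate the total over time:

```python
# Back-of-the-envelope illustration with assumed dollar figures; the 5x
# operating-to-acquisition ratio echoes the estimate cited earlier.

YEARS = 3
primary_acq, tier2_acq = 100_000, 200_000    # assume tier-2 setup costs twice as much
primary_opex, tier2_opex = 500_000, 150_000  # assumed annual operating costs

primary_total = primary_acq + YEARS * primary_opex
tier2_total = tier2_acq + YEARS * tier2_opex
print(f"Primary-only over {YEARS} years:   ${primary_total:,}")    # $1,600,000
print(f"Tiered archive over {YEARS} years: ${tier2_total:,}")      # $650,000
print(f"Savings: {100 * (1 - tier2_total / primary_total):.0f}%")  # ~59%
```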

This second tier for inactive data can be maintained as a read-only archive, since it's very unlikely these files will be changed again. For example, an engineering firm may want to view the architectural plans from a project they did two years ago, but it's unlikely they would make any edits on that original file. This can deliver additional savings over time.
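For those who want to enforce that read-only posture at the file-system level, a minimal sketch follows. The archive path is hypothetical, and enterprise archives typically rely on WORM or retention features rather than plain permission bits, but the idea is the same:

```python
# Strip write permissions from everything in the archive tier so files can be
# read for reference but not edited. Path is a hypothetical example.
import os
import stat

ARCHIVE = "/mnt/archive/projects"   # hypothetical second-tier archive root

for root, dirs, files in os.walk(ARCHIVE):
    for name in files:
        path = os.path.join(root, name)
        mode = os.stat(path).st_mode
        # Remove owner, group, and other write bits.
        os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))
```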

If no other message resonates, we hope we've at least established that traditional methods of accommodating data growth do not make sense when it's inactive, infrequently accessed, obsolete files that are typically responsible for that growth. Expensive primary storage, and the Sisyphean task of regular backups, will not solve the problem of inactive files.

More Stories By Bruce Backa

Bruce Backa, CEO of NTP Software®, a leader in File Data Management, is recognized as a Technology Pioneer by the World Economic Forum. Prior to founding NTP Software nearly 20 years ago, he held CTO positions at the international insurance and financial services firm AIG and the American Stock Exchange. He holds several U.S. and international patents related to data management, and was awarded the Kemeny Prize in Computing from Dartmouth College.
