


Big Data Journal: Article

The ‘A-ha’ About Active and Inactive Data

Despite advances in technology, the way we manage the growth in files has not changed much in the last decade or two

How many kinds of files are in use in your organization? Fifty? A hundred? More?

Wrong. There are only two kinds of files: active files and inactive files. (I admit it was a trick question.) Users create a file - a spreadsheet, an image, an invoice, etc. As long as the file is being worked with, it's an active file. Users can share, collaborate, and edit the file to their heart's content, but eventually the file is no longer current, and it becomes inactive.

But in this litigious and increasingly regulated world, there is no such thing as deleting a file. You're stuck with that inactive file, for years, if not forever, like your 30-year-old college dropout kid who still lives in the basement.

Despite advances in technology and a proliferation of alternatives, the way we manage the growth in files has not changed much in the last decade or two. We let the volume of files grow until it consumes all of the available resources, and then we throw money at the problem. Or we throw money at someone else who does essentially the same thing, just with cheaper disks.

Years ago, that was not a bad strategy. The majority of all files were active, and the cost of keeping inactive files in the same pool was small. Today, it's an outdated and costly strategy, one that few CIOs and budget-minded IT administrators feel particularly good about anymore. That's one positive side effect of the global economic recession: operations of all sizes, including Fortune 1000s and large government agencies, have much less tolerance for deferring a problem by throwing money at it.

It frankly makes no sense to spend thousands to millions of dollars on more and more primary storage to store obsolete or redundant files. It makes even less sense when we consider that runaway data growth is something of a myth when it comes to active files: in most organizations, the volume of active files - the files people create and use to do their jobs day in and day out - remains nearly fixed.

How can this be? Consider the way most workplaces operate. Your company pays its staff to work on different things this week than they worked on last week or last month. Employees create new files for each new project, and their previous ones become inactive. In a typical organization, then, it is inactive data, not active data, that drives data growth; the number of active files is roughly proportional to the number of employees available to use them. For the most part, in three years you'll have about the same number of active files as you have today, assuming you have the same headcount. Most companies add a few employees, let a few go, generate more files during the busy season and fewer during the lean times. In general, however, the number of users and the number of active files will stay relatively constant. In this post-meltdown economy, no one is hiring like there's no tomorrow. In fact, five-figure layoffs are still a common occurrence; they just don't make the news anymore.

If you've been in business for at least a few years, it's safe to assume two-thirds of your files are inactive. What's more, once saved to the network, 80 percent of your files will never even be looked at again.

Note that "looking at" an old file is not the same as "working on" an old file. An inactive file may be requested for research or reference purposes, but it won't be edited. When it reaches that stage of inactivity, it's in essence a read-only file. This means your inactive file data should be managed as a read-only archive - since the odds-on bet is that an inactive file will never be read again, and the next best bet is that it will only be read, not changed.
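As a rough illustration of that classification (a sketch, not any vendor's tool; the 90-day cutoff is an assumption), a script can estimate the active/inactive split from each file's last-modified time:

```python
import os
import time

INACTIVE_AFTER_DAYS = 90  # illustrative cutoff; real policies vary


def classify_tree(root):
    """Count active vs. inactive files by last-modified time."""
    cutoff = time.time() - INACTIVE_AFTER_DAYS * 86400
    active = inactive = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            try:
                mtime = os.path.getmtime(os.path.join(dirpath, name))
            except OSError:
                continue  # file vanished or unreadable; skip it
            if mtime < cutoff:
                inactive += 1
            else:
                active += 1
    return active, inactive
```

Running a sweep like this over a file server is often the first step in discovering how large the inactive share really is.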

With these considerations in mind, purchasing additional primary storage to manage inactive file growth is unjustifiable, especially considering that storage operating costs can be five times the initial acquisition costs.

The growth of inactive files has an additional administrative cost when one adds on the constant cycle of backups. It's not unusual for an organization performing a standard weekly full backup, plus nightly incremental backups, to have 15, 20, even 25 copies of file data at any time. One of our customers, a multinational banking and financial services firm, was maintaining 28 copies of each file until we came in to provide relief.
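The copy count follows from straightforward retention arithmetic. As an illustrative sketch (every retention window below is an assumption, not any particular firm's policy), note that an inactive file never changes, so every retained full backup carries yet another copy of it:

```python
# Illustrative retention arithmetic; all windows are assumed.
primary_copy = 1    # the live copy on primary storage
weekly_fulls = 4    # weekly full backups kept for a month
monthly_fulls = 12  # month-end fulls kept for a year
yearly_fulls = 7    # year-end fulls kept for seven years

total_copies = primary_copy + weekly_fulls + monthly_fulls + yearly_fulls
print(total_copies)  # 24 copies of a file nobody will ever edit again
```

Even modest retention schedules land in the 15-to-25-copy range the way this example does; longer regulatory retention pushes the count higher still.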

Employing the same old backup tactics with online storage saves money on hardware, but does not address the core problem of file proliferation. Transferring inactive files to less costly archival storage media works well on paper, but not in practice. Both methods are inefficient and risky. Physically moving a petabyte of data can take months, and finding a file can take arduously long (if it's ever found at all; more than 60 percent of the time, it's not). This can spell the end of the business when a federal court judge comes around asking the company to produce a few thousand documents.

The real savings kick in when the typical file backup strategy is scrapped in favor of tiering, which is based on an online archive for inactive files. The key to achieving this is controlling and managing how and when files are moved - in other words, when a file becomes inactive.

The goal is to categorize data, properly manage it, and move the right data to the second tier to reduce costs, address compliance issues, and meet electronic discovery requirements. Granted, the cost, time, and complexity involved in scanning the entire file system and moving data to an archive can be onerous. However, event- and policy-driven software that automates these processes based on age, type, size or access frequency makes this a much smaller chore.
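A minimal sketch of what such policy-driven software does, assuming hypothetical thresholds and a simple move-to-archive action (real products add auditing, stub files, scheduling, and event triggers):

```python
import os
import shutil
import time

# Hypothetical policies: a file is tiered if it matches any predicate.
POLICIES = [
    (lambda st: time.time() - st.st_atime > 180 * 86400,
     "not accessed in 180 days"),
    (lambda st: st.st_size > 500 * 1024 * 1024,
     "larger than 500 MB"),
]


def tier_inactive(primary_root, archive_root):
    """Move files matching any policy from primary to archive storage."""
    moved = []
    for dirpath, _dirs, files in os.walk(primary_root):
        for name in files:
            src = os.path.join(dirpath, name)
            st = os.stat(src)
            if any(pred(st) for pred, _desc in POLICIES):
                rel = os.path.relpath(src, primary_root)
                dst = os.path.join(archive_root, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)  # preserve the path layout in the archive
                moved.append(rel)
    return moved
```

The point of the sketch is the shape of the solution: declarative policies on age, type, size, or access frequency, evaluated automatically, so no administrator has to hand-pick which files move to the second tier.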

With this method, it's not unusual to see a two-thirds reduction in the cost of storing and protecting inactive files in the first year alone, and the savings accumulate as the inactive file load grows. One manufacturing customer calculated their first-year savings as 82 percent. Even if the initial costs of this tiered approach are twice as much as storage hardware (which seems unlikely), there would still be a savings as time goes by, since operating costs always exceed acquisition costs.
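The arithmetic behind that last claim can be sketched with assumed dollar figures (every number here is illustrative, not a quote from any vendor or customer):

```python
# Illustrative lifetime-cost arithmetic; all figures are assumptions.
# Primary storage: operating costs run ~5x acquisition over its life.
primary_acquisition = 100_000
primary_lifetime_cost = primary_acquisition * (1 + 5)  # hardware + 5x opex

# Tiered archive: assume the hardware costs twice as much up front,
# but a read-only tier carries far lower operating overhead (1x, assumed).
archive_acquisition = 2 * primary_acquisition
archive_lifetime_cost = archive_acquisition * (1 + 1)  # hardware + 1x opex

print(primary_lifetime_cost, archive_lifetime_cost)  # 600000 400000
```

Because operating costs dominate acquisition costs, the tiered approach comes out ahead over time even under the pessimistic assumption that its hardware costs double.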

This second tier for inactive data can be maintained as a read-only archive, since it's very unlikely these files will be changed again. For example, an engineering firm may want to view the architectural plans from a project they did two years ago, but it's unlikely they would make any edits on that original file. This can deliver additional savings over time.

If no other message resonates, we hope we've at least established that traditional methods of accommodating data growth do not make sense when it's inactive, infrequently accessed, obsolete files that are typically responsible for that growth. Expensive primary storage, and the Sisyphean task of regular backups, will not solve the problem of inactive files.

More Stories By Bruce Backa

Bruce Backa, CEO of NTP Software®, a leader in File Data Management, is recognized as a Technology Pioneer by the World Economic Forum. Prior to founding NTP Software nearly 20 years ago, he held CTO positions at the international insurance and financial services firm AIG and the American Stock Exchange. He holds several U.S. and international patents related to data management, and was awarded the Kemeny Prize in Computing from Dartmouth College.


