The ‘A-ha’ About Active and Inactive Data

Despite advances in technology, the way we manage the growth in files has not changed much in the last decade or two

How many kinds of files are in use in your organization? Fifty? A hundred? More?

Wrong. There are only two kinds of files: active files and inactive files. (I admit it was a trick question.) Users create a file - a spreadsheet, an image, an invoice, etc. As long as the file is being worked with, it's an active file. Users can share, collaborate, and edit the file to their heart's content, but eventually the file is no longer current, and it becomes inactive.

But in this litigious and increasingly regulated world, there is no such thing as deleting a file. You're stuck with that inactive file, for years, if not forever, like your 30-year-old college dropout kid who still lives in the basement.

Despite advances in technology and a proliferation of alternatives, the way we manage the growth in files has not changed much in the last decade or two. We let the volume of files grow until it consumes all of the available resources, and then we throw money at the problem. Or we throw money at someone else who does essentially the same thing, just with cheaper disks.

Years ago, it was not a bad strategy. The majority of all files were active. The cost of having inactive files in the same pool was small. Today, it's an outdated strategy, and it's a costly strategy, one that few CIOs and budget-minded IT administrators feel particularly good about anymore. That's one positive side effect of the global economic recession: operations of all sizes, including Fortune 1000s and large government agencies, have much less tolerance for deferring a problem by throwing money at it.

It frankly makes no sense to spend thousands to millions of dollars on more and more primary storage to store obsolete or redundant files. It makes even less sense when we consider that runaway data growth is something of a myth when it comes to active files: in most organizations, the volume of active files - the files people create and use to do their jobs day in and day out - remains nearly fixed.

How can this be? Consider the way most workplaces operate. Your company pays its staff to work on different things this week than what they worked on last week or last month. Employees create new files for each new project, and their previous ones become inactive. In a typical organization, it is inactive data, not active data, that drives data growth; the number of active files is proportional to the number of employees available to create and use them. For the most part, in three years you'll have about the same number of active files as you have today, assuming you have the same headcount. Most companies add a few employees, let a few employees go, generate more files during the busy season, then fewer during the lean times. In general, however, the number of users and the number of active files will stay relatively constant. In this post-meltdown economy, no one is hiring like there's no tomorrow. In fact, five-figure layoffs are still a common occurrence; they just don't make the news anymore.

If you've been in business for at least a few years, it's safe to assume two-thirds of your files are inactive. What's more, once saved to the network, 80 percent of your files will never even be looked at again.

Note that "looking at" an old file is not the same as "working on" an old file. An inactive file may be requested for research or reference purposes, but it won't be edited. When it reaches that stage of inactivity, it's in essence a read-only file. This means your inactive file data should be managed as a read-only archive: the odds-on bet is that it will never be read again, and the next-best bet is that it will only ever be read.

With these considerations in mind, purchasing additional primary storage to manage inactive file growth is unjustifiable, especially considering that storage operating costs can be five times the initial acquisition costs.

The growth of inactive files carries an additional administrative cost once the constant cycle of backups is factored in. It's not unusual for an organization performing a standard weekly full backup, plus nightly incremental backups, to have 15, 20, even 25 copies of file data at any time. One of our customers, a multinational banking and financial services firm, was maintaining 28 copies of each file until we came in to provide relief.
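To see how the copy count climbs, here is a deliberately simplified sketch of the arithmetic. The retention figures are hypothetical, chosen only to illustrate the effect, and the model assumes the file never changes once it goes inactive.

```python
# Rough sketch of how backup copies of a single inactive (never-modified)
# file accumulate under a weekly full + nightly incremental rotation.
# Retention figures are hypothetical, chosen only to illustrate the math.

def copies_retained(weekly_fulls_kept: int, monthly_fulls_kept: int) -> int:
    """Incrementals skip unchanged files, so for an inactive file the copy
    count is driven almost entirely by how many full backups are retained."""
    return weekly_fulls_kept + monthly_fulls_kept

# e.g. keep 13 weekly fulls (one quarter) plus 12 monthly fulls (one year):
print(copies_retained(weekly_fulls_kept=13, monthly_fulls_kept=12))  # -> 25
```

Under a retention policy like that, every retained full backup carries yet another copy of a file no one has touched in months.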

Employing the same old backup tactics with online storage saves money on hardware, but does not address the core problem of file proliferation. Transferring inactive files to less costly archival storage media works well on paper, but not in practice. Both methods suffer from inefficiencies and risks. Physically moving a petabyte of data can take months, and finding a file can take an arduously long time (if it's ever found at all; more than 60 percent of the time, it's not). This can spell the end of the business when a federal court judge comes around asking the company to produce a few thousand documents.

The real savings kick in when the typical file backup strategy is scrapped in favor of tiering, built around an online archive for inactive files. The key to achieving this is controlling and managing how and when files are moved - in other words, deciding when a file becomes inactive.

The goal is to categorize data, properly manage it, and move the right data to the second tier to reduce costs, address compliance issues, and meet electronic discovery requirements. Granted, the cost, time, and complexity involved in scanning the entire file system and moving data to an archive can be onerous. However, event- and policy-driven software that automates these processes based on age, type, size or access frequency makes this a much smaller chore.
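As a rough illustration of what such policy automation boils down to, here is a minimal age-based sketch. It is not any vendor's product or API; the paths and the 180-day threshold are assumptions made for the example.

```python
# Minimal sketch of an age-based tiering policy: files not accessed or
# modified in N days are moved from primary storage to an archive tier.
# The paths and the 180-day threshold are illustrative assumptions.
import os
import shutil
import time

PRIMARY = "/data/primary"   # assumed tier-1 (primary) file share
ARCHIVE = "/data/archive"   # assumed second-tier archive location
MAX_IDLE_DAYS = 180         # policy: a file is inactive after 180 idle days

def archive_inactive_files(primary: str, archive: str, max_idle_days: int) -> None:
    cutoff = time.time() - max_idle_days * 86400
    for root, _dirs, files in os.walk(primary):
        for name in files:
            src = os.path.join(root, name)
            st = os.stat(src)
            # Treat the most recent of access/modify time as "last used".
            if max(st.st_atime, st.st_mtime) >= cutoff:
                continue  # still active, leave it on primary storage
            # Recreate the same relative path under the archive tier.
            rel = os.path.relpath(src, primary)
            dst = os.path.join(archive, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)

if __name__ == "__main__":
    archive_inactive_files(PRIMARY, ARCHIVE, MAX_IDLE_DAYS)
```

A production policy engine would add the other criteria mentioned above (type, size, access frequency), leave a stub behind for transparent recall, and react to events rather than rescanning everything, but the decision it automates is the same: has this file gone inactive?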

With this method, it's not unusual to see a two-thirds reduction in the cost of storing and protecting inactive files in the first year alone, and the savings accumulate as the inactive file load grows. One manufacturing customer calculated their first-year savings at 82 percent. Even if the initial cost of this tiered approach were twice that of the storage hardware it replaces (which seems unlikely), there would still be savings over time, since operating costs always exceed acquisition costs.
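A back-of-the-envelope model shows why the arithmetic still favors tiering even at a higher acquisition price. The dollar figures and the archive operating-cost fraction below are hypothetical; only the five-times operating-cost multiple comes from the discussion above.

```python
# Simplified cost model: keep inactive data on primary storage versus move
# it to a second tier. Dollar figures and the archive opex fraction are
# hypothetical; the ~5x operating-cost multiple is the one cited above.
PRIMARY_ACQ_PER_TB    = 1_000                    # hypothetical $/TB to buy primary storage
OPEX_MULTIPLE         = 5                        # operating cost ~5x acquisition cost
ARCHIVE_ACQ_PER_TB    = 2 * PRIMARY_ACQ_PER_TB   # worst case: tier costs twice as much up front
ARCHIVE_OPEX_FRACTION = 0.2                      # hypothetical: archive opex is 20% of primary opex

def total_cost(acq_per_tb: float, opex_per_tb: float, tb: float) -> float:
    """Acquisition plus cumulative operating cost for a given volume."""
    return tb * (acq_per_tb + opex_per_tb)

inactive_tb = 100   # hypothetical volume of inactive data

all_primary = total_cost(PRIMARY_ACQ_PER_TB,
                         PRIMARY_ACQ_PER_TB * OPEX_MULTIPLE, inactive_tb)
tiered      = total_cost(ARCHIVE_ACQ_PER_TB,
                         PRIMARY_ACQ_PER_TB * OPEX_MULTIPLE * ARCHIVE_OPEX_FRACTION,
                         inactive_tb)

print(f"All-primary: ${all_primary:,.0f}   Tiered: ${tiered:,.0f}")
# With these assumptions the tier wins despite the 2x acquisition premium,
# because the recurring operating cost dominates the total.
```

Under those assumptions the tiered approach comes out at roughly half the cost, and the gap widens every year the recurring operating costs are paid.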

This second tier for inactive data can be maintained as a read-only archive, since it's very unlikely these files will be changed again. For example, an engineering firm may want to view the architectural plans from a project they did two years ago, but it's unlikely they would make any edits on that original file. This can deliver additional savings over time.
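One simple, assumed way to enforce that read-only posture at the filesystem level is to strip write permissions across the archive tier; an object store's WORM policy or a share-level ACL would accomplish the same thing.

```python
# Sketch: remove write permission from everything under the archive tier so
# the second tier behaves as a read-only store. The path is an assumption.
import os
import stat

ARCHIVE = "/data/archive"   # assumed second-tier archive location

def make_read_only(archive_root: str) -> None:
    for root, _dirs, files in os.walk(archive_root):
        for name in files:
            path = os.path.join(root, name)
            mode = os.stat(path).st_mode
            # Clear the owner/group/other write bits; keep read/execute bits.
            os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

if __name__ == "__main__":
    make_read_only(ARCHIVE)
```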

If no other message resonates, we hope we've at least established that traditional methods of accommodating data growth do not make sense when it's inactive, infrequently accessed, obsolete files that are typically responsible for that growth. Expensive primary storage, and the Sisyphean task of regular backups, will not solve the problem of inactive files.

More Stories By Bruce Backa

Bruce Backa, CEO of NTP Software®, a leader in File Data Management, is recognized as a Technology Pioneer by the World Economic Forum. Prior to founding NTP Software nearly 20 years ago, he held CTO positions at the international insurance and financial services firm AIG and the American Stock Exchange. He holds several U.S. and international patents related to data management, and was awarded the Kemeny Prize in Computing from Dartmouth College.
