Data Virtualization – The Quest for Agility

In the quest for agility, let’s not sweep productivity under the carpet

Yes, data virtualization is definitely about agility. I covered this from different angles in my previous article. Agility in this context means accelerating the time-to-delivery of new critical data and reports that the business needs and trusts. It is also about an architecture that is resilient to changes in the underlying data sources, without impacting the consuming applications. It's about doing data virtualization the right way, so you can deliver new critical data and reports in days rather than months.

However, what about productivity? A discussion of agility cannot and must not exclude productivity; the two go hand in hand. Whether measured as the rate at which goods are produced or work is completed, or as the efficiency of production, productivity is about doing something well and effectively. This calls for a deeper conversation around productivity, both as a capability and as a benefit. Yet it looks like somebody forgot to talk about it - or did they?

According to the 2011 TDWI BI Benchmark Report - "By a sizable margin, development and testing is the predominant activity for BI/DW teams, showing that initiatives to build and expand BI systems are under way across the industry. Nearly 53 percent of a team's time is spent on development and testing, followed by maintenance and change management at 26 percent." If a majority of time is spent in development and testing, and if BI agility is the end goal, isn't increased productivity absolutely critical?

I think Heraclitus of Ephesus was talking about productivity in the context of data virtualization when he said, "a hidden connection is stronger than an obvious one." There is definitely a hidden connection here. However, productivity often gets brushed under the carpet - not because it isn't critical, but because data virtualization built on data federation has its roots in SQL or XQuery. As we know, with manual coding, limited reuse, and unnecessary work, productivity simply goes out the window.

Let's see how and where productivity fits in a data virtualization project. Such a project involves a discrete set of steps, starting from defining the data model to deploying the optimized solution. Like a piece of art, each step can be approached differently - either painfully, or by applying best practices that support productivity. Here's the typical life cycle of a data virtualization project, as prescribed by industry architects. By questioning how each step is undertaken, we can understand the impact on productivity:

  1. Model - define and represent back-end data sources as common business entities
  2. Access and Merge - federate data in real-time across several heterogeneous data sources
  3. Profile - analyze and identify issues in the federated data
  4. Transform - apply advanced transformations including data quality to federated data
  5. Reuse - rapidly and seamlessly repurpose the same virtual view for any application
  6. Move or Federate - reuse the virtual view for batch use cases
  7. Scale and Perform - leverage optimizations, caching and patterns (e.g., replication, ETL, etc.)
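The "access and merge" idea at the heart of these steps can be sketched in miniature. The following is a hypothetical illustration, not Informatica's implementation: two heterogeneous "sources" - a relational table (simulated with in-memory SQLite) and an application feed (simulated as a list of dicts) - are exposed through one virtual view that federates them at query time, with nothing staged or copied. All names here (`customer_revenue_view`, the `orders` and `crm` sources) are invented for the example.

```python
import sqlite3

# Source 1: a relational "orders" system (simulated with in-memory SQLite).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [(1, 250.0), (2, 99.5), (1, 80.0)])

# Source 2: a CRM feed (simulated as a list of dicts, e.g. from a REST API).
crm = [{"customer_id": 1, "name": "Acme Corp"},
       {"customer_id": 2, "name": "Globex"}]

def customer_revenue_view():
    """A 'virtual view': federates both sources at query time.

    Nothing is physically moved into a warehouse; each call reads the
    live sources and merges them into one common business entity.
    """
    totals = dict(db.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"))
    return [{"name": c["name"], "revenue": totals.get(c["customer_id"], 0.0)}
            for c in crm]

print(customer_revenue_view())
# [{'name': 'Acme Corp', 'revenue': 330.0}, {'name': 'Globex', 'revenue': 99.5}]
```

The point of the sketch: the consuming application sees one entity, while the mapping behind it can change sources without the consumer noticing.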

Let's start with model. The questions to ask here are: is there an existing data model? If yes, can you easily import it? That means being able to connect to, and pull a model from, the numerous modeling tools in existence. If not - if you need to start from scratch - you must be able to jump-start the creation of a logical data model with a metadata-driven, graphical approach. Ideally, this should be a business-user-friendly experience, since business users know the data best. Correct?

Next, let's consider access and merge. Yes, it's the domain of data federation. It's about making many data sources look like one. However, in my discussion about why a one-trick pony won't cut it, I mentioned a recent blog by Forrester Research, Inc. It states that traditional BI approaches often fall short because BI hasn't fully empowered information workers, who still largely depend on IT. Let's ask the question - can business users also access and merge diverse data, directly, without help from IT?

After you federate data from several heterogeneous data sources, it's all over, right? Wrong! That is where data virtualization built on data federation ends. To do data virtualization the right way, the show must go on, logically speaking. The same business user (not IT) must now be able to analyze not just the data sources but also the integration logic - the source, the inline transformations, or the virtual target. This is profiling of federated data - which must mean no staging and no further processing.
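What "profiling in flight" might look like can be sketched as follows - a hypothetical, minimal illustration (the `profile` function and sample rows are invented for the example): rows streaming out of a virtual view are profiled in a single pass, with per-column null counts and distinct-value counts, and nothing is staged.

```python
from collections import Counter

def profile(rows):
    """Profile federated rows in flight: per-column null counts and
    distinct-value counts, computed in one pass with no staging."""
    nulls, distinct = Counter(), {}
    for row in rows:
        for col, val in row.items():
            if val is None or val == "":
                nulls[col] += 1
            distinct.setdefault(col, set()).add(val)
    return {col: {"nulls": nulls[col], "distinct": len(vals)}
            for col, vals in distinct.items()}

# Rows as they might stream out of a virtual view:
sample = [{"name": "Acme Corp", "region": "West"},
          {"name": "Globex", "region": None},
          {"name": "Globex", "region": "West"}]
print(profile(sample))
# {'name': {'nulls': 0, 'distinct': 2}, 'region': {'nulls': 1, 'distinct': 2}}
```

Because the profiler consumes the same rows the consuming application would, the business user is inspecting the integration logic's actual output, not a stale copy.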

What follows next must be the logical progression of what needs to happen once the business user uncovers inconsistencies and inaccuracies. Yes, we need to apply advanced transformation logic, including data quality and data masking, to the federated data while it is in flight. This is where you must ask whether there are pre-built transformation libraries you can easily leverage on your canvas. We all realize what this means: not having to manually code such logic - manual coding would obviously limit any reuse.
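A toy sketch of the idea, with names invented for the example (`standardize_name`, `mask_ssn`, `compose` are not any vendor's API): transformations live in a small reusable library of row-level functions, chained over the federated stream so data quality and masking happen in flight, not in hand-written SQL per view.

```python
import re

# A small "library" of reusable transformations. Each is a plain function
# over a row, so the same logic can be chained into any view.

def standardize_name(row):
    row = dict(row)
    row["name"] = row["name"].strip().title()
    return row

def mask_ssn(row):
    row = dict(row)
    s = row["ssn"]
    row["ssn"] = re.sub(r"\d", "*", s[:-4]) + s[-4:]  # keep last 4 digits
    return row

def compose(*funcs):
    def pipeline(rows):
        for row in rows:          # applied in flight, row by row
            for f in funcs:
                row = f(row)
            yield row
    return pipeline

clean = compose(standardize_name, mask_ssn)
rows = [{"name": "  acme corp ", "ssn": "123-45-6789"}]
print(list(clean(rows)))
# [{'name': 'Acme Corp', 'ssn': '***-**-6789'}]
```

The design choice being illustrated: because each rule is a reusable object rather than inline code, fixing a rule once fixes it in every view that chains it.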

You have defined a data model, federated data, profiled it in real time, and applied advanced transformations on the fly - all without IT. Yes, get some technical help to develop new transformations if required, but look to do this graphically, in a metadata-driven way, where business and IT users collaborate instantly with role-based tools. Ask if you can take a virtual view and reuse it with just a few clicks. Check whether you could work with a few reusable objects instead of thousands of lines of SQL code.

We have discussed the steps involved in virtual data integration. Yes, there is no physical data movement, and yes, it's magical. However, ask what you can do if you need to persist data for compliance reasons. Or ask for the flexibility to also process in batch, when data volumes go beyond what even an optimized, scalable data virtualization solution can handle. Don't just add to the problem with yet another tool; ask if you can toggle between virtual and physical modes within the same solution.
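The virtual/physical toggle can be sketched in a few lines - a hypothetical illustration (the `deploy` function, `customer_totals` name, and SQLite stand-in are all invented for the example): the same mapping is deployed either as a view, computed on demand, or as a persisted table produced by the same logic run as a batch.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 250.0), (2, 99.5)])

# One mapping, expressed once:
VIEW_SQL = ("SELECT customer_id, SUM(amount) AS total FROM orders "
            "GROUP BY customer_id ORDER BY customer_id")

def deploy(mode):
    """Toggle deployment modes for the same mapping.

    'virtual'  -> a view: computed on demand, no data moved.
    'physical' -> a persisted table: same logic run as a batch, e.g. for
                  compliance retention or volumes too large to federate live.
    """
    if mode == "virtual":
        db.execute(f"CREATE VIEW customer_totals AS {VIEW_SQL}")
    elif mode == "physical":
        db.execute(f"CREATE TABLE customer_totals AS {VIEW_SQL}")

deploy("virtual")
print(db.execute("SELECT * FROM customer_totals").fetchall())
# [(1, 250.0), (2, 99.5)]
```

Consumers query `customer_totals` either way; only the deployment decision changes, which is the point of not needing a second tool for batch.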

Finally, as you prepare to deploy the virtual view, a fair question must be about performance. After all, we are talking about operating in a virtual mode. Ask about optimizations and get the skinny on caching. Try to go deep and find out how the engine has been built. Is it based on a proven, mature, and scalable data integration platform, or one that only does data federation, with all its limitations? Also, don't forget to ask whether the same solution leverages change data capture and replication patterns.
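To make the caching question concrete, here is a minimal, hypothetical sketch (the `CachedView` class and `fetch_totals` stand-in are invented for the example; real engines cache far more selectively): a virtual view's result is held for a time-to-live so repeated report requests don't re-hit every back-end source.

```python
import time

def fetch_totals():
    """Stand-in for an expensive federated query across live sources."""
    fetch_totals.calls += 1
    return {"Acme Corp": 330.0, "Globex": 99.5}
fetch_totals.calls = 0

class CachedView:
    """Cache a virtual view's result for ttl seconds, so repeated
    requests are served without touching the back-end sources."""
    def __init__(self, query, ttl=60.0):
        self.query, self.ttl = query, ttl
        self._value, self._at = None, float("-inf")

    def get(self):
        if time.monotonic() - self._at > self.ttl:   # miss or expired
            self._value = self.query()
            self._at = time.monotonic()
        return self._value

view = CachedView(fetch_totals, ttl=60.0)
view.get(); view.get()
print(fetch_totals.calls)  # 1 -> second call served from the cache
```

The trade-off to probe with a vendor is exactly the one this toy exposes: how staleness (the TTL) is bounded, and whether change data capture can invalidate the cache instead of a timer.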

If the solution can support the entire data virtualization life cycle discussed above, you are probably within reach of utopia: productivity and agility. The trick, however, is to ask the tough questions, as each step not only shaves many weeks off the process but also helps users become more efficient. Ah, this is beginning to sound like cutting the wait and waste in a process, as discussed in detail in the book Lean Integration. We've come full circle - but I think we just might have rescued productivity from being swept under the carpet.

•   •   •

Don't forget to join me at Informatica World 2012, May 15-18 in Las Vegas, to learn the tips, tricks and best practices for using the Informatica Platform to maximize your return on big data, and get the scoop on the R&D innovations in our next release, Informatica 9.5. For more information and to register, visit www.informaticaworld.com.

More Stories By Ash Parikh

Ash Parikh is responsible for driving Informatica’s product strategy around real-time data integration and SOA. He has over 17 years of industry experience in driving product innovation and strategy at technology leaders such as Raining Data, Iopsis Software, BEA, Sun and PeopleSoft. Ash is a well-published industry expert in the field of SOA and distributed computing and is a regular presenter at leading industry technology events like XMLConference, OASIS Symposium, Delphi, AJAXWorld, and JavaOne. He has authored several technical articles in leading journals including DMReview, AlignJournal, XML Journal, JavaWorld, JavaPro, Web Services Journal, and ADT Magazine. He is the co-chair of the SDForum Web services SIG.


