Data Virtualization – The Quest for Agility

In the quest for agility, let’s not sweep productivity under the carpet

Yes, data virtualization is definitely about agility. I covered this from different angles in my previous article. Agility in this context is about accelerating the time-to-delivery of new critical data and reports that the business needs and trusts. It is also about an architecture that is flexible to changes in the underlying data sources, without impacting the consuming applications. It's about doing data virtualization the right way so you can deliver new critical data and reports in days vs. months.

However, what about productivity? A discussion around agility cannot and must not exclude productivity; the two go hand-in-hand. Whether it is the rate at which goods are produced or work is completed, or a measure of the efficiency of production, productivity is about doing something well and effectively. This calls for a deeper conversation around productivity, both as a capability and as a benefit. However, it looks like somebody forgot to talk about this - or did they?

According to the 2011 TDWI BI Benchmark Report - "By a sizable margin, development and testing is the predominant activity for BI/DW teams, showing that initiatives to build and expand BI systems are under way across the industry. Nearly 53 percent of a team's time is spent on development and testing, followed by maintenance and change management at 26 percent." If a majority of time is spent in development and testing, and if BI agility is the end goal, isn't increased productivity absolutely critical?

I think Heraclitus of Ephesus was talking about productivity in the context of data virtualization when he said, "a hidden connection is stronger than an obvious one." There is definitely a hidden connection here. However, productivity often gets brushed under the carpet. Not because it isn't critical, but because data virtualization built on data federation has its roots in hand-written SQL or XQuery. As we know, with manual coding, limited reuse, and unnecessary work, productivity simply goes out the window.

Let's see how and where productivity fits in a data virtualization project. Such a project involves a discrete set of steps, starting from defining the data model to deploying the optimized solution. Like a piece of art, each step can be approached differently - either painfully, or by applying best practices that support productivity. Here's the typical life cycle of a data virtualization project, as prescribed by industry architects. By questioning how each step is undertaken, we can understand the impact on productivity:

  1. Model - define and represent back-end data sources as common business entities
  2. Access and Merge - federate data in real-time across several heterogeneous data sources
  3. Profile - analyze and identify issues in the federated data
  4. Transform - apply advanced transformations including data quality to federated data
  5. Reuse - rapidly and seamlessly repurpose the same virtual view for any application
  6. Move or Federate - reuse the virtual view for batch use cases
  7. Scale and Perform - leverage optimizations, caching and patterns (e.g., replication, ETL, etc.)

Let's start with model. The questions to ask here are: is there an existing data model? If yes, can you easily import it? This means you must be able to connect to, and import a model from, the numerous modeling tools in existence. If not - if you need to start from scratch - you must be able to jump-start the creation of a logical data model with a metadata-driven, graphical approach. Hopefully, this is a business-user-friendly experience, as the business user knows the data best. Correct?
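
To make the modeling step concrete, here is a minimal, vendor-neutral sketch in Python of a metadata-driven logical model. The entity, attribute, and source names are purely illustrative assumptions, not any particular product's API.

    # A hypothetical sketch of a "Customer" business entity defined once,
    # independent of where each attribute physically lives.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Attribute:
        name: str            # business-friendly attribute name
        source: str          # back-end system that holds the data
        source_column: str   # physical column in that system

    @dataclass
    class BusinessEntity:
        name: str
        attributes: list = field(default_factory=list)

    customer = BusinessEntity(
        name="Customer",
        attributes=[
            Attribute("customer_id", source="crm_db", source_column="cust_id"),
            Attribute("full_name", source="crm_db", source_column="name"),
            Attribute("lifetime_value", source="warehouse", source_column="ltv"),
        ],
    )

    # Print the logical-to-physical mapping the model captures.
    for attr in customer.attributes:
        print(f"{customer.name}.{attr.name} -> {attr.source}.{attr.source_column}")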

Next, let's consider access and merge. Yes, it's the domain of data federation. It's about making many data sources look like one. However, in my discussion about why a one-trick pony won't cut it, I mentioned a recent blog by Forrester Research, Inc. It states that traditional BI approaches often fall short because BI hasn't fully empowered information workers, who still largely depend on IT. Let's ask the question - can business users also access and merge diverse data, directly, without help from IT?
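
A simple way to picture "making many sources look like one" is the sketch below, which merges a relational table and a flat file into a single logical view at query time. The in-memory SQLite database and CSV string are stand-ins I have assumed for real back-end systems.

    # Federation sketch: two heterogeneous sources presented as one view,
    # joined on the fly with no staging. Sources and columns are illustrative.
    import csv, io, sqlite3

    # Source 1: a relational system (emulated with in-memory SQLite).
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (customer_id TEXT, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)",
                   [("C1", 120.0), ("C2", 75.5)])

    # Source 2: a flat file (emulated with an in-memory CSV).
    crm_csv = "customer_id,full_name\nC1,Ada Lovelace\nC2,Grace Hopper\n"
    crm = {row["customer_id"]: row for row in csv.DictReader(io.StringIO(crm_csv))}

    # The federated "view": one logical result built from both sources at read time.
    def customer_orders():
        for customer_id, amount in db.execute("SELECT customer_id, amount FROM orders"):
            yield {"customer_id": customer_id,
                   "full_name": crm[customer_id]["full_name"],
                   "amount": amount}

    for row in customer_orders():
        print(row)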

After you federate data from several heterogeneous data sources, it's all over, right? Wrong! That's where data virtualization built on data federation ends. To do data virtualization the right way, the show must go on, logically speaking. The same business user (not IT) must now be able to analyze not just the data sources, but also the integration logic - be it the source, the inline transformations, or the virtual target. This is profiling of federated data - which must mean no staging and no further processing.
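
Profiling federated data in flight can be pictured as computing statistics while the rows stream past, with nothing staged to disk. In this hedged sketch, the profile() helper and the sample rows are assumptions for illustration, not a product feature.

    # Profile rows from a virtual view as they stream by: per-column row counts,
    # null counts, and distinct-value counts, with no intermediate staging.
    from collections import Counter

    def profile(rows):
        stats = {}
        for row in rows:
            for column, value in row.items():
                col = stats.setdefault(column, {"count": 0, "nulls": 0,
                                                "distinct": Counter()})
                col["count"] += 1
                if value in (None, ""):
                    col["nulls"] += 1
                else:
                    col["distinct"][value] += 1
        return stats

    sample = [{"customer_id": "C1", "full_name": "Ada Lovelace", "amount": 120.0},
              {"customer_id": "C2", "full_name": "", "amount": 75.5}]
    for column, col in profile(sample).items():
        print(column, "rows:", col["count"], "nulls:", col["nulls"],
              "distinct:", len(col["distinct"]))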

What follows next is the logical progression of what needs to happen once the business user uncovers inconsistencies and inaccuracies. Yes, we need to apply advanced transformation logic, including data quality and data masking, to the federated data while it is in flight. This is where you must ask if there are pre-built libraries you can easily leverage in your canvas. We all realize this means not having to manually code such logic, since hand-coding obviously limits any reuse.
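
Pre-built, reusable transformation rules applied in flight might look something like the following sketch. The standardize_name and mask_email rules and the RULES mapping are hypothetical names used to illustrate the pattern, not a vendor library.

    # Reusable rules (data quality plus masking) applied to federated rows
    # in flight; the same rules can be shared across many virtual views.
    import re

    def standardize_name(value):
        """Data quality rule: trim and title-case a name."""
        return value.strip().title() if value else value

    def mask_email(value):
        """Masking rule: keep the domain, hide the local part."""
        return re.sub(r"^[^@]+", "*****", value) if value else value

    RULES = {"full_name": standardize_name, "email": mask_email}

    def transform(rows, rules=RULES):
        for row in rows:
            yield {col: rules.get(col, lambda v: v)(val) for col, val in row.items()}

    sample = [{"full_name": "  ada lovelace ", "email": "ada@example.com"}]
    print(list(transform(sample)))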

You have defined a data model, federated data, profiled it in real time and applied advanced transformations on the fly - all without IT. Yes, get some technical help to develop new transformations, if required. But look to do this graphically, in a metadata-driven way, where business and IT users collaborate instantly with role-based tools. Ask if you can take a virtual view and reuse it with just a few clicks. Check to see if you could work with a few reusable objects instead of thousands of lines of SQL code.
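
Reuse with a few small objects instead of thousands of lines of SQL might be pictured like this: one view definition repurposed for two different consumers. The view, its stubbed rows, and the consumer functions are assumed purely for illustration.

    # One reusable virtual view, two consumers: a text report and a JSON payload.
    import json

    def customer_view():
        """A single reusable virtual view (stubbed here with static rows)."""
        yield {"customer_id": "C1", "full_name": "Ada Lovelace", "amount": 120.0}
        yield {"customer_id": "C2", "full_name": "Grace Hopper", "amount": 75.5}

    def as_report(view):
        """Consumer 1: a fixed-width text report."""
        return "\n".join(f"{r['customer_id']:<6}{r['full_name']:<20}{r['amount']:>8.2f}"
                         for r in view)

    def as_json(view):
        """Consumer 2: a JSON payload for a web application."""
        return json.dumps(list(view), indent=2)

    print(as_report(customer_view()))
    print(as_json(customer_view()))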

We have discussed the steps involved in virtual data integration. Yes, there is no physical data movement, and yes, it's magical. However, ask what you can do if you need to persist data for compliance reasons. Or, ask for the flexibility to also process in batch, when data volumes go beyond what an optimized and scalable data virtualization solution can handle. Don't just add to the problem with yet another tool. Ask if you can toggle between virtual and physical modes with the same solution.
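
Toggling between virtual and physical modes with one solution might look like the sketch below, where the mode flag is the only thing that changes. The serve() function and the in-memory SQLite target are assumptions standing in for a real persistence layer.

    # The same logical view served virtually (federated on demand) or
    # physically (persisted in batch, e.g., for compliance or large volumes).
    import sqlite3

    def customer_view():
        yield ("C1", "Ada Lovelace", 120.0)
        yield ("C2", "Grace Hopper", 75.5)

    def serve(mode="virtual"):
        if mode == "virtual":
            # Federate: stream rows straight from the sources, no data movement.
            return list(customer_view())
        # Move: persist the same logical view as a physical table in batch.
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE customer (customer_id TEXT, full_name TEXT, amount REAL)")
        db.executemany("INSERT INTO customer VALUES (?, ?, ?)", customer_view())
        return db.execute("SELECT * FROM customer").fetchall()

    print(serve("virtual"))
    print(serve("physical"))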

Finally, as you prepare to deploy the virtual view, a fair question is performance. After all, we are talking about operating in a virtual mode. Ask about optimizations and get the skinny on caching. Try to go deep and find out how the engine has been built. Is it based on a proven, mature and scalable data integration platform, or one that only does data federation with all its limitations? Also, don't forget to ask whether the same solution leverages change data capture and replication patterns.
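
Caching sits at the heart of the performance question, and the basic trade-off can be sketched with a simple time-to-live cache around a federated query: repeated requests are served from memory until the cache expires. The cached_view() helper and the five-second TTL are illustrative choices, not how any particular engine implements its cache.

    # A simple TTL cache around a federated query: the sources are hit only
    # when the cached result has expired.
    import time

    def cached_view(fetch, ttl_seconds=60):
        state = {"expires": 0.0, "rows": None}
        def wrapper():
            now = time.monotonic()
            if state["rows"] is None or now >= state["expires"]:
                state["rows"] = list(fetch())         # refresh from back-end sources
                state["expires"] = now + ttl_seconds  # start a new freshness window
            return state["rows"]                      # otherwise serve from cache
        return wrapper

    def expensive_federated_query():
        print("fetching from sources...")
        return [{"customer_id": "C1", "amount": 120.0}]

    customers = cached_view(expensive_federated_query, ttl_seconds=5)
    print(customers())  # first call hits the sources
    print(customers())  # second call is served from the cache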

If the solution can support the entire data virtualization life cycle discussed above, you are probably within reach of utopia - productivity and agility. The trick, however, is to ask the tough questions - as each step not only shaves many weeks off the process but also helps users become more efficient. Ah, this is beginning to sound like cutting the wait and waste in a process, as discussed in detail in the book Lean Integration. We've come full circle. But I think we just might have successfully rescued productivity from being swept under the carpet.

•   •   •

Don't forget to join me at Informatica World 2012, May 15-18 in Las Vegas, to learn the tips, tricks and best practices for using the Informatica Platform to maximize your return on big data, and get the scoop on the R&D innovations in our next release, Informatica 9.5. For more information and to register, visit www.informaticaworld.com.

More Stories By Ash Parikh

Ash Parikh is responsible for driving Informatica’s product strategy around real-time data integration and SOA. He has over 17 years of industry experience in driving product innovation and strategy at technology leaders such as Raining Data, Iopsis Software, BEA, Sun and PeopleSoft. Ash is a well-published industry expert in the field of SOA and distributed computing and is a regular presenter at leading industry technology events like XMLConference, OASIS Symposium, Delphi, AJAXWorld, and JavaOne. He has authored several technical articles in leading journals including DMReview, AlignJournal, XML Journal, JavaWorld, JavaPro, Web Services Journal, and ADT Magazine. He is the co-chair of the SDForum Web services SIG.


