By Dana Gardner
July 1, 2014 09:30 AM EDT
When Capgemini's business information management (BIM) practice needed to provide big data capabilities to its insurance company customers, it had to deliver the right information to businesses much faster -- from the very bottom up.
That meant an improved technical design and architectural approach to delivering information through business intelligence (BI) and analytics. The ability to bring together structured and unstructured data -- and to slice and dice that data rapidly, not only deploying it but also executing quickly for client organizations -- was critical for Capgemini.
That's because Capgemini's Financial Services Global Business Unit, based in the United Kingdom, must drive better value through its principal-level and senior-level consultants as they work with group-level CEOs in the financial services, insurance, and capital markets arenas. Their main focus is strategy and roadmap development, consulting work, enterprise information architecture, and enterprise information strategy with COO- and CFO-level customers.
BriefingsDirect had an opportunity to learn first-hand how big data and analysis help Capgemini's Global 500 clients extract the most pressing insights from huge data volumes. To find out more, we interviewed Ernie Martinez, Business Information Management Head at the Capgemini Financial Services Global Business Unit in London. The discussion, at the HP Discover conference in Barcelona, was moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.
Here are some excerpts:
Gardner: Risk has always been with us. But is there anything new, pressing, or different about the types of risks that your clients are trying to reduce and understand?
Martinez: I don't think it's as much about what's new within the risk world, as much as it's about the time it takes to provision the data so companies can make the right decisions faster, therefore limiting the amount of risk they may take on in issuing policies or taking on policies with new clients.
Gardner: In addition to the risk issue, of course, there is competition. The speed of business is picking up, and we’re still seeing difficult economic climates in many markets. How do you step into this environment and find a technology that can improve things? What have you found?
Martinez: There is the technology aspect of delivering the right information to business faster. There is also the business-driven way of delivering that information faster to business.
The BIM practice is a global practice. We’re ranked in the upper right-hand quadrant of Gartner's assessment as one of the best BIM practices out there, with about 7,000 BIM resources worldwide.
Our focus is on driving better value to the customer. So we have principal-level and senior-level consultants that work with group-level CEOs in the financial services, insurance, and capital markets arenas. Their main focus is to drive a strategy and roadmap, consulting work, enterprise information architecture, and enterprise information strategy with a lot of those COO- and CFO-level customers.
We then drive more business into the technical design and architectural way of delivering information in business intelligence (BI) and analytics. Once we define what the road to good looks like for an organization -- when you talk about integrating information across the enterprise -- it's about what that path to good looks like and what key initiatives an organization must undertake to get there.
This is where our technical design, business analysis, and data analysis consultants fit in. They’re going in to work with the business to define what they need to see from their information to help them make better decisions.
Gardner: Of course, the very basis of this is to identify the information, find the information, and put the information in a format that can be analyzed. Then, do the analysis, speed this all up, and manage it at scale and at the lowest possible cost. It’s a piece of cake, right? Tell us about the process you go through and how you decide what solutions to use and where the best bang for the buck comes from?
Martinez: Our approach is to take that senior-level expertise in big data and analytics, bring that into our practice, put that together with our business needs across financial services, insurance, and capital markets, and begin to define valid use cases that solve real business problems out there.
We’re a consulting organization, and I expect our teams to be able to be subject matter experts on what's happening in the space and also have a good handle on what the business problems are that our customers are facing. If that’s true, then we should be able to outline some valid use cases that are going to solve some specific problems for business customers out there.
In doing so, we’ll define that use case. We’ll do the research to validate that indeed it is a business problem that's real. Then we’ll build the business case that outlines that if we do build this piece of intellectual property (IP), we believe we can go out and proactively affect the marketplace and help customers out there. This is exactly what we did with HP and the HAVEn platform.
Why Capgemini and our BIM practice jumped into a partnership with HP and Vertica on the HAVEn platform is really about the ability to deliver the right information to business faster from the bottom up. That means the infrastructure and the middleware by which we serve that data to business. From the top down, we work with business in a more iterative fashion, delivering value quickly out of the data that they are trying to harvest.
Gardner: So we’re talking about a situation where you want wide applicability of the technology across many aspects of what you are doing -- something that makes sense economically -- but of course it also has to be the right tool for the job; it has to go deep and wide. You’re in a proof-of-concept (POC) stage. How did you come to that? What were some of the chief requirements you had for striking that right balance of deep and wide?
Martinez: We, as an organization, believe that our goal as BI and analytics professionals is to deliver the right information faster to business. In doing so, you look at the technologies that are out there that are positioned to do that. You look at the business partners that have that mentality to actually execute in that manner. And then you look at an organization, like ours, whose sole purpose is to mobilize quickly and deliver value to the customer.
I think it was a natural fit. When you look at HP Vertica in the HAVEn platform, the ability to integrate social media data through Autonomy and then of course through Vertica and Hadoop -- the integration of the entire architecture -- gives us the ability to do many things.
But number one, it's the ability to bring in structured and unstructured data, and be able to slice and dice that data in a rapid fashion; not only deploy it, but also execute rapidly for organizations out there.
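As a purely illustrative sketch (not Capgemini's or HP's actual implementation), combining structured records with signals derived from unstructured text might look like this toy Python example, where hypothetical policy rows are enriched with a naive word-count sentiment score:

```python
# Illustrative only: joining structured policy records with scores
# derived from unstructured text. All names, records, and the scoring
# approach are hypothetical, not the HAVEn platform's actual mechanics.

# Structured data: policy records
policies = [
    {"policy_id": 1, "customer": "Acme Ltd", "annual_premium": 12000},
    {"policy_id": 2, "customer": "Globex", "annual_premium": 8500},
]

# Unstructured data: free-text mentions of each customer
mentions = {
    "Acme Ltd": ["great claims service", "renewal was painless"],
    "Globex": ["slow payout", "poor support experience"],
}

POSITIVE = {"great", "painless", "fast", "helpful"}
NEGATIVE = {"slow", "poor", "painful", "denied"}

def sentiment_score(texts):
    """Naive sentiment: +1 per positive word, -1 per negative word."""
    score = 0
    for text in texts:
        words = set(text.lower().split())
        score += len(words & POSITIVE) - len(words & NEGATIVE)
    return score

# "Slice and dice": enrich each structured record with a derived score
enriched = [
    {**p, "sentiment": sentiment_score(mentions.get(p["customer"], []))}
    for p in policies
]

for row in enriched:
    print(row["customer"], row["sentiment"])
```

In a real deployment the unstructured scoring and the structured join would each run at scale in their respective engines; the sketch only shows the shape of the enrichment.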
Over the course of the last six months of 2013, that conversation began to blossom into a relationship. We all work together as a team, and we think we can mobilize not just the application or the solution that we’re thinking about, but the entire infrastructure, and deliver it to our customers quickly. That's where we’re at.
What that means is that once we partnered and got the go ahead with HP Vertica to move forward with the POC, we mobilized a solution in less than 45 days, which I think shows the value of the relationship from the HP side as well as from Capgemini.
Gardner: Down the road, after some period of implementation, there are general concerns about scale when you’re dealing with big data. Because you’re near the beginning of this, how do you feel about the ability for the platform to work to whatever degree you may need?
Martinez: Absolutely no concern at all. Being here at HP Discover has certainly solidified in my mind that we’re betting on the right horse with their ability to scale. If you heard some of the announcements coming out, they’re talking about the ability to take on big data. They’re using Vertica and the HAVEn platform.
There’s absolutely zero question in my mind that organizations out there can leverage this platform and grow with it over time. Also, it gives us the ability to be able to do some things that we couldn’t do a few years back.
Gardner: Ernie, let's get back to the business value here. Perhaps you can identify some of the types of companies that you think would be in the best position to use this. How will this hit the road? What are the sweet spots in the market -- the applications you think are the most urgent and the right fit for this?
Martinez: When you talk about the largest insurers around the world -- whether Zurich, Farmers in the US, or Liberty Mutual, you name it -- these are some of our friendly customers that we are talking to that are providing feedback to us on this solution.
We’ll incorporate that feedback. We’ll then take that to some targeted customers in North America, UK, and across Europe, that are primed and in need of a solution that will give them the ability to not only assess risk more effectively, but reduce the time to be able to make these type of decisions.
Reducing the time to provision data reduces costs by integrating data across multiple sources, whether it be customer sentiment from the Internet, from Twitter and other areas, to what they are doing around their current policies. It allows them to identify customers that they might want to go after. It will increase their market share and reduce their costs. It gives them the ability to do many more things than they were able to do in the past.
Gardner: And Capgemini is in the position of mastering this platform and being able to extend the value of that platform across multiple clients and business units. Therefore, that reduces the total cost of that technology, but at the same time, you’re going to have access to data across industries, and perhaps across boundaries that individual organizations might not be able to attain.
So there's a value-add here in terms of your penetration into the industry and then being able to come up with the inferences. Tell me a little bit about how the access-to-data benefit works for you?
Martinez: If you take a look at the use case that the POC was built on, it was built on a commercial insurance risk assessment. If you take a look at the underlying architecture around commercial insurance risk, our goal was to build an architecture that would serve the use case that HP bought into, but at the same time flatten out that data model and that architecture to also bring in better customer analytics for commercial insurance risk.
So we’ve flattened out that model and we’ve built the architecture so we could go after additional business, not just more clients, across not just commercial insurance, but also general insurance. Then, you start building the customer analytics capability into that underlying architecture, and it gives us the ability to go from the insurance market over to the financial services market, as well as into the capital markets area.
Gardner: All the data in one place makes a big difference.
Martinez: It makes a huge difference, absolutely.
Gardner: Tell us a bit about the future. We’ve talked about a couple of aspects of the HAVEn suite. Autonomy, Vertica, and Hadoop seem to be on everyone's horizon at some point or another due to scale and efficiencies. Have you already been using Hadoop, or how do you expect to get there?
Martinez: We haven’t used Hadoop, but certainly, with its capability, we plan to. I’ve done a number of different strategies and roadmaps in engaging with larger organizations, from American Express to the largest retailer in the world. In every case, they have a lot of issues around how they’re processing the massive amounts of data that are coming into their organization.
When you look at the extract, transform, load (ETL) processes by which they are taking data from systems of record, trying to massage that data and move it into their large databases, they are having issues around load and meeting load windows.
The HAVEn platform, in itself, gives us the ability to leverage Hadoop, perhaps take on some of that processing pre-ETL, and then, before we go into the Vertica environment, take out some of that load and make Vertica even more efficient than it is today -- which is one of the biggest selling points of Vertica. It certainly is in our plans.
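The load-offload idea above can be sketched in a few lines. This is a toy stand-in for the kind of pre-ETL aggregation a Hadoop tier might perform before bulk loading; the record layout and numbers are hypothetical, not drawn from any actual Capgemini or HP pipeline:

```python
# Illustrative sketch of "pre-ETL" offload: collapse raw per-event rows
# into summary rows before loading, shrinking the warehouse load window.
# All field names and values here are hypothetical.
from collections import defaultdict

raw_events = [
    {"policy_id": 1, "claim_amount": 250.0},
    {"policy_id": 1, "claim_amount": 100.0},
    {"policy_id": 2, "claim_amount": 900.0},
    {"policy_id": 1, "claim_amount": 50.0},
]

def pre_aggregate(events):
    """Collapse per-event rows into one summary row per policy."""
    totals = defaultdict(lambda: {"claims": 0, "total": 0.0})
    for e in events:
        agg = totals[e["policy_id"]]
        agg["claims"] += 1
        agg["total"] += e["claim_amount"]
    # Only these summary rows would be bulk-loaded downstream
    return dict(totals)

summary = pre_aggregate(raw_events)
print(f"{len(raw_events)} raw rows reduced to {len(summary)} load rows")
```

The payoff is that the analytics database ingests a fraction of the raw row count, which is exactly the load-window relief described above.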
Gardner: Another announcement here at Discover has been around converged infrastructure, where they’re trying to make the hardware-software efficiency and integration factor come to bear on some of these big-data issues. Have you thought about the deployment platform as well as the software platform?
Martinez: You bet. At the beginning of this interview, we talked about the ability to deliver the right information faster to business. This is a culture that organizations absolutely have to adopt if they are going to be able to manage the amount of data at the speed at which that data is coming to their organizations. To be able to have a partner like HP who is talking about the convergence of software and infrastructure all at the same time to help companies manage this better, is one of the biggest reasons why we're here.
We, as a consulting organization, can provide the consulting services and solutions that are going to help deliver the right information, but without that infrastructure, without that ability to be able to integrate faster and then be able to analyze what's happening out there, it’s a moot point. This is where this partnership is blossoming for us.
Gardner: Before we sign off, Ernie, now that you have gone through this understanding and have developed some insights into the available technologies and made some choices, is there any food for thought for others who might just be beginning to examine how to enter big data, how to create a common platform across multiple types of business activities? What did you not think of before that you wish you had known?
Martinez: If I look back at lessons learned over the last 60 to 90 days for us within this process, it’s one thing to say that you're mobilizing the team from the bottom up, meaning from the infrastructure and the partnership with HP, as well as from the top down, working with the business to define the right requirements and then actually building to that solution.
In most cases, we’re dealing with individuals. While we might talk about an entrepreneurial way of delivering solutions into the marketplace, we need to challenge ourselves, and all of the resources that we bring into the organization, to actually have that mentality.
What I’ve learned is that while we have some very good tactical individuals, having that entrepreneurial way of thinking and actually delivering that information is a different mindset altogether. It's about mentoring the resources we currently have, and bringing into our organization talent with a more entrepreneurial way of delivering and building solutions to take to market.
I didn’t really think about the impact of our current resources and how it would affect them. We were a little slow as we started the POC. Granted, we did this in 45 days, so that’s the perfectionist coming out in me, but I’d say it did highlight a couple of areas within our own team that we can improve on.