More Use Cases for Big Data Analytics

Measuring Development Productivity with Hadoop

After its initial start in research work and at social networking sites, Hadoop is now becoming a big part of the enterprise IT landscape. There were recent announcements from Microsoft about embracing Hadoop as part of its Windows Azure high-performance computing initiative, and from Oracle regarding new options such as Oracle Loader for Hadoop, which loads Hadoop-processed data into the database.

Initial Use Cases for Hadoop
The following are typical use cases that can be realized with the power of Hadoop:

  • Analyzing customer web usage to predict what would be of interest to the customer and targeting advertisements accordingly
  • Detecting fraud in online systems based on various behavioral patterns
  • Market and customer segmentation
  • Recommendation engines - increasing average order size by recommending complementary products, based on predictive analysis for cross-selling

To learn more about Hadoop use cases, you can visit the site of Cloudera, which distributes Hadoop along with various support options suited to the enterprise: http://www.cloudera.com/why-hadoop/

You can also refer to my earlier article on Traditional vs Big Data Analytics for various enterprise-class use cases that can be realized using big data analytical tools like Hadoop.

Providing Real-Time Dashboards for Development Productivity
While most of the above use cases are about runtime benefits to the enterprise, Hadoop, if used properly, can also provide much-needed insight into development: valuable dashboards that tell program managers and directors how productive a team is, where it stands with respect to code quality and code coverage, and whether the code will meet the required deadlines of the development life cycle. Let's analyze how this can be enabled with proper usage of Hadoop.

Large-scale application development is common, especially when your organization is building products or other large custom applications. As a program manager you want a real-time dashboard of how your development teams are progressing. The following live information can provide a lot of insight for tracking the projects:

  • Lines of code (a rough measure of size that also gives an idea of the functional coverage of the system)
  • Code coverage percentage, i.e., the percentage of code that is exercised by the various unit test cases
  • Types of exceptions generated during unit testing, and whether they are application related or system related; for example, a large number of application-related exceptions during development may indicate that the team does not fully understand the functionality (a minimal classification sketch follows this list)
  • Code quality analysis - whether the code is free of audit- and metric-related issues such as depth of inheritance, cyclomatic complexity, etc.
  • Traceability of application modules to requirements
  • Whether the build process is failing to integrate the code and, if so, where the dependencies lie
  • Whether the development team is following the coding conventions and development standards
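
The exception classification mentioned above maps naturally onto Hadoop's word-count pattern. The following is a minimal sketch, assuming Hadoop's newer mapreduce API; the root package com.mycompany is a placeholder for the team's own code, and the "does the line mention our package" heuristic is only illustrative:

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class ExceptionClassifier {

    // Emits ("application-exception", 1) or ("system-exception", 1)
    // for every log line that mentions an exception.
    public static class ExceptionMapper
            extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        // Placeholder: the root package of the team's own code.
        private static final String APP_PACKAGE = "com.mycompany";

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            if (!line.contains("Exception")) {
                return; // not an exception line; skip it
            }
            String category = line.contains(APP_PACKAGE)
                    ? "application-exception" : "system-exception";
            context.write(new Text(category), ONE);
        }
    }

    // Standard word-count-style reducer: totals the counts per category.
    public static class SumReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values,
                Context context) throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) {
                sum += v.get();
            }
            context.write(key, new LongWritable(sum));
        }
    }
}
```

A rising ratio of application exceptions to system exceptions in the resulting totals is the kind of signal a program manager would watch for.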

Currently most program managers depend on weekly meetings with the developers to derive this information, which is subject to interpretation by individual developers. The main problem is that the above-mentioned metrics are scattered across multiple log files, and with a large development team this can run into a huge volume of unstructured text. Some of the following log files will be of interest in this case:

  • Source code stored in various repositories
  • Eclipse or Visual Studio log files generated during development and unit testing
  • Log files generated by test tools like JUnit
  • Logging information generated by the application servers and web servers during development, as the developers will likely turn on Log4j or equivalent logging mechanisms
  • Debugging information generated by IDEs like Eclipse or Visual Studio
  • Logs generated by the code quality analysis tools
  • Logs generated by code vulnerability scanning tools
  • Logs generated by build environments like Ant, CruiseControl or the equivalent
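
Because these sources live in different directories and formats, a single job can be wired up with Hadoop's MultipleInputs so that each source gets an appropriate mapper. The sketch below reuses the classifier above and assumes a Hadoop 2.x-style API; the HDFS paths are hypothetical placeholders for wherever the collected logs land:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DevMetricsJob {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "dev-productivity-metrics");
        job.setJarByClass(DevMetricsJob.class);

        // Each log source is a separate input; here both reuse the same
        // mapper, but per-source mappers (JUnit, Log4j, build logs) can
        // be plugged in the same way.
        MultipleInputs.addInputPath(job, new Path("/logs/junit"),
                TextInputFormat.class, ExceptionClassifier.ExceptionMapper.class);
        MultipleInputs.addInputPath(job, new Path("/logs/appserver"),
                TextInputFormat.class, ExceptionClassifier.ExceptionMapper.class);

        job.setReducerClass(ExceptionClassifier.SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileOutputFormat.setOutputPath(job, new Path("/dashboards/metrics"));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```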

Typically Hadoop can be used to analyze these large amounts of unstructured log files, and the output can be utilized to create real-time dashboards for the program managers.
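
One way to close the loop is for the dashboard layer to read the aggregated counts back out of HDFS after each job run. A hedged sketch, assuming the output path used by the hypothetical driver above and a single reducer output file:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DashboardFeed {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Placeholder path; matches the driver sketch above.
        Path out = new Path("/dashboards/metrics/part-r-00000");
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(fs.open(out)))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Each line is "metric<TAB>count"; a real dashboard would
                // push this to its data store instead of printing.
                String[] parts = line.split("\t");
                System.out.printf("%s = %s%n", parts[0], parts[1]);
            }
        }
    }
}
```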

Summary
The success of this use of Hadoop depends on the technical implementation of the map and reduce functions that act on the huge set of log files, listed above, from each developer's machine. However, considering that similar algorithms have already been implemented for various web-based log analytics, this implementation should not be too difficult. If implemented properly, it can give program managers a real-time dashboard to monitor the performance of the development team and take corrective action.

More Stories By Srinivasan Sundara Rajan

Highly passionate about utilizing Digital Technologies to enable the next-generation enterprise. Believes in enterprise transformation through the Natives (Cloud Native & Mobile Native).
