|By Kamyar Emami||
|January 14, 2013 07:00 AM EST||
The industrial revolution continues - starting with the steam engines of the 18th century, continuing through large-scale steel production, oil exploitation, and the electrical and photographic innovations of the 19th century, and moving on to the transportation, communications, computation and electronics of the 20th century. It is still early in the 21st century, but we can safely say that software has become the engine driving the industrial, economic, medical and, increasingly, the political dimensions of our existence. The only way to satisfy the demand for the volume and complexity of software needed to keep our world moving is to maximally share and reuse code within and across application domains.
Open Source Software (OSS) is the epitome of code reuse, enabling complex applications to be realized rapidly, economically and safely. Probably the largest collaborative endeavor of humankind to date, open source feeds on itself. Independent studies have converged on the fact that open source is everywhere; depending on the source of the study, 80-100% of all software organizations now use open source software in their products or operations.
There is an implicit understanding that good developers do not write code from scratch any more. Rather, they can adapt a piece of existing code to furnish a desired function. Use of off-the-shelf code such as OSS brings the usual due diligence and precautions associated with deployment of any third-party content within an organization. The pedigree of the code, its ownership attributes, and the rules around the use of open source code (typically captured in a license document) govern its introduction within an organization and its suitability for an end-target use.
Open source software brings with it an unusual set of ownership issues. Unlike other commodities, open source code can be brought into an organization freely. While anything purchased in a transaction has implied ownership, the ownership and usage rights of OSS can be confusing to many developers and organizations. The copyright ownership of OSS generally stays with the creator of the open source code. The OSS copyright owner creates and communicates a license that explicitly sets out the rules governing the use of that open source software.
Continuous Versus One-Time OSS Assessment
Detecting and complying with OSS code in a software project has become an important part of a quality and governance process in organizations that create or consume software.
Product quality considerations and standards require that a recorded knowledge of all the third-party components within a product be maintained at all times. These records should also include attributes such as pedigree, defect history and improvements over time, potential vulnerabilities, and code propagation within the organization. These records can be best maintained, not by a one-time examination of the organization's code portfolio, but through an ongoing and structured third-party and open source software adoption process. The practice of creating and updating the records automatically as development proceeds ensures ongoing compliance with the requirements of a quality organization.
As opposed to the continuous recordkeeping requirements of a quality process, a software audit is a one-time activity targeted at providing insight into the intellectual property (IP) ownership or IP rights, in anticipation of a transaction such as an M&A or a product shipment to market.
Open source software audits, generally carried out by an external body, involve an examination of a software portfolio in order to detect OSS and third-party code within that portfolio. The result of the audit is a report that highlights open source and third-party components and their attributes. At a high level, it provides statistics such as the names of any public-domain software packages and whether they are used in a modified or unmodified form, the composition of the license mix, copyrights, vulnerabilities, languages, and open source lines of code. At a more detailed level, it provides the specific open source or proprietary packages that were discovered, their license attributes, links to resources that contain additional information, the text of the licenses, copyrights, and known security vulnerabilities associated with the components of the software.
An audit process would highlight code that is specifically copyrighted, but for which no license is offered or mentioned. These cases are one of the challenging aspects in establishing IP ownership, as the copyright owner must be contacted for explicit permission to use their code. Also, any code that is not in the public domain and has no identifying information, such as headers, must be highlighted as requiring further investigation.
The Science: Detecting Third-Party Packages in a Portfolio
Once all OSS and other third-party packages are identified and a software Bill of Materials (BoM) is available, then the list can be examined for properties such as licenses, known deficiencies, security vulnerabilities, various obligations associated with their use, and functions such as encryption that could restrict its use in certain markets.
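The kind of BoM examination described above can be sketched in a few lines. The record fields, the approved-license policy, and the example packages below are illustrative assumptions, not the output of any real audit tool:

```python
# A minimal sketch of a software Bill of Materials (BoM) record and a
# review pass over it. Fields and example data are illustrative.
from dataclasses import dataclass, field

# Licenses whose obligations this hypothetical organization has cleared.
APPROVED_LICENSES = {"Apache-2.0", "MIT", "BSD-3-Clause"}

@dataclass
class BomEntry:
    name: str
    version: str
    license: str
    modified: bool                      # used in modified form?
    vulnerabilities: list = field(default_factory=list)
    uses_encryption: bool = False       # may restrict use in some markets

def flag_entries(bom):
    """Return entries needing review: unapproved license, known
    vulnerabilities, or encryption functions restricting distribution."""
    return [e for e in bom
            if e.license not in APPROVED_LICENSES
            or e.vulnerabilities
            or e.uses_encryption]

bom = [
    BomEntry("log4j", "1.2.17", "Apache-2.0", modified=False,
             vulnerabilities=["CVE-2019-17571"]),
    BomEntry("junit", "4.13.2", "EPL-1.0", modified=False),
    BomEntry("commons-logging", "1.2", "Apache-2.0", modified=True),
]
for entry in flag_entries(bom):
    print(entry.name, entry.license, entry.vulnerabilities)
```

In this sketch log4j is flagged for its known vulnerability and junit for a license outside the approved set, while the modified commons-logging passes the policy check but would still carry the obligations its license attaches to modified use.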
A number of methods can be used to identify open source and commercial software within a software portfolio. A short list of these methods includes the following.
- Records by developers: Any records maintained by developers and development managers will assist the process.
- Information held within a file, or folder: A quality-development practice is to include a header on every file that holds information about the software package, organization, copyright and license associated with the file or the package. Often binaries also include identifying information, such as copyright owner, project name, and the license pointers within the file. Another quality practice is to include licensing or other information about a software package within the package.
- File/folder names and paths: These can be additional indicators of the presence of known public-domain or commercial software.
- Similarity with public-domain software: Any similarity between a software file, or portions of a software file, and a file in the public domain can accurately indicate the presence of OSS in a portfolio. A full-file match to a public-domain file indicates unmodified use of the OSS; a partial match indicates use of the OSS in a modified form. This distinction is significant because many open source licenses trigger different obligations based on how the file is used.
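The header-inspection method above can be sketched as a scan of the top of each file for copyright and license markers. The patterns and the sample file are illustrative assumptions; production scanners use far richer rule sets:

```python
# A rough sketch of header inspection: look at the top of a source file
# for copyright statements and license names. Patterns are illustrative.
import re

COPYRIGHT_RE = re.compile(r"Copyright\s+(?:\(c\)\s*)?(\d{4})", re.I)
LICENSE_HINTS = ["Apache License", "GNU General Public License",
                 "MIT License", "Mozilla Public License"]

def inspect_header(text, max_chars=2000):
    """Return ownership and license clues found near the top of a file."""
    head = text[:max_chars]
    return {"copyrights": COPYRIGHT_RE.findall(head),
            "licenses": [h for h in LICENSE_HINTS
                         if h.lower() in head.lower()]}

sample = """/*
 * Copyright (c) 2011 Example Corp.
 * Licensed under the Apache License, Version 2.0.
 */
int main(void) { return 0; }
"""
print(inspect_header(sample))
# -> {'copyrights': ['2011'], 'licenses': ['Apache License']}
```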
There are hundreds of thousands of public-domain projects accessible to developers. When you consider that an OSS project can have multiple versions in the public domain and each package can consist of anything between two and 200,000 files, we gain an appreciation for the task involved in this method. Manual identification of code similarity to millions of files is obviously impractical. Only intelligent automated solutions can go through a software portfolio and examine similarity between each and every file in that portfolio and software files in public domain.
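The automated similarity check described above can be illustrated with a toy fingerprinting scheme: a full-file hash detects unmodified reuse, while hashes of overlapping groups of lines catch partial matches in modified files. Real audit tools use much more robust fingerprinting and index millions of files; the single-file comparison below is only a sketch:

```python
# Illustrative similarity detection: full-file hash for unmodified
# reuse, line-shingle hashes for partial (modified) matches.
import hashlib

def file_hash(text):
    return hashlib.sha256(text.encode()).hexdigest()

def shingles(text, k=4):
    """Hashes of every run of k consecutive non-blank lines."""
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    return {file_hash("\n".join(lines[i:i + k]))
            for i in range(len(lines) - k + 1)}

def classify(candidate, known):
    """'unmodified', 'modified', or 'no match' against one known file."""
    if file_hash(candidate) == file_hash(known):
        return "unmodified"
    return "modified" if shingles(candidate) & shingles(known) else "no match"

known = "\n".join(f"line {i}" for i in range(20))
edited = known.replace("line 3", "patched line 3")
print(classify(known, known))    # -> unmodified
print(classify(edited, known))   # -> modified
```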
The Art: Reading Between the Lines
The methods described above could theoretically provide insight into composition of a code portfolio. However, those methods alone are not sufficient to reveal an accurate view of the code composition.
- Manual records are the least reliable method, as third-party content is often brought into a project without a record being registered. Also, in today's typical development environment it is very difficult to guarantee access to the developer who originally introduced a piece of software.
- File header information is not necessarily an accurate representation of a file's pedigree and license. Headers can be changed by a developer, or even automatically, as happened with the Linux kernel header files used in Android packages.
- Almost all OSS uses OSS. For example, there are more than 65,000 instances of commons.logging (a popular Apache logging layer), and more than 50,000 instances each of Log4j (another Apache logging utility for Java) and JUnit (a popular testing framework for Java) code in various public-domain projects. This leads to OSS project and license nesting complexities and contributes to the challenge of correctly identifying the OSS packages within a portfolio. It is critical to detect the genesis OSS project and the versions of both the original and the framing OSS projects. Practical examples of this challenge are:
- Open source packages that use other OSS but do not maintain or propagate the original OSS license. This action may not be legitimate depending on the original OSS license and the license of the derived OSS package and can lead to erroneous conclusions about the quality as well as usage obligations associated with the organization's software.
- Open source packages that change the license in subsequent versions, either by moving from one version of a license to another (for example, from GPLv2 to GPLv3 on a new release of the OSS project) or by moving from one license to another compatible license.
- Projects where the OSS file or folder name is modified. This happens regularly on libraries (those with .jar or .bin extensions), and less frequently on source code files.
- Seemingly known but non-existent licenses mentioned within a file or folder. A prime example is an OSS package that claims it is released under "GPL." There is no license called simply GPL; only versioned licenses such as GPL v1, v2, and v3 exist.
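One small, automatable piece of this art is normalizing the license strings found in files, including flagging the bare "GPL" case above for human review. The mapping below is an illustrative assumption, not a complete license database:

```python
# A sketch of normalizing raw license declarations found in files.
# Bare, versionless declarations name no actual license and are
# flagged for manual review. The tables are illustrative only.
KNOWN_LICENSES = {
    "gpl-2.0": "GNU General Public License v2.0",
    "gpl-3.0": "GNU General Public License v3.0",
    "apache-2.0": "Apache License 2.0",
    "mit": "MIT License",
}

AMBIGUOUS = {"gpl", "gnu gpl", "lgpl"}  # declarations with no version

def normalize(declared):
    """Map a raw in-file license string to a known license, or flag
    it as ambiguous/unknown so a human reviewer resolves it."""
    key = declared.strip().lower().replace("v2", "-2.0").replace("v3", "-3.0")
    if key in KNOWN_LICENSES:
        return ("ok", KNOWN_LICENSES[key])
    if declared.strip().lower() in AMBIGUOUS:
        return ("review", f"ambiguous declaration: {declared!r}")
    return ("review", f"unknown declaration: {declared!r}")

print(normalize("GPLv2"))  # -> ('ok', 'GNU General Public License v2.0')
print(normalize("GPL"))    # flagged for review
```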
The art of OSS audit activity and license management relies on a clear understanding of the open source software community, open source packages, open source licenses, and development practices. This understanding comes with both academic knowledge as well as experience in scanning, reviewing, and auditing hundreds of software portfolios. Automated solutions that combine the science of scanning and license management with empirical methods that embody the art of open source package detection and license discovery can significantly speed up the discovery and management process and minimize, although not eliminate, the human involvement factor.
Rapid software development is necessary for sustaining the pace of innovation needed in today's world and the use of OSS is perhaps the best way to maintain this pace. While most organizations are now using open source to their advantage, they must avoid potential complications that OSS license obligations present. The complexity that OSS licenses present makes it almost impossible to manage obligations manually. This is where automated solutions come in. Conducting a one-time audit, or preferably, having a continuous process in place to automatically detect OSS licenses and their obligations is the best way to protect your organization from risk while ensuring continuous innovation.