NoSQL Integration with the Hadoop Ecosystem By @MapR | @BigDataExpo

How NoSQL and Hadoop can work together to tackle Big Data challenges

Apache Hadoop is an open source Big Data processing platform that comes with its own extensive ecosystem to support various business and technical needs. Hadoop's specialty is large-scale processing and analytics over volumes of data that traditional technologies cannot handle efficiently. Hadoop is often complemented by the class of database management technologies referred to as NoSQL, which also handles large volumes of data well, but is geared toward fast reads and writes rather than massive processing. Together, NoSQL and Hadoop can tackle big data challenges.

One thing to note up front is that because Hadoop has an associated storage system, it is sometimes mistakenly assumed to be a database management system. It was also sometimes classified as a NoSQL system during its early days, though the NoSQL label is generally accepted today to refer specifically to databases. And while Hadoop is ideal for storing a variety of data types, it is actually about spreading work across many servers in a cluster, which is something that databases were generally not designed to do.

Hadoop Summarized
To describe what Hadoop covers, let's first look at the four primary components of Hadoop:

  • MapReduce: A distributed programming framework that manages the spreading of work across many nodes in a cluster
  • Hadoop Common: A package containing the libraries and utilities to support associated Hadoop modules
  • YARN: A resource management platform (Yet Another Resource Negotiator) for managing computing resources and scheduling tasks
  • HDFS: The Hadoop Distributed File System which manages Hadoop data, and can be substituted with more sophisticated file systems to handle business-critical needs

Each of these components plays a role in defining what Hadoop is. Collectively these Hadoop components support the processing of data-intensive distributed applications, enabling them to work in a deployment potentially made up of thousands of nodes and petabytes of data. Each node is an independent computer that Hadoop typically assigns a subtask, run in parallel with the subtasks on other nodes to efficiently complete a much bigger overall task.

MapReduce and Hadoop Common represent the data processing tools that make Hadoop a great platform for big data. MapReduce supports efficient parallel processing, and its function is to ship applications (which will do the processing) to the nodes where the data reside. This enables "data locality" in which nodes perform the processing on the data they store, to minimize excess network traffic that would result from having nodes process data that reside on other nodes in the cluster.
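To make the map/shuffle/reduce flow concrete, here is a minimal single-process sketch of the MapReduce programming model. It is hypothetical illustration code, not the real Hadoop API: each "block" stands in for data stored locally on one node, the map phase runs against each block where it lives, and the reduce phase aggregates the shuffled results by key.

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical single-process sketch of the MapReduce model (not the
// Hadoop API): each "node" maps over its local block of data, results
// are shuffled by key, and the reduce phase aggregates per key.
public class MapReduceSketch {
    // Map phase: emit (word, 1) pairs from one node's local block.
    static List<Map.Entry<String, Integer>> map(String block) {
        return Arrays.stream(block.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty())
                .map(w -> Map.entry(w, 1))
                .collect(Collectors.toList());
    }

    // Reduce phase: sum the counts emitted for one key.
    static int reduce(List<Integer> counts) {
        return counts.stream().mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        // Two "blocks", as if stored on two different nodes.
        List<String> blocks = List.of("big data big insights", "big cluster");

        // Each node runs map() on the data it stores (data locality),
        // then the emitted pairs are shuffled (grouped) by key.
        Map<String, List<Integer>> shuffled = new TreeMap<>();
        for (String block : blocks)
            for (var kv : map(block))
                shuffled.computeIfAbsent(kv.getKey(), k -> new ArrayList<>())
                        .add(kv.getValue());

        // The reduce phase aggregates the values for each key.
        shuffled.forEach((word, counts) ->
                System.out.println(word + "=" + reduce(counts)));
    }
}
```

In real Hadoop only the intermediate (key, value) pairs cross the network during the shuffle; the raw input never leaves the node that stores it, which is the point of data locality.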

YARN is a relatively new component of Hadoop that schedules tasks across the cluster. Sometimes called MapReduce 2.0 (or "MRv2"), it is a framework that lets you run new non-MapReduce jobs in your Hadoop cluster alongside your standard MapReduce jobs.

HDFS provides the storage functionality in Hadoop: it splits large files into small blocks (64 MB by default in early Hadoop versions; 128 MB in Hadoop 2) and distributes them across the clustered nodes. It ensures data is replicated so that if a node in the cluster fails, data replicas minimize the risk of data loss. In other words, it has the key design goal of overcoming hardware failure, which is especially critical when low-cost, commodity hardware is used. Another important design goal in Hadoop was to enable swapping out HDFS for another file system. Some Hadoop vendors took advantage of this architecture to provide value-added capabilities beyond those of standard HDFS. As an example, MapR Technologies provides MapR-FS, which improves on HDFS's high availability, disaster recovery, and snapshot capabilities, while also adding full read/write capabilities, true NFS access, and higher performance.
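The block-splitting and replication arithmetic can be sketched as follows. This is a hypothetical simplification: the block size and 3x replication factor mirror classic HDFS defaults, but the round-robin placement is purely illustrative (real HDFS placement is rack-aware).

```java
import java.util.*;

// Hypothetical sketch of how an HDFS-like file system splits a file
// into fixed-size blocks and replicates each block across nodes.
public class BlockPlacementSketch {
    static final long BLOCK_SIZE = 64L * 1024 * 1024; // 64 MB, a classic default
    static final int REPLICATION = 3;                 // classic HDFS default

    // Number of blocks needed for a file of the given size (ceiling division).
    static long blockCount(long fileSizeBytes) {
        return (fileSizeBytes + BLOCK_SIZE - 1) / BLOCK_SIZE;
    }

    // Assign each replica of each block to a node, round-robin for
    // simplicity; real HDFS placement is rack-aware.
    static Map<Long, List<Integer>> place(long fileSizeBytes, int nodes) {
        Map<Long, List<Integer>> placement = new LinkedHashMap<>();
        int next = 0;
        for (long b = 0; b < blockCount(fileSizeBytes); b++) {
            List<Integer> replicas = new ArrayList<>();
            for (int r = 0; r < REPLICATION; r++)
                replicas.add((next++) % nodes);
            placement.put(b, replicas);
        }
        return placement;
    }

    public static void main(String[] args) {
        long fileSize = 200L * 1024 * 1024; // a 200 MB file -> 4 blocks
        // Each block ends up with 3 replicas spread over a 5-node cluster,
        // so losing any single node still leaves 2 copies of every block.
        System.out.println(place(fileSize, 5));
    }
}
```

The fault-tolerance claim in the text falls out of this arithmetic: with three replicas per block on distinct nodes, a single node failure never removes the last copy of any block.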

MapReduce and HDFS are derived from Google's work on MapReduce and the Google File System (GFS). In addition to the above components, Hadoop consists of a number of related projects like Apache Hive, Apache HBase, and Apache Pig. The wide variety of projects in the Hadoop ecosystem gives you the opportunity to select the right tool for specific use cases with specific requirements when processing and analyzing big data.

... And Now NoSQL
NoSQL, on the other hand, is a category of database management systems that differs from relational database management systems (RDBMS) such as Oracle, DB2, and MySQL, which is why NoSQL databases are often described as "non-relational." They don't rely on the relational model, in which data is stored in tabular form with consistent rows and columns. Instead, they are more free-form in structure to accommodate varying and changing data types.

NoSQL databases provide a fast and efficient mechanism for the storage and retrieval of data, promoting goals such as simplicity, horizontal scaling capability, and better availability. NoSQL databases are used most often in big data and analytic applications, particularly ones in which fast data access is more important than large-scale, parallel processing.

How Can Hadoop and NoSQL Work Together?
While Hadoop and NoSQL do not have exactly the same functions, they are both related to solving big data problems. The Hadoop framework is used most commonly for processing huge amounts of data, and NoSQL is designed for fast, efficient storage and retrieval of large volumes of data. Considering that many early deployments of Hadoop entailed integrations with RDBMSs, integrating Hadoop with NoSQL was a logical next step.

In many cases, data sets processed in the Hadoop system are originally created and stored in a NoSQL database. Whenever you have interactive applications that create new data, you should consider how you can analyze that data to derive important business insights. For example, you might use NoSQL to store and deliver messages between end users, and then use Hadoop to scan the aggregate collection of messages for sentiment analysis. Tools like Apache Sqoop, database-specific connectors, or third-party data integration products let you copy data from a NoSQL system into Hadoop for large-scale processing.
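The messaging example above can be sketched in a few lines. This is purely conceptual, with no real NoSQL or Hadoop APIs: an in-memory map stands in for the NoSQL store the interactive app writes to, and a batch keyword scan stands in for the periodic Hadoop job run over the aggregate collection.

```java
import java.util.*;

// Conceptual sketch (hypothetical; no real NoSQL or Hadoop APIs): an
// interactive app writes messages into a key-value store, and a
// periodic batch job scans the full collection -- the pattern above,
// with the NoSQL store played by a map and the Hadoop job by a scan.
public class MessagePipelineSketch {
    // Stand-in for a NoSQL store: message id -> message body.
    static final Map<String, String> store = new LinkedHashMap<>();

    // The interactive, fast-write path (NoSQL's strength).
    static void write(String id, String body) { store.put(id, body); }

    // Stand-in for the batch analytics job (Hadoop's strength):
    // a crude sentiment tally counting messages containing a keyword.
    static long batchScan(String keyword) {
        return store.values().stream()
                .filter(m -> m.toLowerCase().contains(keyword))
                .count();
    }

    public static void main(String[] args) {
        write("m1", "I love this product");
        write("m2", "terrible experience");
        write("m3", "love the support team");
        System.out.println("positive mentions: " + batchScan("love"));
    }
}
```

In a real deployment the write path and the scan would run on separate systems sized for their workloads, with a connector or integration tool copying data between them, which is exactly the operational overhead the "in-Hadoop" databases discussed below aim to remove.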

There are also independent use cases which may not require the support of both platforms. For example, if it is only necessary to perform parallel processing of simple log data, and then store it in HDFS, then Hadoop alone may be sufficient. Similarly, if the only required function in a given use case is to store and then retrieve data such as web application session state, a NoSQL database will be sufficient. But these "standalone" use cases might be short-lived, as enterprises will continue to find more ways to leverage seemingly low value data into important business insights.

An emerging architecture for NoSQL and Hadoop integration that's worth considering entails the "in-Hadoop" databases that are built specifically to run within the Hadoop framework. Examples include Apache HBase, Apache Accumulo, as well as the MapR-DB In-Hadoop NoSQL database, which was architected for business-critical production deployments. With the combined advantages of both Hadoop's processing framework and NoSQL's fast data access, but without the overhead of moving data from one cluster to another, the Enterprise Database Edition of the MapR Distribution including Hadoop supports high performance, extreme scalability, high availability, snapshots, disaster recovery, integrated security, and more. The best of both technologies make this an ideal environment for big data solutions.

To learn more about how you can optimize your enterprise architecture, download this free whitepaper: Optimize Your Enterprise Architecture with Hadoop and NoSQL.

More Stories By Dale Kim

Dale is Director of Industry Solutions at MapR. His technical and managerial experience includes work with relational databases, as well as non-relational data in the areas of search, content management, and NoSQL. Dale holds an MBA from Santa Clara University and a BA in Computer Science from UC Berkeley.
