
Texas A&M System Teams with IBM to Drive Computational Sciences Research through Big Data and Analytics

High performance computing (HPC) system will speed research to advance energy resource management, accelerate materials development, ensure the sustainability of food supplies, and improve animal health

COLLEGE STATION, Texas and ARMONK, N.Y., Jan. 29, 2014 /PRNewswire/ -- Texas A&M University System and IBM (NYSE: IBM) today announced an agreement that begins a broad research collaboration supported by one of the largest computational sciences infrastructures dedicated to advances in agriculture, geosciences and engineering.


The collaboration will leverage the power of big data analytics and high performance computing (HPC) systems for innovative solutions across a spectrum of challenges, such as improving extraction of Earth-based energy resources, facilitating the smart energy grid, accelerating materials development, improving disease identification and tracking in animals, and fostering better understanding and monitoring of our global food supplies.

"Combining the incredible intellectual and technological resources of Texas A&M University and IBM will further position Texas as a leader in identifying and solving some of the most complex challenges we face," Texas Gov. Rick Perry said. "The work that will be done here will change lives and potentially save lives not just in our state, but our nation and around the world."

IBM will provide the infrastructure for the joint research, consisting of Blue Gene/Q technology, Power and System x servers, and General Parallel File System (GPFS) storage systems. In an on-campus test, the Blue Gene/Q solved a materials science problem that previously took weeks, producing a solution in "a fraction of an hour" and with much greater analytical depth.

"The Texas A&M System and IBM share a passion and a commitment to research that identifies practical solutions to global challenges," said Chancellor John Sharp, Texas A&M University System. "For the largest research university in the state, this agreement is a major step forward for the A&M System in research computing power. It brings together the best computer scientists and technology in the world to focus on issues central to our role as a leading research institution and to our land-grant mission of serving the state, while also providing resources to serve the greater good throughout the world."

IBM Research and the A&M System intend to align skills, assets and resources to pursue fundamental research, applied development, educational reach and sustainable commercial activities with projects that may include:

  • Sustainable Availability of Food: Efficiently providing sufficient food for a growing global population
  • Disease Spread Tracking, Modeling and Prediction: Early and accurate detection and prediction of infectious disease spread to allow the design, testing and manufacturing of medical countermeasures
  • Energy Resource Management: Responsibly exploring, extracting, and delivering energy resources
  • New Materials Development: Atomic-level modeling, design and testing of new materials for advanced uses in energy, aerospace, structural and defense applications

The Texas A&M Engineering Experiment Station (TEES), a premier engineering research agency of Texas that conducts research to provide practical answers to critical state and national needs, will be heavily involved on behalf of the Texas A&M University System. "This is a unique opportunity to meet the needs of engineering, geosciences and agriculture and life sciences researchers to expand into areas not feasible before with small-scale HPC systems," said Katherine Banks, Director of TEES and Vice Chancellor of Engineering.

"IBM and the Texas A&M System have crafted a unique collaboration that could apply computational science and big data analytics to some of the most daunting problems in agriculture, geosciences and engineering," said William LaFontaine, Vice President of High Performance Analytics and Cognitive Markets at IBM. "With the combined research capabilities of both institutions and ready access to state-of-the-art computing technology, we feel this collaboration could produce significant scientific insights leading to industry-changing solutions and material economic impact. We are extremely pleased to be engaged with such extraordinarily capable institutions in the A&M System and look forward to years of discovery and innovation."

TEES partners with academic institutions, governmental agencies, industries, and communities to solve problems that improve the quality of life, promote economic development, and enhance the educational systems of Texas. It is closely connected with the College of Engineering at Texas A&M University, which is undergoing unprecedented growth toward a goal of 25,000 students by 2025 and is hiring a new generation of faculty to address the nation's needs for research and technology development.

In support of the long-term research effort, IBM will supply the A&M System with cutting-edge, cloud-enabled technical computing technologies. The A&M System will deploy a research computing cloud comprising IBM hardware and software, including:

  • Blue Gene/Q: Serving as the foundation of the computing infrastructure, a Blue Gene/Q system consisting of two racks, with more than 2,000 compute nodes, will provide 418 teraflops (TF) of sustained performance for big data analytics, complex modeling, and simulation of molecular dynamics, protein folding and organ modeling.
  • Power Systems: A total of 75 PowerLinux 7R2 servers with POWER7+ microprocessors will be connected by 10GbE into a system optimized for big data and analytics and high performance computing. This complex includes IBM BigInsights and Platform Symphony software, IBM Platform LSF scheduler, and IBM General Parallel File System.
  • System x: The solution will contain an estimated 900 IBM System x dense hyperscale compute nodes as part of an IBM NeXtScale system. Some of the nodes will be managed by Platform Cluster Manager Advanced Edition (PCM-AE) as a University-wide HPC cloud while the others will be managed by Platform Cluster Manager Standard Edition (PCM-SE) and serve as a general purpose compute infrastructure for the geosciences and open source analytics initiatives.
  • Platform Computing: Platform Computing software will be used to manage and accelerate various computational workloads. Platform Symphony will drive big data and analytics, and Platform LSF will drive traditional HPC and technical computing workloads. Platform Computing will also power the creation of an HPC cloud, allowing users within the A&M System access to the system.
  • General Parallel File System (GPFS): Five IBM System x GPFS Storage Servers (GSS) will provide five petabytes (PB) of shared storage, accessed by the compute building blocks over high-speed networks. GPFS will also include an IBM FlashSystem 820 tier with 10 terabytes (TB) of flash storage, delivering performance to accelerate computation primarily for Texas A&M AgriLife Research, Geosciences and university HPC as part of the research computing infrastructure.
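As a rough sanity check on the figures above (a back-of-the-envelope sketch, not part of the announcement), the per-node share of the Blue Gene/Q's sustained performance and the flash-to-disk ratio of the GPFS storage can be estimated; the node count "more than 2,000" is approximated here as 2,048, on the assumption of two standard 1,024-node Blue Gene/Q racks:

```python
# Back-of-the-envelope estimates from the published system figures.
# The 418 TF, 5 PB, and 10 TB numbers come from the announcement;
# the exact node count (2,048) is an assumption for illustration.

BLUE_GENE_SUSTAINED_TF = 418   # teraflops, sustained, across the system
BLUE_GENE_NODES = 2048         # assumed: 2 racks x 1,024 compute nodes

GPFS_SHARED_PB = 5             # petabytes of GSS shared storage
FLASH_TIER_TB = 10             # terabytes of FlashSystem 820 flash

# Sustained performance available per compute node, in gigaflops.
gflops_per_node = BLUE_GENE_SUSTAINED_TF * 1000 / BLUE_GENE_NODES

# Fraction of total shared capacity served by the flash tier.
flash_fraction = FLASH_TIER_TB / (GPFS_SHARED_PB * 1000)

print(f"~{gflops_per_node:.0f} GF sustained per node")
print(f"flash tier is {flash_fraction:.2%} of shared capacity")
```

Under these assumptions, each node contributes roughly 204 GF of sustained performance, and the flash tier is a thin (0.2%) acceleration layer in front of the bulk GSS capacity.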

Furthermore, IBM will work with researchers at the A&M System to assess new computing technologies that will be necessary to advance data-driven science discovery and innovation over the next several years.

About IBM
For more information on IBM Research visit
For more information on IBM Technical Computing visit

About the A&M System
The A&M System is one of the largest systems of higher education in the nation, with a budget of $3.5 billion. Through a statewide network of 11 universities, seven state agencies, two service units, a comprehensive health science center and a system administration office, the A&M System educates more than 125,000 students and makes more than 22 million additional educational contacts through service and outreach programs each year. Externally funded research expenditures exceed $780 million and help drive the state's economy.

Ciri Haugh
[email protected]



Copyright © 2007 PR Newswire. All rights reserved. Republication or redistribution of PRNewswire content is expressly prohibited without the prior written consent of PRNewswire. PRNewswire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
