
64-Bit Linux Applications Need High-Quality 64-bit Database Connectivity

Not choosing the right 64-bit database connectivity can cheat businesses out of the full benefit of 64-bit Linux

Some businesses using Linux as an internal server platform may only now be confronting the challenge of migrating to 64-bit Linux distributions, but they are stepping into territory that has long been familiar to most Linux users in the business world. 64-bit Linux has run for years on chipset families such as Intel's EM64T (Extended Memory 64 Technology) and Itanium, AMD's Athlon 64 and Opteron, and IBM's POWER. In addition, 64-bit Linux distributions have been offered for some time by top vendors such as Red Hat and Novell/SuSE, and have been available as a server operating system choice from hardware vendors such as Dell, IBM, and HP.

With all of this availability of and access to 64-bit Linux, why hasn't business been more aggressive in purchasing and deploying 64-bit Linux server platforms? What initially made many businesses hesitant to embrace 64-bit Linux were general concerns about application migration: What are the costs of migrating our existing 32-bit applications to 64-bit? How much benefit would our 32-bit applications gain from being migrated to 64-bit? As time has passed, new processor architectures such as x86-64 have made 64-bit Linux more attractive to IT organizations of all shapes and sizes. The x86-64 architecture allows both 64-bit and existing 32-bit applications to run simultaneously on a 64-bit operating system platform. Because of this, IT organizations now have much greater flexibility to decide which applications to migrate to 64-bit and when, sparing them the business disruption and expense of a wholesale overhaul of all 32-bit applications in one enormous project.

The question of which applications would benefit most from migrating to 64-bit is a subjective one, but in general, the answer is memory-hungry, data-intensive, multi-user applications such as relational database management systems (RDBMSs), business intelligence (BI), and data warehousing applications. When run as 32-bit, these applications can easily hit the upper limit of memory addressable by any 32-bit process, even when more physical memory is available to the operating system itself. The result of hitting this limit is an increase in the amount of paging to disk that must take place for the application to accomplish its tasks. This increase in disk I/O has the predictable effect of producing a performance bottleneck that limits the application's ability to scale for larger data sets or more concurrent users. By running as 64-bit, these same applications can access all of the addressable memory available to the operating system and can therefore run entirely in memory if the Linux platform has sufficient RAM. The application's performance and support for additional concurrent users can then scale with the total amount of memory available to the operating system, eliminating the performance and capacity bottleneck introduced by running as 32-bit.
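The arithmetic behind this addressable-memory ceiling can be sketched in a few lines. The figures below are architectural limits, not what any particular kernel actually grants a single process:

```python
# Architectural address-space limits for 32-bit vs. 64-bit processes.
GIB = 2**30

addr_32 = 2**32           # a 32-bit process can address at most 4 GiB
addr_64 = 2**64           # a 64-bit process: 16 EiB in principle

print(addr_32 // GIB, "GiB")    # → 4 GiB
print(addr_64 // 2**60, "EiB")  # → 16 EiB

# A 64 GiB working set simply cannot fit in a 32-bit address space,
# so a 32-bit RDBMS must page to disk; a 64-bit build of the same
# application can keep the whole set resident in RAM.
working_set = 64 * GIB
print(working_set > addr_32)    # → True
```

In practice the usable 32-bit limit is lower still, since the kernel reserves part of each process's address space, which is why applications can hit the wall well before 4 GiB.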

Surprising as it may be to most readers, there is another key question that businesses of all kinds must consider when developing a strategy for building or migrating applications to 64-bit on Linux. It's the X factor in this scenario: "What impact will database connectivity have on the success of my 64-bit Linux applications?"

The Importance of Database Connectivity
Most people I meet and ask about their database connectivity strategy tell me the same thing in one of three ways:

  1. "All database connectivity is pretty much the same."
  2. "Database connectivity is a commodity now."
  3. "My database connectivity is 'good enough.'"

What's not said, but implied, is the obvious: "I don't think I need a database connectivity strategy." The myth behind each of these statements is the idea that a database connectivity solution from one vendor is architecturally identical to what's offered by another. What this myth fails to take into account is that there are very different approaches to designing and developing database connectivity, and the architecture chosen for a given set of database connectivity components can mean the difference between an end result of poor quality and one of high quality.

Most business applications that run on Linux handle access to a relational data source through some sort of data access API such as ODBC, JDBC, or one of the proprietary APIs available from the database vendors. Even if these applications weren't written using one of these APIs directly, it's a good bet that they make use of them, and subsequently load a driver or some type of database client libraries, under the covers. Whether your business already has Linux applications compiled as 64-bit or is implementing a plan for migrating applications to 64-bit, you will want to consider carefully the type of database connectivity that's used. While there is a growing number of options to choose from, choosing high-quality 64-bit database connectivity can determine whether your 64-bit Linux applications actually see the performance benefits available to 64-bit applications or suffer from a range of potential performance and scalability limitations.
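As an illustrative sketch of an application going through a data access API and letting it load the driver "under the covers," here is a minimal example using Python's DB-API, with the standard-library sqlite3 module standing in for a networked ODBC/JDBC driver; the table and values are invented for illustration:

```python
import sqlite3

# sqlite3 plays the role of the driver/client library that a data
# access API such as ODBC or JDBC loads under the covers. On a
# 64-bit platform, the application, the API layer, and any native
# driver libraries must all be matching 64-bit builds.
conn = sqlite3.connect(":memory:")  # stand-in for a networked RDBMS
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
cur.execute("INSERT INTO orders (total) VALUES (?)", (19.99,))
conn.commit()

cur.execute("SELECT id, total FROM orders")
for row in cur.fetchall():
    print(row)  # → (1, 19.99)

conn.close()
```

The application code never touches the wire protocol; everything between `execute()` and the RDBMS is the driver's responsibility, which is exactly why the driver's quality matters.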

After investing so much time and effort in planning which applications to migrate to 64-bit, and then actually doing the work to migrate them, it is baffling that some architects and developers don't take the threat of introducing performance and scalability bottlenecks into an important application more seriously. Perhaps it's because they don't know what characteristics to look for when choosing database connectivity, or the impact those characteristics can have on the success or failure of their 64-bit Linux applications.

CPU-Bound, Not I/O-Bound
As mentioned earlier, memory-intensive applications running as 64-bit benefit from a larger address space and can potentially use all of the memory available to the operating system. This can allow the application to run entirely in memory, which eliminates the performance and scalability bottlenecks introduced by the increased (and unnecessary) need to page to disk. For important 64-bit applications that run on Linux, businesses typically invest in server hardware with multiple CPUs and very large amounts of RAM to ensure that the application can run entirely in memory and support growth in data set size, volume, or number of users. This investment pays off only if the application is using database connectivity that is also designed to take advantage of additional CPUs and memory.

Along with disk I/O, network I/O is one of the two biggest factors impacting application performance. In the typical client/server deployment scenario, the application and database connectivity components run on one tier and connect across the network to an RDBMS on a second tier. Poor-quality database connectivity components communicate with the RDBMS inefficiently, generating an unnecessary amount of network I/O. Such components are said to be I/O-bound: the increased network I/O introduces a performance and scalability bottleneck that can affect applications of any type. Most businesses can't upgrade the speed of their corporate network or WAN environments to compensate for excessive network I/O, so the cost of repeated, unnecessary TCP/IP round trips adds up quickly for even a simple application operation. As a result, even though more RAM or CPUs can be added to the Linux application server platform, no additional performance or scalability benefit will be observed in the application.
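The round-trip cost described above can be modeled in a few lines. The sketch below again uses the standard-library sqlite3 module; the counters stand in for network exchanges (against a remote RDBMS, each per-row `execute()` would cost at least one TCP round trip, while a driver that supports batching or parameter arrays can ship the whole set in far fewer):

```python
import sqlite3

rows = [(i, i * 1.5) for i in range(1000)]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t (id INTEGER, val REAL)")

# Naive client: one request/response exchange per row.
naive_round_trips = 0
for row in rows:
    cur.execute("INSERT INTO t VALUES (?, ?)", row)
    naive_round_trips += 1

# Batching client: a driver with parameter-array support can send
# the entire batch in a single exchange.
batched_round_trips = 1
cur.executemany("INSERT INTO t VALUES (?, ?)", rows)

print(naive_round_trips, "vs", batched_round_trips)  # → 1000 vs 1

conn.close()
```

Both approaches insert the same data; the difference is purely in how many times the client waits on the wire, which is the bottleneck no amount of server-side RAM or CPU can remove.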


More Stories By Mike Frost

Mike Frost is a product manager for DataDirect Technologies, a provider of standards-based components for connecting applications to data, and an operating unit of Progress Software Corporation. In his role, Mike is involved in the strategic marketing efforts for the company’s connectivity technologies. He has more than 6 years of experience working with enterprise data connectivity and is currently involved in the development of data connectivity components including ODBC, JDBC, ADO.NET, and XML.
