Easily Boost Your Web Application Using nginx

The performance improvement is quite significant for serving static content

More and more Web sites and applications are being moved from Apache to nginx. While Apache is still the number one HTTP server, running more than 60% of active Web sites, nginx has now taken over second place in the ranking and relegated Microsoft's IIS to third. Among the top 10,000 Web sites nginx is already the leader in the field, with a market share of 40%.

And the reasons are obvious: nginx is a high-speed, lightweight HTTP server engine. The performance improvement is quite significant for serving static content. Especially under high load, nginx is much faster than Apache and consumes far fewer resources on the server. Concurrent requests can thus be handled more efficiently, and the same tasks can be fulfilled with less hardware. Every byte of memory, every CPU cycle and even every server you save reduces your infrastructure costs.

I ran some load tests: 10,000 requests showed quite remarkable differences, which became even more distinct with more concurrent users. Note that with Apache the total execution time increases with the number of users, while nginx handles the growing load easily. For 2,000 concurrent users, nginx processed the requests almost four times faster.
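If you want to run a comparable test against your own setup, ApacheBench is one simple option; the host name and request counts below are placeholders, not the exact setup behind the figures above.

# 10,000 requests in total, 2,000 of them concurrently
ab -n 10000 -c 2000 http://www.mysite.com/index.html

# the interesting values in the output:
#   Time taken for tests  - total execution time
#   Requests per second   - throughput

Running the same command against the Apache and the nginx port of the same site gives a quick before/after comparison.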

While nginx uses event-based request handling in a small number of worker processes, Apache spawns a new process or thread for each request, depending on the processing mode. Apache's default multi-process (prefork) mode creates a child process for each request. Such a process is a complete instance of Apache including all linked modules. That means that even a request for static content, like an image, causes a new process to be started and the PHP module to be loaded.

Apache can also be operated in a multi-threaded (worker) mode, which creates multiple threads within fewer processes, one thread per request. It thus consumes much less memory, but the operation is no longer thread-safe, so modules like mod_php can't be used.
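If you are not sure which mode your Apache installation is running in, you can check it on the command line; on Ubuntu the control binary is typically apache2ctl (a quick check, not part of the original walkthrough):

# show the multi-processing module currently in use
apache2ctl -V | grep -i mpm

# on newer Ubuntu versions the MPM is a loadable module
ls /etc/apache2/mods-enabled/ | grep mpm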

I went through the exercise of figuring out the best way to leverage nginx on an application that runs on Apache. In this blog we will cover the actual installation steps, different deployment and migration scenarios, as well as how to measure the actual performance gain.

Installing nginx
All you have to do to start boosting your application performance is to install nginx on your machine and follow some configuration rules. In this article I will be referencing an example site running Ubuntu.

sudo apt-get install nginx
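After the installation, make sure the service is running and answering requests; the commands below are a minimal check on an Ubuntu system (the exact service command may differ between Ubuntu versions):

sudo service nginx start

# nginx should now answer on port 80 with its default welcome page
curl -I http://localhost/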

No doubt, Apache provides much more functionality by supplying a broad range of loadable modules and many more configuration options. A common way to adjust the behavior of a website is a combination of the virtual host setup and an .htaccess file. First of all: this file doesn't exist in nginx, which is another performance bonus. Apache checks every single directory in the path of the requested file for an .htaccess file and evaluates its content if it exists. And, if not configured properly, keeping a config file together with your data can result in a severe security issue. Nginx keeps the configuration in a central place and loads the settings into memory at startup.
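As an illustration, a typical front-controller rule that would live in an .htaccess file under Apache moves into the central server block under nginx; the rule below is a generic example, not one taken from the original article:

# Apache (.htaccess)
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]

# nginx (inside the server block in the central configuration)
location / {
    # serve existing files and directories, hand everything else to index.php
    try_files $uri $uri/ /index.php?q=$uri&$args;
}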

Even if you are not sure whether or not you really should replace Apache with nginx, you can always use both together. We will cover this later.

Migrating Configuration
There are quite a few similarities, but it's important to understand the differences between the configurations. Just like Apache, nginx keeps the site configuration files in /etc/nginx/sites-available. Active configurations are then referenced with a symbolic link in /etc/nginx/sites-enabled.
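Once a site's configuration file exists, enabling it works just like under Apache on Debian/Ubuntu, by linking the file and reloading the configuration; the file name mysite below is a placeholder:

sudo ln -s /etc/nginx/sites-available/mysite /etc/nginx/sites-enabled/mysite

# check the configuration for syntax errors before reloading
sudo nginx -t
sudo service nginx reload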

First of all, create a server block for each virtual host.

server {
    listen 80;
    ...
}

The basic setup for running a site is similar to Apache, with a slightly different syntax:

#
# Apache
#
<VirtualHost *:80>
    ServerName mysite.com
    ServerAlias www.mysite.com

    DocumentRoot /srv/www/mysite
    DirectoryIndex index.php index.html
</VirtualHost>

#
# nginx
#
server {
    listen 80;

    server_name mysite.com www.mysite.com;

    root /srv/www/mysite;
    index index.php index.html;
}



To add specific behavior for certain requests, define a location inside your server block. You can use regular expressions to select the affected requests:

server {
    ...

    location / {
        try_files $uri $uri/ @notfound;
    }

    location /doc/ {
        alias /usr/share/doc/;
        autoindex on;
        allow 127.0.0.1;
        deny all;
    }

    location /images/ {
        root /media/images;
    }

    location @notfound {
        rewrite (.*) /index.php?paramstring=$1;
    }

    location ~ /\.ht {
        deny all;
    }
}

This sample configuration shows some of the setup options for servers and locations. Make sure to create a rule to deny access to .ht* files, as nginx does not do that out of the box, whereas Apache rejects direct access to these files automatically. Note that familiar options from Apache can be found here in nginx as well: allow/deny, alias, rewrite, etc.
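A quick way to verify that the rule works is to request such a file directly; the expected response is a 403 (the host name below is again a placeholder):

curl -I http://mysite.com/.htaccess

# HTTP/1.1 403 Forbidden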

Please refer to the online documentation on nginx.org for further information.

Especially when you have multiple websites running on your server, and lots of requests causing high load, moving to nginx is a good decision. But multiple websites, each configured differently, can make the migration quite an effort. There are converters available that do the job for you, but mostly they only convert an .htaccess file into an nginx config. Apache also uses configurations for virtual hosts - do not forget about these! And even if a configuration has been converted by a tool, I recommend checking it manually before using it in a production environment!
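One way to sanity-check a migrated configuration without touching production is to let nginx listen on a test port first and compare the responses with the live Apache site; the port and URL below are assumptions for illustration only:

# temporarily set "listen 8080;" in the nginx server block, then:
curl -sI http://www.mysite.com/       > apache-headers.txt
curl -sI http://www.mysite.com:8080/  > nginx-headers.txt
diff apache-headers.txt nginx-headers.txt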

Tip: install nginx as the primary HTTP server and leave your Apache running on a different port. Migrate your virtual hosts one by one by creating nginx server configurations, and forward requests for not-yet-migrated websites to Apache.
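A minimal sketch of what such a forwarding rule could look like, assuming Apache has been moved to port 8080 on the same machine (the port and server name are illustrative, not the author's exact setup):

server {
    listen 80;
    server_name legacy-site.com;

    location / {
        # pass requests for this not-yet-migrated site on to Apache
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}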


More Stories By Harald Zeitlhofer

Harald Zeitlhofer has 15+ years of experience as an architect and developer of enterprise ERP solutions and web applications with a main focus on efficient and performant business processes, usability and application design. As a Technology Strategist in Dynatrace's Centre of Excellence team he influences the Dynatrace product strategy by working closely with customers and driving their performance management and improvement at the front line. He is a frequent speaker at conferences and meetup groups around the world. Follow him @HZeitlhofer
