Containers Expo Blog: Article

Easily Boost Your Web Application Using nginx

The performance improvement is quite significant for serving static content

More and more Web sites and applications are being moved from Apache to nginx. While Apache is still the number one HTTP server, running more than 60% of active Web sites, nginx has now taken over 2nd place in the ranking and relegated Microsoft's IIS to 3rd place. Among the top 10,000 Web sites, nginx is already the leader in the field, with a market share of 40%.

And the reasons are obvious: nginx is a high-speed, lightweight HTTP server engine. The performance improvement is quite significant for serving static content. Especially under high load, nginx is much faster than Apache and consumes far fewer resources on the server, so concurrent requests can be handled more efficiently. As a consequence, the same tasks can be fulfilled with less hardware, and every byte of memory, every CPU cycle, or even every whole server you can economize on reduces your infrastructure costs.

I ran some load tests: 10,000 requests showed quite remarkable differences, even more distinct with more concurrent users. Note that with Apache the total execution time increases with the number of users, while nginx can easily handle the load. For 2,000 users, nginx could process the requests almost four times faster.

While nginx uses event-based request handling in a small number of processes, Apache spawns a new process or thread for each request, depending on the processing mode. Apache's default multi-process (prefork) mode creates a child process for each request. Such a process is a complete instance of Apache including all linked modules. That means that even a request for static content, like an image, causes a new process to be started and the PHP module to be loaded.
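nginx's event model is configured with just a couple of directives. A minimal sketch (the values shown are illustrative defaults, not tuned recommendations):

```nginx
# /etc/nginx/nginx.conf (excerpt)
worker_processes auto;        # one worker process per CPU core

events {
    worker_connections 1024;  # concurrent connections handled per worker
}
```

A handful of worker processes, each multiplexing many connections, is what keeps nginx's memory footprint flat as the number of concurrent users grows.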

Apache can also be operated in a multi-threaded (worker) mode, which creates multiple threads across fewer processes, one thread per request. This consumes much less memory, but the operation is no longer thread-safe, so modules like mod_php can't be used.
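nginx sidesteps this problem entirely: instead of embedding an interpreter, it hands PHP requests to a separate FastCGI backend such as PHP-FPM. A minimal sketch, assuming PHP-FPM is installed and listening on a local Unix socket (the socket path varies by distribution and PHP version):

```nginx
# inside a server block; socket path is an assumption
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/var/run/php/php-fpm.sock;
}
```

Because the PHP workers live in their own process pool, static requests never pay the cost of loading an interpreter.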

I went through the exercise of figuring out the best way to leverage nginx on an application that runs on Apache. In this blog we will cover the actual installation steps, different deployment and migration scenarios, as well as how to measure the actual performance gain.

Installing nginx
All you have to do to start boosting your application performance is to install nginx on your machine and follow some configuration rules. In this article I will be referencing an example site running Ubuntu.

sudo apt-get install nginx

No doubt, Apache provides much more functionality, supplying a broad range of loadable modules and many more configuration options. A common way to adjust the behavior of a website is a combination of the virtual host setup and the .htaccess file. First of all: this file doesn't exist in nginx, which is another performance bonus. Apache checks every single directory in the path of the requested file for an .htaccess file and evaluates its content if it exists. And, if not configured properly, keeping a config file together with your data can result in a severe security issue. nginx keeps the configuration in a central place and loads the settings into memory at startup.
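As an illustration of moving rules out of .htaccess, a typical front-controller rewrite might translate as follows (the rule and parameter name are hypothetical examples):

```nginx
# .htaccess (Apache):
#   RewriteEngine On
#   RewriteCond %{REQUEST_FILENAME} !-f
#   RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]

# nginx equivalent, placed centrally in the server block:
location / {
    # serve the file if it exists, otherwise route to the front controller
    try_files $uri /index.php?q=$uri&$args;
}
```

Since the rule now lives in the central configuration, nginx never has to walk the directory tree looking for per-directory overrides.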

Even if you are not sure whether you really should replace Apache with nginx, you can always use both together. We will cover this later.

Migrating Configuration
There are quite a few similarities, but it's important to understand the differences between the two configuration formats. Just like Apache, nginx keeps its site configuration files in /etc/nginx/sites-available; activate a configuration by creating a symbolic link to it in /etc/nginx/sites-enabled.

First of all, create a server block for each virtual host.

server {
    listen 80;
    ...
}

The basic setup for running a site is similar to Apache, with a slightly different syntax:

#
# Apache
#

<VirtualHost *:80>
    ServerName mysite.com
    ServerAlias www.mysite.com

    DocumentRoot /srv/www/mysite
    DirectoryIndex index.php index.html
</VirtualHost>

#
# nginx
#

server {
    listen 80;

    server_name mysite.com www.mysite.com;

    root /srv/www/mysite;
    index index.php index.html;
}



To add specific behavior for certain requests, define a location block inside your server block. You can use regular expressions to select the affected requests:

server {
    ...

    location / {
        try_files $uri $uri/ @notfound;
    }

    location /doc/ {
        alias /usr/share/doc/;
        autoindex on;
        allow 127.0.0.1;
        deny all;
    }

    location /images/ {
        root /media/images;
    }

    location @notfound {
        rewrite (.*) /index.php?paramstring=$1;
    }

    location ~ /\.ht {
        deny all;
    }
}

This sample configuration shows some of the setup options for servers and locations. Make sure to create a rule that denies access to .ht* files, as nginx does not do this out of the box; Apache, by contrast, rejects direct access to these files automatically. Note that familiar options from Apache have counterparts in nginx: allow/deny, alias, rewrite, etc.

Please refer to the online documentation on nginx.org for further information.

Especially when you have multiple websites running on your server, and lots of requests causing high load, moving to nginx is a good decision. But multiple, differently configured websites can make the migration quite laborious. There are converters available that do the job for you, but they mainly convert a .htaccess file to an nginx config. Apache also uses configurations for virtual hosts: do not forget about these! Even if the configurations were converted by a tool, I recommend checking them manually before using them in a production environment!

Tip: install nginx as the primary HTTP server and leave Apache running on a different port. Migrate your virtual hosts one by one by creating nginx server configurations, and forward requests for not-yet-migrated websites to Apache.
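A minimal sketch of such a pass-through setup, assuming Apache has been moved to port 8080 on the same machine (the server name is a hypothetical not-yet-migrated site):

```nginx
server {
    listen 80;
    server_name legacy-site.com;

    location / {
        proxy_pass http://127.0.0.1:8080;         # hand the request to Apache
        proxy_set_header Host $host;              # preserve the original host header
        proxy_set_header X-Real-IP $remote_addr;  # pass the client address along
    }
}
```

Once a site has been fully migrated, this server block can be replaced with a native nginx configuration and the corresponding Apache virtual host retired.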


More Stories By Harald Zeitlhofer

Harald Zeitlhofer has 15+ years of experience as an architect and developer of enterprise ERP solutions and web applications with a main focus on efficient and performant business processes, usability and application design. As a Technology Strategist in Dynatrace's Centre of Excellence team he influences the Dynatrace product strategy by working closely with customers and driving their performance management and improvement at the front line. He is a frequent speaker at conferences and meetup groups around the world. Follow him @HZeitlhofer

