Java Technology On The Linux Platform

The Java 2 Platform, Standard Edition (J2SE technology) v1.3 for Linux means that Linux users and developers can take advantage of thousands of Java technology-based applications, from enterprise e-commerce infrastructure to client-side applications. It also opens up a huge emerging market for companies that already develop Java products.

The Java 2 Platform port was developed with the assistance of the blackdown.org porting group. Although Linux is a UNIX-like operating system, it has evolved at a different pace and in a different direction than other UNIX platforms, making a port of the Java platform a sizable challenge. Many important bug fixes have gone into the Linux operating system and the associated GNU libraries as a result of the work initiated by Blackdown, not just on Intel-based distributions but also on the PowerPC and SPARC platforms. In particular, the linuxthreads library has improved dramatically over the past year on multiprocessor machines.

In an ideal world, the Java 2 Platform, combined with Java development tools such as Forté Community Edition, should be all you need to develop your projects. However, if the Linux operating system is a new development environment for you and you've previously developed with the Java 2 Platform for Windows or Solaris, you might find Linux a little different. This article provides tips to bring you up to speed as quickly as possible developing on the Linux platform.

SDK and JRE Installation Tips
Installation is straightforward, but be aware that the Java Runtime Environment (JRE) and Java Software Development Kit (SDK) Red Hat Package Manager (RPM) files are installed by default to /usr/java. If you want these files in a different location, install with the rpm command:

rpm -i --badreloc --relocate /usr/java/=/usr/local/home j2sdk-1_3_0-linux.rpm

Whether you relocate the files as shown above or keep the default location, start developing by adding the bin directory of the SDK or JRE installation to your $PATH. For example, using the bash shell with the default install location, update $PATH as follows:

export PATH=/usr/java/jdk1.3/bin:$PATH

To verify that you've configured your environment, run the java -version command, and you should see a version string confirming that you've installed the Java 2 Runtime Environment:

java -version
java version "1.3.0"
Java™ 2 Runtime Environment, Standard Edition (build 1.3.0)
Java HotSpot™ Client VM (build 1.3.0, mixed mode)

Java Plug-in Installation Tips
The J2SE plug-in is embedded inside the Java Runtime directory. As long as the Netscape browser can find the plug-in library from within the Java Runtime directory, it can also find the additional runtime files it needs.

To let your Netscape browser know where the Java plug-in library is, set the NPX_PLUGIN_PATH environment variable to point to the directory where the javaplugin.so shared library is located:

export NPX_PLUGIN_PATH=/usr/java/jdk1.3/jre/plugin/i386
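
If the plug-in isn't being picked up, a quick check is to list that directory and confirm that the javaplugin.so shared library is actually there:

ls $NPX_PLUGIN_PATH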

To configure the Java plug-in properties, use the ControlPanel program located in the jre/bin directory. In the example above, the program is /usr/java/jdk1.3/jre/bin/ControlPanel.
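
For example, to launch the control panel in the background from a terminal window:

/usr/java/jdk1.3/jre/bin/ControlPanel &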

Java HotSpot Virtual Machine
This is the first release on Linux that includes the Java HotSpot Virtual Machine. Both the client and server Java HotSpot compilers are included by default in the Java Runtime Environment. The two share the same Java HotSpot runtime, but use different compilers optimized for the specific performance requirements of client-side and server-side applications. Bundling both virtual machines in one package gives developers more flexibility and convenience in deploying their client- or server-based Java technology applications.

By default the client compiler is enabled, but for intense server-side applications, the server compiler can be selected with the -server runtime option. The Java HotSpot Virtual Machine normally runs in mixed mode, as seen in the version output above. This means HotSpot dynamically compiles Java bytecodes into native code once a number of criteria have been met, including how many times a method has been executed by the interpreter. The mixed runtime mode normally gives the best performance.
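
For example, to start an application on the server compiler instead of the default client compiler, or to confirm which virtual machine is being selected (the jar name here is hypothetical):

java -server -jar MyServerApp.jar
java -server -version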

About Linux Threads
One major difference between developing on Linux and on other UNIX operating systems is the system threads library. In Java 2 releases prior to 1.3, the JVM used its own threads library, known as green threads, to implement threads in the Java platform. The green threads implementation minimized the JVM's exposure to differences in the LinuxThreads library and made the port easier to complete. The downside is that system threads on Linux are not taken advantage of, so the JVM doesn't scale well when additional CPUs are added.

In J2SE Release 1.3, the HotSpot virtual machine uses Linux system threads to implement JavaThreads. Because each LinuxThread is implemented as a cloned process, each JavaThread shows up in the process table if you run the ps command. Although this may look odd at first, it's normal behavior for threaded programs on Linux:

java -jar Notepad.jar
ps -eo pid,ppid,command

In Table 1, process id 11712 is the invoked JVM. The other processes that list 11712 as their parent process (in the PPID column) are JavaThreads implemented by the Linux system threads library. Each LinuxThread is created by a process clone operation, which leaves the task of scheduling threads to the kernel's process scheduler.
 
On Solaris, however, JavaThreads are mapped onto user threads, which in turn run on lightweight processes (LWPs). On Windows the threads are created inside the process itself. Today, creating a large number of JavaThreads is faster on Solaris and Windows than on Linux, so you might need to adjust programs that rely on platform-specific timing to allow for a slightly longer startup when they run on Linux.

Linux Debugging Tools
As you learned in the section on threads, LinuxThreads are implemented as Linux processes. This makes debugging Java programs running on Linux a little tricky, but not impossible. This section explains how to attach the gdb GNU debugger to the JVM, and how to start the JVM with gdb to run debugging commands.

Using GDB
Going back to the example in the threads section, if the JVM is already running, you can attach the system gdb tool as follows:

gdb /usr/java/jdk1.3/bin/i386/native_threads/java 11712

The output from using the gdb tool should look similar to Listing 1. At this point the gdb tool has connected to the running Java process and is waiting for user input.

With gdb attached to the JVM, you can run any gdb commands. If you've used gdb before, these commands will look familiar. The info threads command lists the LinuxThreads, the t 12 command selects thread number 12 as the current thread, and the where command lists the stack frames in that thread (thread number 12, in this example). The output from an example session is shown in Listing 2.
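
For reference, those commands look like this when typed at the gdb prompt:

(gdb) info threads
(gdb) t 12
(gdb) where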

Starting a Java Virtual Machine using gdb
Attaching gdb to an existing process is usually the easiest way to debug a deadlocked JVM; however, you might want to start the JVM from within the gdb debug tool from the outset, especially if you suspect the failure happens shortly after your application starts.

To enable the JVM to run from inside gdb on Linux, the following setup needs to be in place:
 

  1. Set the APPHOME and LD_LIBRARY_PATH environment variables before you run the gdb command. This step is required because the Java program is a shell wrapper that configures the environment for the real Java program to run, and gdb needs to run with the real binary Java program, not the shell wrapper version.
  2. Instruct gdb to ignore the signals that the Hotspot JVM uses to maintain its own state. An example of such a signal is SIGUSR1.
The next lines start the Java 2D demo inside a gdb session. First, set the APPHOME and LD_LIBRARY_PATH environment variables; this example assumes the SDK is installed in the default /usr/java directory:

export APPHOME=/usr/java/jdk1.3
export LD_LIBRARY_PATH=/usr/java/jdk1.3/jre/lib/i386/native_threads:/usr/java/jdk1.3/jre/lib/i386:/usr/java/jdk1.3/jre/lib/i386/hotspot
 

Next, start gdb. The following gdb commands load the JVM, instruct gdb to stop at the main entry point, and ignore the signals used by the HotSpot Virtual Machine. The commands can be entered after gdb has started, or placed in a .gdbinit file in your home directory. Once gdb has reached the breakpoint, add further breakpoints or enter the gdb cont command to continue:
file /usr/java/jdk1.3/jre/lib/i386/native_threads/java
break main
run -jar Java2Demo.jar
handle SIGUSR1 nostop noprint pass
handle SIGSEGV nostop noprint pass
handle SIGILL nostop noprint pass
handle SIGQUIT nostop noprint pass

Generating a Java stack trace on Linux
Because LinuxThreads are implemented as Linux processes, generating a Java stack trace on Linux is also a little different. To generate a stack trace from the terminal window that started your application, type the key sequence Ctrl-\ (Control plus backslash) to send the QUIT signal to the JVM. However, if you need to generate a stack trace from a Java application already running in the background, send the QUIT signal (signal number 3) to the launcher process id. The launcher process in this example is process id 11667. The stack trace can be generated by sending the signal via the kill command as follows:

kill -3 11667

The resulting stack trace is a snapshot of the JavaThreads and details the state of each thread. In each thread trace, a value called nid, the process id (in hexadecimal) of the cloned process the thread runs on, can be used to track down deadlocks or busy sections in your application. For more details on analyzing Java stack traces, refer to the "Debugging Applets, Applications, and Servlets" chapter in Advanced Programming for the Java 2 Platform (Addison-Wesley, 2000).
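
Because the nid is printed in hexadecimal while ps lists decimal process ids, a quick conversion makes it easy to match a thread trace to a process table entry. For example, using the bash printf builtin (the nid value shown is illustrative; it converts to the process id 11712 used earlier):

printf "%d\n" 0x2dc0
11712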

Integrating Native JNI Code on Linux
If you've used native JNI code in your Java application, it probably needs to be recompiled on Linux. Linux distributions come complete with a wealth of GNU tools for compiling C, C++, Fortran, and other languages. Some system header files and system calls may be named slightly differently on Linux, and the size of structures or global variables may be smaller than on other UNIX platforms. A quick run-through of the header files can avoid complex issues later on.

To recompile your native C or C++ code and generate the native library file, use the GNU tool gcc for C programs and g++ for C++ programs, and supply the options shown in the examples below. Finally, set the LD_LIBRARY_PATH environment variable to point to the directory where the final native library is located; a sketch of this step follows the compile commands. In these examples the native library is called libnativelib.so and is loaded from within a Java program by calling System.loadLibrary("nativelib").

GNU C/Linux

gcc -o libnativelib.so -D_REENTRANT -shared -Wl,-soname,libnativelib.so -I/usr/java/jdk1.3/include -I/usr/java/jdk1.3/include/linux nativelib.c -static -lc

GNU C++/Linux

g++ -o libnativelib.so -D_REENTRANT -shared -Wl,-soname,libnativelib.so -I/usr/java/jdk1.3/include -I/usr/java/jdk1.3/include/linux nativelib.cc -static -lc
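
As the final step, here is a minimal sketch of pointing LD_LIBRARY_PATH at the library and running the program. It assumes libnativelib.so was built in /home/user/nativelib and that the application's main class is called NativeExample; both names are hypothetical:

export LD_LIBRARY_PATH=/home/user/nativelib:$LD_LIBRARY_PATH
java NativeExample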

Desktop Differences (Copy and Paste)
If you've been using Windows or a keyboard with a copy and paste key, you may be wondering how to copy and paste text between Java programs and other desktop programs and terminals.

Linux uses a mouse-driven copy-and-paste mechanism in which mouse button one selects and copies text, and mouse button two pastes the text. This technique works for Abstract Window Toolkit (AWT) components because they use the primary selection to achieve copy and paste. Project Swing components, however, use the system clipboard for copy and paste, and most tools on the desktop, apart from the Netscape browser, don't use the clipboard.

A workaround to this limitation is to map a key or mouse button to access the system clipboard:

*VT100.Translations: #override \
<Btn3Up>:       select-end(CLIPBOARD) \n\
<Btn2Up>:       insert-selection(CLIPBOARD) \n

The lines above can be passed as a value to an X tool using the -xrm option. Alternatively, the mapping can be made accessible to the entire desktop by including it in the .Xdefaults file in the user's home directory. The command xrdb -merge $HOME/.Xdefaults will reload updates in the .Xdefaults file.

Conclusion
Using Linux to develop and deploy applications written in Java has the same benefits as developing on any other Java platform. Thousands of Java applications are available, and because compiled Java class files are cross-platform, applications built on Linux work right out of the box on Windows or any other Java-enabled platform.

In addition, the amount of Linux knowledge needed to develop or deploy Java applications is relatively small, making Linux an attractive choice as a development platform.

Resources
Java technology on Linux: http://java.sun.com/linux
Blackdown Java porting team: http://blackdown.org
 
 

As IoT continues to increase momentum, so does the associated risk. Secure Device Lifecycle Management (DLM) is ranked as one of the most important technology areas of IoT. Driving this trend is the realization that secure support for IoT devices provides companies the ability to deliver high-quality, reliable, secure offerings faster, create new revenue streams, and reduce support costs, all while building a competitive advantage in their markets. In this session, we will use customer use cases...