
vCloud Automation Center – vCAC 5.1 – Executing Scripts with the Linux Guest Agent

I’ve been getting asked a lot of questions about how to execute scripts within Linux guest systems using vCAC. There are a few components involved in executing scripts in a Linux guest OS, which I’m going to cover in this post. Those items are:

  • Linux Guest Agent
  • Custom Properties

Linux Guest Agent

The Linux guest agent provides a number of benefits if you utilize it. It is a small agent that acts very similarly to the vCAC proxy agents. When it is installed you give it the name or IP address of the vCAC server. This allows it to check in with the server when it loads on a newly provisioned machine and determine whether there is anything it needs to do. If the vCAC server has work for it, it sends the instructions and the agent executes them on the local guest operating system. The guest agent comes with a number of pre-built scripts and functions, but also allows you to execute your own scripts. Some of the features available with the Linux Guest Agent are:

  • Disk Operations – Partition, format, and mount disks that are added to the machine.
  • Execute Scripts – Execute scripts after the machine is provisioned.
  • Network Operations – Configure settings for additional network interfaces added to the machine.


This article is primarily about executing scripts with the guest agent; however, once the guest agent is installed, if you add a disk and define the proper parameters you will also be able to utilize the Disk Operations it supports. I will discuss networking in more detail in a separate article.
 

Installing the Guest Agent

1. You will need to install the Linux Guest Agent in the VMware template you will be using. In my example I’m using CentOS 6.3 x64.
 
2. You will need to get the agent into your template, either by accessing it across the network or by putting it on an ISO and mounting it to the VM. The agents are part of the vCAC installation package and are located under “LinuxGuestAgentPkgs”, where you will find agents for a number of flavors of Linux.
 
3. Once you have your agent on your template you need to install the agent package. There are generally two packages, a tar.gz as well as an RPM. I will be installing the RPM for rhel6-amd64. The specific package name is “gugent-5.1.1-56.x86_64.rpm”.
 
4. Install the package by running: rpm -ivh path/gugent-5.1.1-56.x86_64.rpm. In the image below mine was already installed, but you get the idea.
[Image: vcaclgas-1]
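For reference, here is a minimal sketch of steps 2 through 4, assuming the agent packages are delivered on an ISO attached to the VM and that the ISO keeps the “LinuxGuestAgentPkgs/rhel6-amd64” folder layout described above (the mount point is just an example):

    # Mount the attached ISO containing the vCAC Linux guest agent packages
    mkdir -p /mnt/vcac-agent
    mount /dev/cdrom /mnt/vcac-agent

    # Install the RHEL/CentOS 6 x86_64 agent package (ISO folder layout assumed)
    rpm -ivh /mnt/vcac-agent/LinuxGuestAgentPkgs/rhel6-amd64/gugent-5.1.1-56.x86_64.rpm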
5. Next we need to execute the agent installer. Go to /usr/share/gugent and run “./installgugent.sh {vcac server}”; you can use either the IP address or the hostname of the vCAC server.
[Image: vcaclgas-2]
6. You can verify the proper name/IP by running “cat rungugent.sh” and looking at the “--host=” statement on the 6th line.
[Image: vcaclgas-3]
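Putting steps 5 and 6 together, the commands look roughly like this; “vcac01.example.local” is a made-up hostname, so substitute your own vCAC server name or IP:

    cd /usr/share/gugent

    # Register the vCAC server with the guest agent
    ./installgugent.sh vcac01.example.local

    # Verify the configured server by checking the --host= argument
    grep -- '--host=' rungugent.sh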
7. We next need to rename one of the scripts included with the Linux Guest Agent due to a bug. Navigate to “/usr/share/gugent/site/CustomizeGuestOS”. In this folder you will see a script named “20_static_ip.sh”; this script is deprecated and not needed. Rename it to “old_20_static_ip.sh” by running “mv 20_static_ip.sh old_20_static_ip.sh”.
[Image: vcaclgas-4]
8. If you want to follow the example I will be giving, you should also create a folder on the template to be used as a mount point for an NFS share. I created “/Scripts” as my mount point by running “mkdir /Scripts”.
 
9. Next I created a helper script in the template that I will use to mount my NFS share. I placed my script in the root of the disk, but you can place it wherever you like. A good place for it would be in “/usr/share/gugent/site/CustomizeOS”, but it really doesn’t matter where you place it.
 
10. My script is named “repomount.sh” and looks like this: “mount -t nfs $1 /Scripts”. Nothing fancy at all.
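If you prefer a slightly more complete file, here is a sketch of what “repomount.sh” could look like; the shebang and comments are my additions, the mount command itself is unchanged:

    #!/bin/bash
    # repomount.sh - mount the NFS share passed as the first argument onto /Scripts
    # vCAC will supply the share path (IP:/volume/share) as $1 via a custom property (see below)
    mount -t nfs "$1" /Scripts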
 
11. Once you have created the script, go ahead and shut down your template machine and “Convert it to a template” in “vCenter”.
 

Configure vCAC

12. If this is a “new template” that you just created you will need to “manually” perform a “Data Collection” in “vCAC”. You can do this by navigating to “Enterprise Administrator”, selecting “Compute Resources”, hovering over the cluster the template exists in, and then selecting “Data Collection”. Once there, click “Request Now” under “Inventory”.
 
13. Once the “Data Collection” is complete your “template” will be ready for use.
 
14. If you don’t already have a blueprint that can deploy this “template” you will need to create one. See my article on “Connecting to vCenter” for instructions on creating a blueprint.
 
15. Next we are going to create a “Build Profile” that we can use to hold our properties. Navigate to “Enterprise Administrator” and select “Build Profiles”, then select “New Build Profile”.
16. Give the “Build Profile” a “Name” and then add some “Custom Properties” to the profile. We need to add the following properties and values (a complete example of the finished property set follows this list):

  • repo.path – This is a property that I made up for this example; it is not part of vCAC. The value should be the path to your “NFS” share in the format “IP:/volume/share”. This is what we are going to mount to the folder we created earlier.
  • VirtualMachine.Admin.UseGuestAgent – This tells vCAC to utilize the guest agent as part of the deployment process. The “value” should be “true“.
  • VirtualMachine.Customize.WaitComplete – This tells vCAC to wait until the vCenter Guest Customization is complete. If you do not use “Customization Specifications” you do not need this property. The “value” should be “true” if you use vCenter guest customization.
  • VirtualMachine.Software0.Name – Assign a name for the script you are going to execute. “Value” is a “Friendly Name” for your script.
  • VirtualMachine.Software0.ScriptPath – Path to your script, including the script name. You can pass parameters to your script as well. I’m passing the value of the “repo.path” property that I created earlier to my script, to be utilized as the “NFS location” to mount. The value is “/repomount.sh {repo.path}”. Note: The {} brackets are required in the value.
  • VirtualMachine.Software1.Name – I’m executing a second script, so I’m using the same property again, only with a 1 instead of a 0 this time.
  • VirtualMachine.Software1.ScriptPath – I’m executing a second script. This one is located on the NFS share that I am mounting with my first script. My script is “BashScript.sh” and it contains: echo "The script was successfully executed" > /Script_Successfull.txt
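To make the list above concrete, here is an example of what the finished property set might look like; the NFS path “192.168.1.50:/vol/scripts”, the friendly names, and the “/Scripts/BashScript.sh” location are made-up example values based on the mount point created earlier:

    repo.path                              192.168.1.50:/vol/scripts
    VirtualMachine.Admin.UseGuestAgent     true
    VirtualMachine.Customize.WaitComplete  true
    VirtualMachine.Software0.Name          Mount Script Repo
    VirtualMachine.Software0.ScriptPath    /repomount.sh {repo.path}
    VirtualMachine.Software1.Name          Run Bash Script
    VirtualMachine.Software1.ScriptPath    /Scripts/BashScript.sh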

17. Click “Ok” to save the “Build Profile”.
[Image: vcaclgas-5]
18. Next we need to assign this build profile to our “Linux Blueprint”. Go to “Enterprise Administrator”, select “Global Blueprints”, then select the “Linux Blueprint” and click “edit”.
 
19. Once the “Blueprint” opens, select the “Properties” tab, select the “Build Profile” you just created, and then click “Ok”.
[Image: vcaclgas-6]

So what does all this do?

When you request a machine, custom properties are associated with the machine. Some custom properties are what we call reserved properties, because vCAC understands them and performs actions based on them. The “VirtualMachine.Admin.UseGuestAgent” property tells vCAC that when the machine is provisioning it needs to create work items for the “Linux Guest Agent” to pick up. The “VirtualMachine.SoftwareX.Name” and “VirtualMachine.SoftwareX.ScriptPath” properties are put into the work item created by vCAC to instruct the “Linux Guest Agent” to execute those scripts.
 
In this example the first script will execute and mount the NFS share that I defined with my own property “repo.path”. Once that is complete, the second script will run and execute “BashScript.sh”, which will create a text file. The guest agent performs its tasks after the VMware Guest Customization if it is being utilized; otherwise it performs them after the “Machine Provisioned” state. Once the Linux Guest Agent has completed all of its work items it removes itself from loading on future boots and stops its service.
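If you want to confirm the result, log in to the newly provisioned guest once it reaches its final state and check that the share is mounted and that the marker file exists (the file names match the example values used above):

    # Confirm the NFS share is mounted on /Scripts
    mount | grep /Scripts

    # Confirm the second script ran and created its marker file
    cat /Script_Successfull.txt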


More Stories By Sidney Smith

Sid Smith, founder of DailyHypervisor, is considered a cloud expert in the IT field, with over 10 years of experience in virtualization, automation, and cloud technologies. He started in the industry designing and implementing large-scale enterprise server and desktop virtualization environments for Fortune 100 and 500 companies. He later went on to become a key employee at DynamicOps, the well-known creators of Cloud Automation Center. In July 2012 DynamicOps was acquired by VMware, which has adopted Cloud Automation Center as a centerpiece of its vCloud Suite of products. Sid has helped dozens of Fortune 100 and 500 enterprises successfully adopt both private and public cloud strategies as part of their IT offerings, resulting in large operational and capital savings for his customers. Sid continues to help large enterprise customers reach their hybrid cloud strategies at VMware. On DailyHypervisor you will find exclusive content that will help you learn how to adopt a successful cloud strategy through the use of VMware Cloud Automation Center, OpenStack, and other industry-recognized cloud solutions.
