Back in Action…

The spread of the novel coronavirus since the beginning of 2020 has changed life throughout the world. While public life has largely shut down, both personal and work life are increasingly taking place digitally.

It has been a few years since I blogged about my work, and what better time than now to revive it. The contours of my professional life have changed, and in my current role as Sr. Cloud Solution Architect I get the opportunity to work on a wide range of technologies, from IBM Cloud, Watson AI, and IBM Cloud Paks to just about everything under the umbrella of Big Blue.

A typical day could start with a modernization discussion leveraging IBM Transformation Advisor, hop over to analytics and AI and why Cloud Pak for Data makes sense with capabilities like data virtualization, and end with an edge computing conversation about the use cases IBM is actively working on with the newly launched IBM Edge Application Manager.

So how do you sail through the storm? My mantra in life:

You should diversify your sources of fulfillment and derive meaning from different areas, hobbies, interests, passions, and purposes. But it's important that whatever you do, you do the best you can. That's the essence of being a passionate person.

So when I am not glued to my laptop, there are plenty of things I am passionate about, and these passions help me in myriad ways.

  1. Books – the very specific ones that open a world of possibilities and transport one into different realms of the spiritual world (the likes of Autobiography of a Yogi, Death – An Inside Story, Aghora: At the Left Hand of God, Apprenticed to a Himalayan Master: A Yogi's Autobiography, Mystic Eye, Many Lives, Many Masters, and more).
Takeaway Tip 1: Read what intrigues your mind and opens it to mesmerizing experiences.

2. Cooking – a new hobby picked up over the last two years. I strive to make every meal a "piece of art" that not only ignites the taste buds but offers a visual delight as well. And hey, who says rotis have to be round? A "heart" can add some spice too 🙂

Takeaway Tip 2: Use your passion and creative skills to bring even mundane tasks to life, and see the transformation it can bring!

3. Binge-watching – currently Money Heist on Netflix, to get my heart pumping and brain ticking: the drama, intensity, violence, wit, mystery, and gripping story make it a masterpiece of flawless planning!! And yes, that's me in my red suit, long before Tokyo came into existence!!

Takeaway Tip 3: A plan that is flawless in method can turn flawed in practice – the key is to adapt along the way.

Now that I have your attention on what keeps me ticking during these quarantine days locked up at home, my goal over the next couple of weeks is to pen down my work in the security field – infusing security into the DevOps world of cloud-native containers. Yes, you got it right – DevSecOps is exactly what I shall be talking about.

Thought to Ponder: How secure is your container?


Full stack deployment using UCD – Part3

In Part 1 and Part 2 of this series, we talked about setting up the Blueprint Designer. Now, in this final post, we shall talk about how to create a blueprint.

Let's go ahead and create a new blueprint: go to the top of the homepage and click on New Blueprint.

This creates a blank .yaml file, as below.

  • Images: On the right side of the panel, we see various images, which in our case come from the AWS catalog. They are classified under various headers such as Ubuntu, RHEL, SUSE, etc.


  • Components: These are the components created in UCD for application deployment. In this case we have already created the components for deploying a three-tier application: jke.db contains the DB scripts, jke.war contains the WAR file, MYSQL Server contains the SQL server installation, and Liberty Profile has the Liberty installation. The various agent components are also created (this was covered in the last section of Part 1); depending on the OS of the VM, the appropriate agent package is automatically selected and installed.


  • Networks: This section displays the networks that have been created; one can also create new networks here.


  • Security Policy: This section lists the inbound and outbound port and hostname rules that have been opened. One can assign the required group.


Now let's go ahead and create the blueprint by dragging and dropping images.

  1. In my case I have added two images – RHEL and Ubuntu. The RHEL machine will be used as the AppServer, so within the AppServer machine one can see two components – Liberty (which installs Liberty) and the WAR file.
  2. On the other machine, the DB server, I have dragged the MYSQL Server component (which installs the DB server) and jke.db.
  3. Next, a network is added and both VMs are joined to the same network.
  4. A security group is attached to both.
  5. A deployment sequence, which tells in what order the components should be deployed, is added to each component.

With this our blueprint is ready, and we are ready to provision.
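Under the covers, the blueprint is an OpenStack HEAT (HOT) template. As a rough hand-written sketch only (the resource names, image names, and flavors here are placeholders, not what the designer actually generates), the two-VM layout above corresponds to something like:

```yaml
heat_template_version: 2013-05-23
description: Two-tier JKE application - AppServer and DB server on one network

resources:
  app_network:
    type: OS::Neutron::Net

  AppServer:
    type: OS::Nova::Server
    properties:
      image: rhel-7          # placeholder: RHEL image from the AWS catalog
      flavor: m1.medium      # placeholder flavor
      networks:
        - network: { get_resource: app_network }

  DBServer:
    type: OS::Nova::Server
    properties:
      image: ubuntu-16.04    # placeholder: Ubuntu image from the AWS catalog
      flavor: m1.medium
      networks:
        - network: { get_resource: app_network }
```

The UCD components and the deployment sequence are layered on top of such a skeleton by the designer.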



Click on the Provision button at the top, add the details of the required configuration in the pop-up by selecting the right parameters, and go ahead.

Once provisioning starts, one can see the various activities and their status.

What this does is first provision the images, then install agents on each of those machines. Next, it creates the resource tree in UCD and maps the required components to the agents. When all this is done, it starts deploying the components in the order specified.


If you log into the AWS console, you can see the two servers – the AppServer VM and the DB Server VM – up and running.



Now fetch the IP of the AppServer from the AWS console and access the app via <ip>:9080.


So in this series, we saw how to do a full-stack deployment of an application: provisioning the infrastructure, installing the middleware, and then installing the application.

Full stack deployment using UCD – Part2

In this blog, we shall look at integrating with AWS.

In the Blueprint Designer, under Administration, select Clouds and create a cloud connection by clicking on 'Add New Cloud'.

In the Edit tab, give the connection a name and select the 'Amazon Web Services' cloud type. Uncheck "Use default orchestration engine" and enter the Identity and Orchestration endpoints, replacing {hostname} with the fully qualified name of the server hosting the OpenStack Identity and HEAT services. Click Save.


Now, under Users, create a new realm named OpenStack and provide the Identity realm, Orchestration URL, and admin credentials as below. Once Test Connection is successful, we can go ahead and import users.

Openstack Realm

Next, under Users, select the OpenStack realm created above and import the users – as seen, the users admin and heat are imported.


Now that we have the users imported, we will focus on assigning roles to the users and adding them to a team.

In the Authorization field, one can add the Amazon Access ID and secret and test the connection.


The Cloud Discovery service provides information about available images, regions, and flavors to the blueprint design server. This service runs with a default set of options, but you can customize these options so that only your images, regions, and flavors are shown in the blueprint designer, as described in the product documentation. In my case, I left the settings at default to show all images.

We are all set now. On the home page, clicking on Clouds displays the various regions, and we can select the specific region where we would like to work.


In my next and final blog, we shall see how to create the blueprint.

Full Stack Deployment using Urbancode – Part1

It's been a while since I blogged, but a recent customer use case for full-stack deployment got me to have a look at IBM UrbanCode Blueprint. This blog is a guide to help you design and deploy a full-stack environment for multiple clouds.

Teams often struggle to deploy applications in various environments – it is a lot of work getting an environment set up before we can even start to deploy the application. With cloud coming into play, and with IBM UCD plus Blueprint, the one-click deployment story comes to life. Blueprint helps provision cloud environments and deploy application components to those environments; to learn about its high-level features, see the product documentation.

In this scenario, we model the application deployment process in UCD and the environment provisioning in Blueprint. When you connect the Blueprint Designer with UCD, you can provision application environments and install components on those environments.

The general process includes the following steps:
1. Connect Blueprint to the required cloud system.
2. Integrate the Blueprint Designer with UCD.
3. In UCD, create component processes to deploy the components.
4. In the Blueprint Designer, create a pattern/HEAT template that includes the infrastructure components as well as the UCD components and their component processes.
5. Finally, provision the blueprint.

Blueprint supports two methods of provisioning resources on the cloud:

  1. Virtual system patterns (used with IBM SmartCloud Orchestrator, PureApp, IBM Workload Deployer)
  2. OpenStack HEAT (AWS, SoftLayer, Azure, Google Cloud)

My first goal was to try AWS, so OpenStack HEAT is what I explored.

One can either extend an existing OpenStack installation or use the set of packaged OpenStack services that ship with Blueprint. Since I did not have OpenStack, I chose to install the one that came with the Blueprint Engine:

  • Orchestration (HEAT) – used to communicate with clouds like AWS, Azure, and SoftLayer
  • Identity (Keystone) – needed by the Orchestration service to authenticate with the cloud

The Identity service objects are only used for the cloud connections, whereas the identity and authentication of Blueprint Designer users is typically managed through an enterprise directory.

Installation Topology (Version 6.2.4):

I installed UrbanCode Deploy and the License Server on a Windows box, and the Engine and Blueprint Server on RHEL 7 (the Engine very specifically needs this version of the OS). The installation instructions in the documentation center were fairly detailed. I reused the MariaDB that was installed during the Engine installation for the Blueprint Server as well. By the way, after a couple of trials, I figured out that the MariaDB JAR version needed was mariadb-java-client-1.6.2. Make sure you have the required rules set up to open the ports; I had this setup on AWS and defined security rules to open 8443, 8080, 8004, 7918, 3306, 5000, 27000, 22, and 8000.
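Since my servers ran on AWS, those security rules are just EC2 security-group ingress rules. A small sketch (the security-group ID is a placeholder, and this only generates the AWS CLI calls rather than executing them):

```shell
# Placeholder security-group ID; substitute your own.
SG_ID="sg-0123456789abcdef0"
PORTS="8443 8080 8004 7918 3306 5000 27000 22 8000"

# Emit one authorize-security-group-ingress call per required port.
gen_ingress_rules() {
  for port in $PORTS; do
    echo "aws ec2 authorize-security-group-ingress" \
         "--group-id $SG_ID --protocol tcp --port $port --cidr 0.0.0.0/0"
  done
}

gen_ingress_rules
```

Piping the output through sh would apply the rules, assuming the AWS CLI is installed and configured with the right credentials.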

Now that the setup is ready, to integrate the tools follow the 'Connecting UCD to Blueprint' documentation.

  •  UCD

In the URL field I supplied https://<Blueprint designer URL>:8443/landscaper, selected "Always authenticate for each UCD user", and provided the default UCD user, which is "ucdpadmin".


  • Blueprint

In Blueprint, provide the UCD server URL https://<Server hostname>:8443, port 7918, token-based authentication, and the token created in UCD (in UCD, go to Settings > Tokens and click Create Token).


  •   Agents

Now, I was wondering how the agents get installed automatically – during the installation process I had missed the agent part (see step 3: "From a Linux or Windows computer, install the agent components on the server"). The cloud agents provided there are added as components to UCD; these are now deployable components whose deployment is automated at provisioning time. The cloud agent package also adds an example component and application to UCD.

So now we have UCD and Blueprint connected, and we have the UCD agent components on the UCD server. In my next blog I shall talk about connecting to the cloud. Stay tuned!

Delivery Pipeline deploying Node app to IBM Container binding to Bluemix Service

If you have stumbled on this post, I believe you are fairly acquainted with Docker and IBM Containers on Bluemix. The scenario I am penning here showcases how to build and deploy an application on IBM Containers in Bluemix through the Delivery Pipeline, and have it communicate with Bluemix services.

Now, if you are familiar with Bluemix, you know that it comes with boilerplates – applications (along with code) that can be deployed to Bluemix Cloud Foundry runtimes. Instead of writing a whole new app, I shall reuse one of these and show deployment to IBM Containers and connection to a Bluemix service. I decided to go with the Node.js Cloudant Web Starter, deploying the same app to an IBM Node container and having it communicate with the Bluemix Cloudant service.

1. From the Bluemix dashboard, create a web application using the 'Node.js Cloudant Web Starter'. I named mine 'samplenodeCFapp', and I now have my CF app running.


2. To get the source code of this app, I click on 'Add Toolchain' (Toolchain is experimental at this stage, and as a beta user I have access to it; for others it would be the 'Add Git' button). Toolchain is a really cool capability in Bluemix that plugs various DevOps tools together for a project's requirements. At this stage Toolchain has a handful of features, which are expected to grow; we shall explore toolchains in another blog entry.

So, having said that, my toolchain is created as below.


The fascinating part is that at the click of a button I have the whole DevOps toolchain plugged, wired, and ready for use: the source code of the application set up in an SCM repository (GitHub), issue tracking set up, the web-based Eclipse Orion IDE for coding online, and the Delivery Pipeline, which lets you build and deploy applications. Neat, ain't it?

Let's go and check out the Delivery Pipeline – we see that it is created as below. (This currently deploys to CF; what we are interested in now is building the same for IBM Containers.)


So what we need to add is the ability to run a Docker Build, which creates the Docker image, and then a Docker Deploy, which deploys the container.

Build Docker :

I click on Add Stage, name my stage 'Build Docker', and in the Input tab select Input Type 'SCM repository' and Trigger 'Run jobs whenever a change is pushed to Git'.

In the Jobs tab, I add a Build job and select Build Type 'IBM Container Service'. I name my image 'samplenodedockerapp' and in Ports add 80, 443, 3000. I ensure the right target/organization/space where I want to deploy my container is selected.


 Deploy Docker

I click on Add Stage and name my stage 'Deploy Docker'.

In the Input tab, select Input Type 'Built Artifacts', Stage 'Build Docker' (the stage created earlier), Job 'Build', and Stage Trigger 'Run jobs when the previous stage is completed'.


In the Jobs tab, select Deployer Type 'IBM Containers on Bluemix', name 'samplenodecontainer', and ports '80, 443, 3000'. I ensure the right target/organization/space where I want to deploy my container is selected.


Environment Tab


In the Environment tab, we can specify which application we want to bind to and the size of the container.

In order to bind services to a container, one can bind the services to a dummy CF app – the service credentials then get stored in VCAP_SERVICES.

Since the services I want to bind to this container are bound to the app 'samplenodeCFapp', I specify the same.

The app we write basically parses VCAP_SERVICES to get information about the services.
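As a sketch of that parsing (the service label cloudantNoSQLDB and the fake credentials below are assumptions for illustration; inside the container the variable is injected by Bluemix from the services bound to the referenced app):

```shell
# Fake VCAP_SERVICES for illustration only; Bluemix injects the real one.
export VCAP_SERVICES='{"cloudantNoSQLDB":[{"credentials":{"url":"https://user:pass@host.cloudant.com"}}]}'

# Pull the Cloudant URL out of the JSON (python3 used here as the JSON parser).
CLOUDANT_URL=$(echo "$VCAP_SERVICES" | python3 -c \
  'import json, sys; print(json.load(sys.stdin)["cloudantNoSQLDB"][0]["credentials"]["url"])')

echo "$CLOUDANT_URL"
```

The Node.js starter app does the equivalent in JavaScript with JSON.parse(process.env.VCAP_SERVICES).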

With all the required configuration done, the delivery pipeline looks as below:


Now, one key piece that must be in place for the Build Docker stage is the Dockerfile, a text document that contains all the commands a user could call on the command line to assemble an image.

I go back to my toolchain, open the web editor, and add a Dockerfile with content as below.


# Base image: the default Bluemix ibmnode image (registry path is an assumption)
FROM registry.ng.bluemix.net/ibmnode

# Copy the app source into /node and install its dependencies there
ADD . /node
RUN cd /node && npm install

# Ports the app listens on
EXPOSE 80 443 3000

ENTRYPOINT ["node", "/node/app.js"]

On the Bluemix dashboard we have default IBM Container images like ibmnode – we shall deploy our sample Node.js application on top of this image. So in my Dockerfile, I pull the default node image, copy my content to the /node folder, and set the command to run as the ENTRYPOINT. I also mention the ports to expose.

My file is as below


Commit and push this file by clicking on the 'Git' icon on the left sidebar. The moment that is done, one can see that the delivery pipeline is activated and running.


(One can ignore the first two stages – those deploy the CF app that was part of the boilerplate.)

Once the Build Docker stage passes, one can go to the Bluemix dashboard and click on 'Start Containers' to check that the image is now present.


Once the Deploy Docker stage in the Delivery Pipeline completes, it creates an instance in the space specified:

IBM Container

Open your browser and enter http://<public ip>:3000, and your Node application is now accessible on the IBM Container.

So in this article we saw how to deploy a Node.js app to an IBM Container and have it communicate with the Cloudant DB service in Bluemix.




Static Code Analysis leveraging Sonarqube in IBM Bluemix DevOps Services

It's been a while since I updated my blog – in fact, a year!!! I know I should be more regular writing these technical articles; I hope this year has more updates from me. Anyway…

The topic today is static code analysis. In many a customer engagement on DevOps on Bluemix, I have been asked how to do static code analysis through the Delivery Pipeline on Bluemix. This is not something available out of the box on Bluemix. I normally suggest Kiuwan, which my fellow IBMer Amano-san and I explored and wrote an article about, but most of the time customers are already using SonarQube, which is quite popular, and would like to leverage the same.

Before I get into SonarQube, a quick brief on what static code analysis is all about.

Static code analysis is a method of computer program debugging that is done by examining the code without executing the program. The process provides an understanding of the code structure and can help ensure that the code adheres to industry standards.
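As a tiny illustration of the idea (a shell example, not from the Java project analyzed later): a static analyzer such as ShellCheck can flag the unquoted variable below purely by reading the code, because the word-splitting bug only bites at runtime:

```shell
file="my report.txt"

# Unquoted: the value word-splits into two separate arguments.
set -- $file
echo "unquoted gives $# arguments"   # 2

# Quoted: the value stays a single argument -- the fix an analyzer suggests.
set -- "$file"
echo "quoted gives $# arguments"     # 1
```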

There is a wide range of tools that do static code analysis, and this link gives a pretty good overview. SonarQube is one of them.

With that brief introduction, let's get started.

First, I needed a SonarQube server on the cloud so that my build in the Delivery Pipeline could talk to SonarQube. Having played around with Docker for a while, choosing a SonarQube image from Docker Hub was the natural choice. I picked version 4.5.7 of SonarQube.

The following command copies an image from Docker Hub to Bluemix IBM Containers directly:

cf ic cpi source_repository/source_image_name private_registry_URL/destination_image_name

I execute :

cf ic cpi sonarqube:4.5.7 registry.ng.bluemix.net/<my_namespace>/sonarqube:4.5.7

This copies SonarQube version 4.5.7 from Docker Hub to Bluemix IBM Containers.

Now, from the Bluemix dashboard, I go ahead and create a container instance and assign a public IP to it. Voilà – I now have my SonarQube server running and can access it at http://<my ip>:9000/. Docker indeed makes life a breeze – I shall write more on Docker, but that is a topic for another day.

Now my task was to show running static code analysis from the Delivery Pipeline. I decided to leverage the Java Cloudant Web Starter boilerplate application in Bluemix for this.

Once the application is created on Bluemix, I click the 'Add Git' button, which creates a project for me along with the Delivery Pipeline setup.

This project is built using Ant, so I looked up the Sonar scanner for Ant. I downloaded the required sonarqube-ant-task.jar file and placed it in lib/sonar as shown in the screenshot.


Next, in the build.xml file, I need to make a couple of changes:

(i) Add the following properties (replace the IP of your Sonar instance in sonar.host.url and sonar.jdbc.url):

<property name="sonar.home" value="lib/sonar"/>
<property name="sonar.projectKey" value="org.sonarqube:java-simple-ant" />
<property name="sonar.projectName" value="Simple Project for Ant" />
<property name="sonar.projectVersion" value="1.0" />
<property name="sonar.language" value="java" />
<property name="sonar.sources" value="src" />
<property name="sonar.binaries" value="bin" />
<property name="sonar.sourceEncoding" value="UTF-8" />

<property name="sonar.host.url" value="http://<IP of Sonar Instance>:9000" />
<property name="sonar.jdbc.url" value="jdbc:h2:tcp://<IP of Sonar Instance>:9092/sonar" />
<property name="sonar.jdbc.username" value="sonar" />
<property name="sonar.jdbc.password" value="sonar" />
<property name="ant-contrib.jar" value="lib/ant"/>

(ii) Add the classpath and then define the sonar target as below:
<path id="sonar.classpath">
<pathelement location="${sonar.home}"/>
<fileset dir="${sonar.home}">
<include name="*.jar"/>
</fileset>
</path>

<target name="sonar">
<taskdef uri="antlib:org.sonar.ant" resource="org/sonar/ant/antlib.xml" classpathref="sonar.classpath" />
<sonar:sonar xmlns:sonar="antlib:org.sonar.ant"/>
</target>

(iii) Make sure you change the build target to include sonar, as below:

<target name="build" depends="build-project,sonar,build-war"/>

With these changes, we are ready to go. Once you have made them all, commit and push to the Git repo. We shall then see that a build is automatically triggered and succeeds.


One can click on 'View logs and history' to check the details of the run, as below:


I see the Sonar static code analysis being performed as part of the build, and the report published to SonarQube as below.


There is our Sonar report, pushed from the build to the SonarQube server! One can see that 355 lines of Java code were analyzed, and 5 critical issues, 38 major issues, etc. were identified.


One can click on an issue and drill down further into what the issue is, as shown below:


I hope this helps you get started with SonarQube through the Delivery Pipeline in Bluemix.

Also, as mentioned at the start, one can explore static code analysis via the Kiuwan integration with DevOps.

Virtual Machines on IBM Bluemix

Hey, I am back – and I have just moved to a new team focusing on Bluemix, IBM's PaaS. Bluemix enables organizations and developers to quickly and easily create, deploy, and manage applications on the cloud.

In this blog we explore VMs in Bluemix. IBM Bluemix provides Virtual Machines infrastructure as a beta feature, currently available in the US-South region. You can use this infrastructure to create virtual machines running in public and private (on-premises) OpenStack clouds. To know more, one can go through the documentation.

I have been working with customers who are evaluating this feature, and many of them have very similar questions and difficulties getting around it. Hence I am putting up this blog, which details some of the common scenarios most folks were playing around with, to help others get started:

  1. Creating a VM image
  2. Accessing the created image through SSH
  3. Adding a new user to that image and logging in as the new user
  4. Pushing artifacts to the new VM using WinSCP
  5. UI access to the VM image

1. Creating a VM Image

(i) From the Bluemix dashboard, click on Virtual Machines and then Create VM Image.


(ii) On the Create Virtual Machine screen, there are multiple flavors to choose from. I decided to create an Ubuntu VM, so I select Ubuntu from 'Image to launch', add a VM group name, and then the security key.


To access this VM instance after it has been created, one needs an SSH key. There are multiple ways to create one; you could follow the steps in the Bluemix VM image documentation. I did not use PuTTYgen; I used the Windows ssh-keygen command-line alternative instead.

On my Windows command prompt I ran the command below, and it prompted me for a passphrase. Enter any passphrase and remember it, as it will be used while logging in later.

ssh-keygen -t rsa -f cloud.key

This command generates a pair of keys:

  • a private key that you keep: cloud.key
  • a public key: cloud.key.pub

So I imported the public key by clicking on Add New Key: I gave it a name, copied the contents of cloud.key.pub into the 'Public Key to Import' section, and clicked the Create button.

(iii) The image is created in a minute or so, and a public as well as a private IP is assigned, as shown below.


2. Accessing the created image through SSH

(i) On the command prompt (in the location where the private SSH key is kept), I type ssh -i <keyname> ibmcloud@<public IP> (keyname is the private key generated in step 1, ibmcloud is the default user on all IBM VM images on Bluemix, and the public IP is the IP starting with 129, as shown in the screenshot above):

ssh -i cloud.key ibmcloud@<public IP>

It will ask for the passphrase; enter the passphrase supplied in section 1 and voilà, we are in.

3. Adding a new user to that image and logging in as the new user

Let's add a new user now.

(i) Get root access:

sudo bash

(ii) To add a new user, run adduser <username>; it will ask you to enter a password as well as a couple of other pieces of information.

adduser smith

(iii) Edit the sshd_config file to allow the newly added user to log in via SSH: edit the last line in sshd_config, which says AllowUsers, add the user created above, and save the file.

vi /etc/ssh/sshd_config
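After the edit, the relevant line would look something like this (assuming the image ships with an AllowUsers line already listing ibmcloud):

```
AllowUsers ibmcloud smith
```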


(iv) Restart the SSH service:

service ssh restart

(v) Now we are ready to log in as the new user. Open a new command prompt and enter ssh <username>@<public IP>; it will prompt for the password supplied while creating the user.

ssh smith@129.xx.xx.xx

We are now logged in as the new user.

4. Pushing artifacts to the new VM using WinSCP

We may need to transfer files to the VM, and we can use WinSCP here.

(i) The newly added user smith can easily log in through WinSCP by entering the user ID and password.


(ii) ibmcloud can also log in through WinSCP: enter the username ibmcloud, leave the password blank, click the Advanced button, and on the SSH tab, under Authentication, enter the .ppk key.


If you created the key using OpenSSH, it needs to be converted into .ppk format, and one can use PuTTYgen for the conversion (in PuTTYgen, click Conversions -> Import key, import the private cloud.key generated in step 1, and then save the private key in .ppk format).

5. UI access to the VM image

Finally, our last scenario in this blog: UI access. There are multiple ways one could do this; here I am exploring xfce4.

(i) Log into the VM using the SSH key:

ssh -i cloud.key ibmcloud@129.x.x.x

(ii) Log in as superuser:

sudo bash

(iii) Install the required packages. By default, most Linux server installations do not come with a graphical desktop environment, so you need to install one you can work with. In this example, we will install XFCE4:

apt-get install gnome-core xfce4 firefox
apt-get install vnc4server
apt-get install xfonts-base

(iv) To make sure the VNC server is installed, type this command:

dpkg -l | grep vnc

The output lists the installed VNC packages:


(v) Now my goal is to give VNC access to the user smith created in section 3:

su - smith

One then needs to set the password that will be used to access the VNC server.


(vi) I back up the xstartup file before making changes, in case I need it later, and then edit it:

cp ~/.vnc/xstartup ~/.vnc/xstartup.bak
vi ~/.vnc/xstartup

And I edit the file as below:
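The edited file is shown in the original post as a screenshot; a typical xstartup for an XFCE session (an assumption here, matching the packages installed above) looks like:

```shell
#!/bin/bash
# Load X resources (if any) and start the XFCE desktop session.
xrdb $HOME/.Xresources
startxfce4 &
```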


(vii) Next, I stop the VNC server instance that is running on port 5901:

vncserver -kill :1

(viii) Now, to easily control the VNC server, let us set it up as an Ubuntu service. Create the file below and paste the contents into it:

vi /etc/init.d/vncserver

#!/bin/bash
# /etc/init.d/vncserver : start/stop VNC sessions for the users listed
# in /etc/vncserver/vncservers.conf
[ -f /etc/vncserver/vncservers.conf ] && . /etc/vncserver/vncservers.conf
prog=$"VNC server"

start() {
 . /lib/lsb/init-functions
 echo -n $"Starting $prog: "
 ulimit -S -c 0 >/dev/null 2>&1
 for display in ${VNCSERVERS}; do
  export USER="${display##*:}"
  if test -z "${REQ_USER}" -o "${REQ_USER}" == "${USER}"; then
   echo -n "${display} "
   DISP="${display%%:*}"
   export VNCUSERARGS="${VNCSERVERARGS[${DISP}]}"
   su ${USER} -c "cd ~${USER} && [ -f .vnc/passwd ] && vncserver :${DISP} ${VNCUSERARGS}"
  fi
 done
 echo
}

stop() {
 . /lib/lsb/init-functions
 echo -n $"Shutting down VNCServer: "
 for display in ${VNCSERVERS}; do
  export USER="${display##*:}"
  if test -z "${REQ_USER}" -o "${REQ_USER}" == "${USER}"; then
   echo -n "${display} "
   su ${USER} -c "vncserver -kill :${display%%:*}" >/dev/null 2>&1
  fi
 done
 echo -e "\n"
 echo "VNCServer Stopped"
}

case "$1" in
 start)
  start $@
  ;;
 stop)
  stop $@
  ;;
 restart|reload)
  stop $@
  sleep 3
  start $@
  ;;
 condrestart)
  if [ -f /var/lock/subsys/vncserver ]; then
   stop $@
   sleep 3
   start $@
  fi
  ;;
 status)
  status Xvnc
  ;;
 *)
  echo $"Usage: $0 {start|stop|restart|condrestart|status}"
  exit 1
esac

(ix) To ensure that the VNC server can use this new startup file properly, we need to grant it executable privileges:

chmod +x /etc/init.d/vncserver

(x) Now it's time to start the service, which starts a new VNC server instance:

service vncserver start

(xi) In VNC you can define a separate access session for each user. For this we will use the VNC configuration file:

mkdir -p /etc/vncserver
vi /etc/vncserver/vncservers.conf

(xii) To configure VNC for the user "smith", insert the following lines into the /etc/vncserver/vncservers.conf file (the VNCSERVERS line maps display 1 to the user):

VNCSERVERS="1:smith"
VNCSERVERARGS[1]="-geometry 1024x768"

Here your port comes to 5901 (5900 plus display number 1), with a 1024x768 resolution for the VNC client; you can choose a resolution of your own.

(xiii) Next, I add it to the boot startups:

update-rc.d vncserver defaults 99

(xiv) Finally, reboot your server.


(xv) To test your VNC server: since I am on Windows, I downloaded VNC Viewer.

Now I connect to the VNC server through my VNC client, and I am on my last step. (I added an entry for the hostname of the VM to the hosts file on my local machine, mapping it to its IP.)


And there you go… we have finally logged into the image!


Oh yes, I disabled the firewall in between (not sure, though, if that was needed):

sudo ufw disable

Release Management Woes?

Release management is the process of managing software releases from the development stage through to release. Releases can be at different levels: enterprise, line of business, or application.

Release collaboration: Release managers need to bring together expertise from multiple application teams, testers, and operations. They coordinate work across departments, often with inconsistent process flows. Let us look at the various phases.

Release planning is a manual, labor-intensive process. There has been widespread adoption of tools for defect management, testing, deployment, and so on, but release management in many organizations is still done using Excel. Yes, Excel is simple, and you don't really need to teach anyone how to use it. But as the number of applications participating in a release grows, the size of the spreadsheet grows too. I have seen customers with release plans in Excel running to more than two thousand lines – now that is one mammoth plan!! For an enterprise-level release with multiple teams participating, it gets extremely difficult to gather inputs from different stakeholders. While planning an enterprise release, release managers also face the challenge that no standard process is followed across lines of business, so there is limited visibility into dependencies, which can lead to sub-optimal plans. Gaps and errors in plans cause unwanted production issues. The cost of planning the release is very high: multiple meetings are held to sort out and seek clarification on dependencies, a lot of email is exchanged, and the chance of things falling apart remains high. Additionally, the release manager needs to account for enterprise blackouts, holidays, and other releases – and referring to different tools for all this is painful.

Release implementation: the day of the release finally arrives (it is called the release weekend), and the office is bustling. Hundreds of people need to report for duty and be available round the clock, and a conference call runs throughout, with people providing constant updates. How does an executor know when to start their task – should they keep waiting for an email from the duty manager? How do they identify tasks that must run sequentially? Does the duty manager have real-time visibility into which tasks have not started yet, which are running late, and their actual status? Many a time the duty manager gets task updates over email from the people executing them and has to collate status from various teams to send high-level reports to stakeholders at predefined points. There is a lack of coordination and a lot of waste. Poor integration between systems also leads to rekeying data or a lack of visibility into release status. The key point is that real-time status of the release is missing. Another key aspect: troubleshooting issues becomes difficult because it is unclear which application versions are deployed to which environments.

Finally the release weekend is over, and the teams are faced with the task of post-release tracking. Teams struggle with the inability to apply learnings to future releases. For example, if additional steps were required while running a task during release implementation, how are those captured and leveraged in future releases?

We have a solution to address these typical release management problems as part of the IBM DevOps solution. IBM UrbanCode Release is an intelligent, collaborative release management solution that replaces error-prone manual spreadsheets and streamlines release activities for application and infrastructure changes. The key capabilities of UrbanCode Release are:

  1. Release Orchestration – Coordinates changes across multiple applications through the release pipeline.
  2. Deployment Plans and Templates – Makes creating a deployment plan an easier and more collaborative experience.
  3. Release Gates – Automatic promotion rules to speed changes through the pipeline.
  4. Calendar View – Ability to schedule events such as holidays, blackout dates, and enterprise release dates.
  5. Environment Management – Easy identification and reservation of available environments.
  6. Pipeline Visualization – Makes it easy to see which versions of each application are in each environment.
  7. Federated Dashboard – Provides an overview of progress across multiple releases.
  8. Impact Analysis – To view the relationships between applications, changes (actual work items), and initiatives (projects) related to a release (integration with Rational Team Concert and JIRA).
  9. Integration with UrbanCode Deploy – Triggers automated tests in UrbanCode Deploy.
  10. Real-time status for long-running production deployments.

To know more about UrbanCode Release, here are some recommended videos:

Deploying a .Net application to IIS using UrbanCode Deploy via msdeploy

I want to showcase a sample .NET application deployment using IBM UrbanCode Deploy. I decided to pick Employee-Info-Starter-Kit, an open source ASP.NET project template that uses a single database table, 'Employee'; this seemed like a perfect example. In this blog entry, I walk through a scenario where this .NET application gets developed and then deployed to IIS using the msdeploy plugin, and I also showcase how the SQL scripts get deployed using the JDBC driver plugin of UrbanCode.

Local setup:

  • Visual Studio 2012
  • SQL Server 2014
  • IIS 7

Target Machine:

  • IIS 7
  • SQL Server 2008 R2

Server:


Section1: Sample application development in Visual Studio

1. I followed the instructions here to do the installation.

2. Clicking on Eisk.Web.csproj opens the project in Visual Studio, after which I made a couple of changes as listed below:

(i) I edited the connectionString line in web.config as below (basically changed Data Source to point to my local SQL Server (SMITH\SQLEXPRESS) and set Integrated Security=True, as I was using Windows authentication):

<add name="DatabaseContext" connectionString="metadata=&quot;res://*/App_Logic.Entity Model.DatabaseContext.csdl|res://*/App_Logic.Entity Model.DatabaseContext.ssdl|res://*/App_Logic.Entity Model.DatabaseContext.msl&quot;;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=SMITH\SQLEXPRESS;Initial Catalog=EmployeeInfo_SK_5_0;Integrated Security=True&quot;" providerName="System.Data.EntityClient" />
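The provider connection string embedded above is just a semicolon-separated list of key=value pairs, so a tiny parser makes it easy to double-check the Data Source and Integrated Security edits. This is an illustrative Python sketch, not part of the starter kit:

```python
# Illustrative helper (not part of Employee Info Starter Kit): split a
# SQL Server provider connection string into a dict for quick inspection.
def parse_conn_string(s: str) -> dict:
    pairs = (p.split("=", 1) for p in s.split(";") if p.strip())
    return {k.strip(): v.strip() for k, v in pairs}

props = parse_conn_string(
    r"Data Source=SMITH\SQLEXPRESS;"
    r"Initial Catalog=EmployeeInfo_SK_5_0;"
    r"Integrated Security=True"
)
```

After parsing, `props["Data Source"]` should show the server you actually intended, which is an easy thing to get wrong when editing web.config by hand.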

(ii) I also created a file Setup.sql in Visual Studio under C# -> App_Data -> SQL -> Extras with the content below.

-- =============================================
-- Script Template
-- =============================================

USE [EmployeeInfo_SK_5_0];

IF EXISTS (SELECT * FROM master.dbo.syslogins WHERE loginname = 'NT AUTHORITY\NETWORK SERVICE')
    PRINT 'Login exists...no need to re-create!';

IF NOT EXISTS (SELECT * FROM sys.database_principals
               WHERE (type = 'S' OR type = 'U') AND name = 'NT AUTHORITY\NETWORK SERVICE')
    EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE';
ELSE
    PRINT 'User exists...no need to re-create!';

(iii) Also, when I ran certain SQL files through UrbanCode Deploy, it threw the error "Error Executing SQL Scripts: Incorrect syntax near '»'". This was not visible when I opened the files in Visual Studio or even Notepad; it is due to encoding. I opened each SQL file in Visual Studio, clicked 'File' -> 'Advanced Save Options' and changed the encoding to Western European (Windows) - Codepage 1252. This solved the problem.
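If you have many SQL files to fix, re-saving them one by one in Visual Studio gets tedious. Here is a hypothetical batch helper in Python that re-saves every .sql file in a folder as Windows-1252; the assumption that the sources are UTF-8 (possibly with a BOM) is mine, so adjust the source encoding to match your files:

```python
# Hypothetical cleanup helper: re-save every .sql file in a folder as
# Windows-1252 so the "Execute SQL Scripts" step does not trip over a
# UTF-8 BOM or stray non-ASCII characters. Assumes the sources are UTF-8.
import pathlib

def reencode_sql(folder: str, target: str = "cp1252") -> list:
    converted = []
    for path in sorted(pathlib.Path(folder).glob("*.sql")):
        text = path.read_text(encoding="utf-8-sig")   # strips a BOM if present
        path.write_text(text, encoding=target, errors="replace")
        converted.append(path.name)
    return converted
```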

3. After making all the above changes, I select Eisk.Web, which is my project, and press Ctrl+F5.


I now see that my application is built as above and launched in the browser.


4. I change 'Your database server address' from localhost to smith\sqlexpress, which is my database address, and click on Test Connection.

The message "Connection passed. Click the Create Database button to create database." is displayed.


5. I click on 'Create Database'.

The message "Database created successfully. Click the 'Install Schema and Data' button to install the database." is displayed.


6. I click on ‘Install Schema and Data’.

The message "Congratulations! Database installation successful. Click here to start using Employee Info Starter Kit" is displayed.


7. On clicking the link, we now have the application up with data as below.


Now that I know it works fine, I want to create a deployment package so that I can deploy the same application on another machine.

Section2: Creating a Deployment Package from Visual Studio

1. I select Eisk.Web, which is my project, right-click on it, and choose the Publish option.

2. In the Publish Web window, under 'Select a publish target' I choose Custom and enter the Profile Name Profile1. I enter the details below under Connections:

  • Publish Method: Web Deploy Package
  • Package Location: C:\DemoUCD\DemoMicrosoft\package\
  • Site/Application: Default Web Site/EiskApp

Click Next

3. Under Settings I enter:

  • Configuration: Release – Any CPU
  • DatabaseContext: click on the button with "…"

On the Destination Connection String screen:

  • Enter Server Name: SMITH\SQLEXPRESS and click on Refresh
  • Click on the Use Windows Authentication radio button
  • Under Select or enter a database name, enter EmployeeInfo_SK_5_0
  • Click on Test Connection to validate if required
  • If everything is fine, click OK
  • The DatabaseContext is automatically set to SMITH\SQLEXPRESS;Initial Catalog=EmployeeInfo_SK_5_0;Integrated Security=True
  • Ensure that Use this connection string at runtime (update destination web.config) is checked
  • I kept Update database unchecked (one could have this option checked as well, but I wanted to run through a scenario showcasing the SQL-JDBC plugin of IBM UrbanCode Deploy being used to run the SQL scripts)

Click Next and click Publish

The output directory contains the files:



Section 3: Deployment using IBM UrbanCode Deploy

1. I create a component Employee-component in UrbanCode Deploy and, for this scenario, set Source Config Type to FileSystem with the path C:\DemoUCD\DemoMicrosoft\package, then import all files from the package folder. (In UrbanCode we can pull artifacts from an SCM or build tool as well.)


2. I create a Component Process as below:




The steps are as follows:

(i) Step – Download Artifacts: This downloads the artifacts onto the target machine.


(ii) Step – Unzip: This step unzips the SQL files from the package into the folder temp in the working directory on the target.



(iii) Step – Move Directory: This step moves the SQL files from the SQL folder, located deep inside the temp folder, to the SQL folder in the working directory on the target.

(iv) Step – Execute SQL Scripts: Executes the SQL script to create the database. The connection string and driver jar location are defined in properties on the Resource Group (Step 3 of this section). Since I am using Windows authentication, for both User and Password I type none.



(v) Step – Execute SQL Scripts: Executes the SQL scripts to create the schema, load the data, and run the setup. The connection string and driver jar location are defined in properties on the Resource Group (Step 3). Since I am using Windows authentication, for both User and Password I type none.


(vi) Step – Update XML with XPath: Updates the XML file that holds the connection parameters with the variables defined in the Resource Group (Step 3). I used the following XPath Tester site to frame my XPath values.
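The XML update this step performs can be pictured in a few lines of Python. This is an illustrative sketch only: the SetParameters-style layout (`<parameters><setParameter name=".." value=".."/></parameters>`) and the parameter name used in the usage example are assumptions, not the plugin's implementation:

```python
# Illustrative sketch of an XPath-driven XML update: rewrite the 'value'
# attribute of one <setParameter> entry in a msdeploy SetParameters-style file.
import xml.etree.ElementTree as ET

def set_parameter(xml_path: str, name: str, value: str) -> None:
    tree = ET.parse(xml_path)
    # XPath relative to the root <parameters> element
    node = tree.getroot().find(f"./setParameter[@name='{name}']")
    if node is None:
        raise KeyError(f"parameter {name!r} not found in {xml_path}")
    node.set("value", value)
    tree.write(xml_path, encoding="utf-8", xml_declaration=True)
```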


(vii) Step – msdeploy: Runs msdeploy with the Parameters XML file that was updated in the previous step.



3. I created a Top Level Resource called Employee, added the agent running on my target machine to it, and then added the component Employee-component to it.


Now I want to define some target-specific variables in the Resource Group Employee.

In the Configuration of the Resource Group, select Resource Properties and enter the properties as shown below.


Name           Value
Data Source    C:\Program Files\IIS\Microsoft Web Deploy V3\
SQL Location   temp\Content\C_C\DemoUCD\DemoMicrosoft\example\Employee Info Starter Kit\C#\obj\Release\Package\PackageTmp\App_Data\SQL
               metadata="res://*/App_Logic.Entity Model.DatabaseContext.csdl|res://*/App_Logic.Entity Model.DatabaseContext.ssdl|res://*/App_Logic.Entity Model.DatabaseContext.msl";provider=System.Data.SqlClient;provider connection string="Data Source=PSM-TEST-01\SYSARCH;Initial Catalog=EmployeeInfo_SK_5_0;Integrated Security=True"
               C:\Program Files\Microsoft SQL Server JDBC Driver 3.0\sqljdbc_3.0\enu\sqljdbc4.jar

4. Create an application called DemoApp:Employee and add the component Employee-component to the application. Also create an application process as below.

5. Create a new environment DEV for the application and add the resource Employee to the environment.

6. We are now ready to run. Execute the application process on the environment; the results are:


You can now open IIS, refresh the Default Web Site, and see the EiskApp deployed.

Select EiskApp -> Manage Application -> Browse as shown below



There you go… the app is deployed on the target machine and is accessible as shown below.



Some environment issues which I stumbled across and solutions I applied:

1. Problem: While running the 'Execute SQL Scripts' step I initially got an error like the one below:

Error Executing SQL Scripts: The TCP/IP connection to the host PSM-TEST-01, port 1433 has failed. Error: “Connection refused: connect. Verify the connection properties, check that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port, and that no firewall is blocking TCP connections to the port.”.
Solution: Navigate to Microsoft SQL Server 2008 R2 -> SQL Server Configuration Manager -> SQL Server Network Configuration -> Protocols. Under TCP/IP, ensure the protocol is enabled, and under IPAll set TCP Port to 1433. Restart SQL Server for the changes to take effect.
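After applying the fix, a quick way to confirm the port is actually reachable before re-running the deployment is a plain TCP check; here is a small Python sketch (host and port are whatever your target machine uses):

```python
# Quick reachability check: can we open a TCP connection to SQL Server's port?
# A False result means the service is down, listening elsewhere, or firewalled.
import socket

def port_open(host: str, port: int = 1433, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```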

2. Problem: While running 'Execute SQL Scripts' I initially got an error like the one below:

[sql] Jan 21, 2015 1:48:45 PM <clinit>
[sql] WARNING: Failed to load the sqljdbc_auth.dll cause : sqljdbc_auth (Not found in java.library.path)
Error Executing SQL Scripts: This driver is not configured for integrated authentication. ClientConnectionId:6df9b998-0cd0-498d-bde4-8dc8c0ef7ac7

Solution: I copied sqljdbc_auth.dll from the location where I installed my JDBC driver (C:\Program Files\Microsoft JDBC Driver 4.1 for SQL Server\sqljdbc_4.1\enu\auth\x64) to C:\Windows.

3. Problem  & Solution:

4. Problem :

Solution: I added the Setup.sql file mentioned in Section 1, Step 2, sub-step (ii).


Part3: IBM UrbanCode Deploy & Oracle WebLogic Server Security Management

In Part1 and Part2 of this series we explored Oracle Application Deployment Server and Oracle Resource Management respectively; in this third blog we explore the Oracle WebLogic Security Management plugin. This plug-in provides steps to work with security configurations in WebLogic Server, such as configuring realms, role-mapping providers, global roles, and users and groups.

The goal here is to explore these various steps available in this plugin.

Before leveraging the plugin, I explored the following topics to understand how the same is done through the WebLogic Admin Console: Configure Realm, Configure Role Mapping Providers, Define Global Roles, and Define Users and Groups.

The Oracle WebLogic Security Management plugin comes with some example files to help users get started: extract the plugin into a folder and navigate to the extras folder, where you will find files you can leverage.

I have created a folder structure C:\DemoUCD\DemoWebLogic\security as below, which I will then supply to my UrbanCode Deploy component.


I place the example wlsMetadata.xml into the security folder without modifying it. The rest of the files are modified as below and placed in the security folder created above.

(Note: The wlsMetadata.xml in the extras folder of Oracle WebLogic Security Management was corrupt, so I picked the file from the extras folder of the Oracle Resource Management plugin.)
#Security Realm:
#Default Authenticator
#For many users, default role-mapper parameters will be sufficient.
#However, the plugin requires a value. An empty file (or this) will do.

<?xml version="1.0" encoding="UTF-8"?>
<Realm name="myrealm" RoleMapper="XACMLRoleMapper">
    <operation name="createRole">
        <param resourceId="" name="@rolename1@"/>
    </operation>
</Realm>

<?xml version="1.0" encoding="UTF-8"?>
<Realm name="@Realm@" AuthenticationProvider="@AuthenticationProviderName@">
  <operation name="createUser">
    <param name="@User1@" password="@UserPassword@" description="username1"/>
  </operation>
  <operation name="createGroup">
    <param name="@Group1@" description="group1 desc"/>
  </operation>
  <operation name="addMemberToGroup">
    <param group="@Group1@" member="@User1@"/>
  </operation>
</Realm>
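Since one of the shipped example files turned out to be corrupt, it is worth running a quick well-formedness check over the security folder before the deployment process consumes the files. A minimal Python sketch (the folder path is whatever you created above):

```python
# Illustrative sketch: report any XML file in a folder that fails to parse,
# so a corrupt example file is caught before the deployment process runs.
import pathlib
import xml.etree.ElementTree as ET

def find_bad_xml(folder: str) -> list:
    bad = []
    for path in pathlib.Path(folder).glob("*.xml"):
        try:
            ET.parse(path)
        except ET.ParseError:
            bad.append(path.name)
    return sorted(bad)
```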

I referred to the following WebLogic Server Configuration MBean documentation when creating the above property files. As you may have noticed, some of the values have '@' before and after them; these are tokens I replace at runtime using the 'Replace Token' step in UrbanCode Deploy. I define these properties on the component under environment properties, and the Replace Token step then uses those values for the replacement.
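The token-replacement mechanic is simple to picture. This is a rough Python equivalent of the idea, not the plugin's actual implementation, and the property names in the usage example are illustrative:

```python
# Rough sketch of token replacement: swap every @key@ for its property value,
# leaving unknown tokens untouched so missing properties are easy to spot.
import re

def replace_tokens(text: str, props: dict) -> str:
    return re.sub(r"@(\w+)@",
                  lambda m: str(props.get(m.group(1), m.group(0))),
                  text)
```

For example, `replace_tokens('<param name="@User1@"/>', {"User1": "jsmith"})` yields `<param name="jsmith"/>`, while a token with no matching property is left as-is.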

So let's get started in UrbanCode Deploy now…

I already have WebLogic setup as mentioned in Part1

1. I am going to use the same Resource Group WebLogic which I created in Part1 and extended further in Part2.

2. Create a component Component-Weblogic-Security setting the Source Configuration Type as File System and Base Path as C:\DemoUCD\DemoWebLogic\security


3. The artifacts are imported under Version 1 in Component-Weblogic-Security

4. Add a couple of  Environment Properties in Component-Weblogic-Security and set values as shown below


5. Create a Component Process  called TestSecurityFlow with steps as shown below


(WebLogic Server needs to be restarted after a security realm is added or removed, so before creating roles, users, and groups, I restart my WebLogic Server.)

6. The details to be filled in for each step are as follows:

Download Artifacts :


Replace Token :



Create or Update Realm :


Create Authentication :



Create Role Mapper :


Stop WebLogic Server :


Start WebLogic Server :


Sleep for 1 Minute :


Create Roles :


Create User and Groups :



7. Add the component Component-Weblogic-Security to the Resource Group WebLogic which I created in Part1 and extended further in Part2.


8. Add the component Component-Weblogic-Security to the application DemoApp:Weblogic which we created earlier in Part1.

9. Create an Application Process TestSecurity to install Component-Weblogic-Security, selecting the Component Process TestSecurityFlow as follows:

10. Before we run the process, I would like to check that the Environment Properties in Component-Weblogic-Security defined in Step 4 are available when we navigate to the application DemoApp:Weblogic -> Environments -> DEV (I am reusing the environment DEV created in Part1). One could edit the properties here if required.

11. I am all set to run: navigate to the application DemoApp:Weblogic, select the DEV environment, and run the Application Process TestSecurity, selecting the latest version. The process runs successfully as shown below


We can now go to WebLogic Server and check that the resources were created successfully, as below





Role Mapper


Roles :


Users :




Hope this three-part series helps you get started on your DevOps journey with UrbanCode Deploy and WebLogic!
