Delivery Pipeline deploying Node app to IBM Container binding to Bluemix Service

If you have stumbled on this post, I believe you are fairly acquainted with Docker and IBM Containers on Bluemix. The scenario I am penning here showcases how to build and deploy an application on IBM Containers in Bluemix through the Delivery Pipeline, and have it communicate with Bluemix services.

Now if you are familiar with Bluemix, you know that it comes with Boilerplates, which are basically applications (along with code) that can be deployed to Bluemix Cloud Foundry runtimes. Instead of writing a whole new app, I shall re-use one of these and show deployment to IBM Containers and connection to a Bluemix service. I decided to go ahead with the Node.js Cloudant Web Starter: I shall deploy the same app to an IBM Node container and have it communicate with the Bluemix Cloudant service.

1. From the Bluemix dashboard, create a web application using ‘Node.js Cloudant Web Starter’. I named mine ‘samplenodeCFapp’. I now have my CF app running.


2. To get the source code of this app, I click on ‘Add ToolChain’ (ToolChain is experimental at this stage, and as a Beta user I have access to it; for others it would be the ‘Add Git’ button). ToolChain is a really cool capability in Bluemix which plugs various DevOps tools together for project requirements. At this stage ToolChain has a handful of features, which are expected to grow. We shall explore ToolChains in another blog entry.

So having said that my ToolChain is created as below.


The fascinating part is that at the click of a button I have the whole DevOps toolchain plugged and wired, ready for use: the source code of the application set up in an SCM repository (GitHub), issue tracking set up, a web-based Eclipse Orion IDE for coding online, and a Delivery Pipeline which lets you build and deploy applications. Neat, ain’t it?

Let’s go and check out the Delivery Pipeline. We see that it is created as below (this basically deploys to CF; what we are interested in now is building the same for IBM Containers).


So what we need to add is the ability to run a Docker Build stage, which will create the Docker image, and then a Docker Deploy stage, which will deploy the container.

Build Docker:

I click on Add Stage, name my stage ‘Build Docker’, and in the Input tab select Input Type ‘SCM repository’ and Trigger ‘Run jobs whenever a change is pushed to Git’.

In the Jobs tab, I add a Build job and select Build Type ‘IBM Container Service’. I name my image ‘samplenodedockerapp’ and in Ports add 80, 443, 3000. I ensure the right Target/Organization/Space where I want to deploy my container is selected.

Screen Shot 2016-08-01 at 10.46.02 AM

Screen Shot 2016-08-01 at 10.44.18 AM

Deploy Docker:

I click on Add Stage and name my stage ‘Deploy Docker’.

In the Input tab, select Input type ‘Built Artifacts’, Stage ‘Build Docker’ (the Docker stage created earlier), Job ‘Build’, and Stage Trigger ‘Run jobs when the previous stage is completed’.

Screen Shot 2016-08-01 at 10.54.06 AM

In the Jobs tab, select Deployer Type ‘IBM Containers on Bluemix’, name ‘samplenodecontainer’, and ports ‘80, 443, 3000’. I ensure the right Target/Organization/Space where I want to deploy my container is selected.

Screen Shot 2016-08-01 at 10.53.02 AM

Environment Tab

Screen Shot 2016-08-01 at 10.54.16 AM

In the Environment tab, we can specify which application we want to bind to and the size of the container.

In order to bind services to a container, one can bind the services to a dummy CF app; the service credentials then get stored in its VCAP_SERVICES.

Since the services which I want to bind to this container are bound to the app ‘samplenodeCFapp’, I specify the same.

The app which we write basically parses VCAP_SERVICES to get information on the services.
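As a minimal sketch of that lookup (the ‘cloudantNoSQLDB’ label below is an assumption based on the usual Bluemix service name; check the actual label in your bound app's environment):

```javascript
// Hedged sketch: extract Cloudant credentials from a VCAP_SERVICES JSON string.
// The "cloudantNoSQLDB" key is an assumption; it depends on the service bound.
function getCloudantCredentials(vcapJson) {
  const vcap = JSON.parse(vcapJson || '{}');
  const entries = vcap.cloudantNoSQLDB || [];
  return entries.length > 0 ? entries[0].credentials : null;
}

// Inside the container, VCAP_SERVICES carries the bound app's credentials:
const creds = getCloudantCredentials(process.env.VCAP_SERVICES);
if (creds) {
  console.log('Cloudant URL found in VCAP_SERVICES');
}
```

When no matching service is bound, the function simply returns null, so the app can fail gracefully.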

I have done all the required configurations and the delivery pipeline is configured as below :

Screen Shot 2016-07-29 at 2.26.47 PM

Now one key piece which must be in place for the Build Docker stage is the Dockerfile, which is a text document that contains all the commands a user could call on the command line to assemble an image.

I go back to my ToolChain, open the Web Editor, and add a Dockerfile with content as below (the FROM line assumes the ibmnode base image in the US-South Bluemix registry; adjust the registry path to your region):

FROM registry.ng.bluemix.net/ibmnode:latest

ADD . /node
WORKDIR /node
RUN npm install

EXPOSE 80 443 3000

ENTRYPOINT ["node", "/node/app.js"]

On the Bluemix dashboard we have default IBM images like ibmnode, and we shall base our sample Node.js application on that. So in my Dockerfile what I am doing is pulling the default node image, copying my content to the /node folder, and then specifying the command to run as the ENTRYPOINT. I also mention the ports to expose.

My file is as below

Screen Shot 2016-07-29 at 2.36.01 PM

Commit and push this file by clicking on the ‘git’ icon on the left sidebar. The moment that is done, one can see that the delivery pipeline is activated and running.

Screen Shot 2016-07-29 at 2.40.54 PM

(One can ignore the first two stages; those deploy the CF app and were part of the boilerplate.)

Once the Build Docker stage passes, one can go to the Bluemix dashboard and click on ‘Start Containers’ to check that the image is now present.

Screen Shot 2016-07-29 at 4.21.26 PM

Once the Deploy Docker stage in the Delivery Pipeline completes, it creates an instance in the space specified:

IBM Container

Open your browser and enter http://&lt;public ip&gt;:3000, and your Node application is now accessible on the IBM Container.

So in this article we saw how to deploy a Node.js app to an IBM Container and have it communicate with the Cloudant DB service in Bluemix.




Static Code Analysis leveraging Sonarqube in IBM Bluemix DevOps Services

It’s been a while since I updated my blog, in fact a year! I know I should be more regular writing these technical articles; I hope this year has more updates from me. Anyway…

The topic today is Static Code Analysis. In many a customer engagement on DevOps on Bluemix, I have been asked how to do Static Code Analysis through the Delivery Pipeline on Bluemix. Well, this is not something which is available out of the box. I normally suggest Kiuwan, which my fellow IBMer Amano-san and I had explored and written an article about here, but most of the time customers are using SonarQube, which is quite popular, and would like to leverage it.

Before I get into SonarQube, a brief word on what Static Code Analysis is all about.

Static Code Analysis is a method of computer program debugging that is done by examining the code without executing the program. The process provides an understanding of the code structure, and can help to ensure that the code adheres to industry standards.
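As a tiny illustration of the kind of defect such tools catch without running the program (the snippet is my own, not from the project analyzed below): an assignment used where a comparison was intended, which a human eye easily misses but a static analyzer flags immediately.

```javascript
// Buggy on purpose: '=' assigns instead of comparing, so the condition is
// always the truthy string 'admin' and every caller passes the check.
// A static analyzer reports "assignment in conditional expression" here.
function isAdmin(user) {
  if (user.role = 'admin') {
    return true;
  }
  return false;
}

console.log(isAdmin({ role: 'guest' })); // logs true, even for a guest
```

The program runs without any error, which is exactly why this class of bug is found by analysis rather than execution.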

There is a wide range of tools which do Static Code Analysis, and this link gives a pretty good overview. SonarQube is one of these tools.

With that brief introduction , let’s get started.

First I needed a SonarQube server on the cloud so that my build in the Delivery Pipeline could talk to SonarQube. Having played around with Docker for a while, choosing a SonarQube image from Docker Hub was a natural choice. I picked version 4.5.7 of SonarQube.

The following command form lets one copy an image from Docker Hub to Bluemix IBM Containers directly:

cf ic cpi source_repository/source_image_name private_registry_URL/destination_image_name

I execute:

cf ic cpi sonarqube:4.5.7 registry.ng.bluemix.net/&lt;my_namespace&gt;/sonarqube:4.5.7

This command copies SonarQube version 4.5.7 from Docker Hub to Bluemix IBM Containers.

Now from the Bluemix dashboard I go ahead and create a container instance and assign a public IP to it. Voila, I now have my SonarQube server running and can access it at http://&lt;my ip&gt;:9000/. Docker indeed makes life a breeze; I shall write more on Docker, but that is a topic for another day.

Now my task was to showcase running Static Code Analysis from the Delivery Pipeline. I decided to leverage a Java Cloudant Web Starter boilerplate application in Bluemix for this.

Once the application is created on Bluemix, I click on the ‘Add Git’ button, which creates a project for me along with the Delivery Pipeline setup.

This project is built using Ant, hence I looked up the SonarQube scanner for Ant. I downloaded the required sonarqube-ant-task.jar file and placed it in lib/sonar as shown in the screenshot.

Screen Shot 2016-06-28 at 8.13.36 PM

Next, in the build.xml file, I need to make a couple of changes:

(i) Add the following properties (replace &lt;IP of Sonar Instance&gt; with the IP of your SonarQube instance):

<property name="sonar.home" value="lib/sonar"/>
<property name="sonar.projectKey" value="org.sonarqube:java-simple-ant" />
<property name="sonar.projectName" value="Simple Project for Ant" />
<property name="sonar.projectVersion" value="1.0" />
<property name="sonar.language" value="java" />
<property name="sonar.sources" value="src" />
<property name="sonar.binaries" value="bin" />
<property name="sonar.sourceEncoding" value="UTF-8" />

<property name="sonar.host.url" value="http://&lt;IP of Sonar Instance&gt;:9000" />
<property name="sonar.jdbc.url" value="jdbc:h2:tcp://&lt;IP of Sonar Instance&gt;:9092/sonar" />
<property name="sonar.jdbc.username" value="sonar" />
<property name="sonar.jdbc.password" value="sonar" />
<property name="ant-contrib.jar" value="lib/ant"/>

(ii) Add the classpath and then define the sonar target as below:

<path id="sonar.classpath">
<pathelement location="${sonar.home}"/>
<fileset dir="${sonar.home}">
<include name="*.jar"/>
</fileset>
</path>

<target name="sonar">
<taskdef uri="antlib:org.sonar.ant" resource="org/sonar/ant/antlib.xml" classpathref="sonar.classpath" />
<sonar:sonar xmlns:sonar="antlib:org.sonar.ant"/>
</target>

(iii) Make sure you change the target build to include sonar as below

<target name="build" depends="build-project,sonar,build-war"/>

With these changes, we are ready to go. And yes, once you make all these changes, you need to commit and push to the Git repo. Once that is done, we shall see that a build is automatically triggered and is successful.

Screen Shot 2016-06-28 at 8.37.54 PM

One can click on ‘View logs and history’ to check details of the run as below :

Screen Shot 2016-06-28 at 9.55.08 PM

I see the sonar static code analysis is being performed as part of the build and report is published to sonarqube as below.

Screen Shot 2016-06-28 at 9.57.03 PM

There is our Sonar report, pushed from the build to the SonarQube server! One can see that 355 lines of Java code were analyzed, and 5 critical issues, 38 major issues, etc. have been identified.

Screen Shot 2016-06-28 at 9.57.24 PM

One can click on an issue and drill down further into what the issue was, as shown below:

Screen Shot 2016-06-29 at 8.10.26 PM

I hope this helps you get started with SonarQube through the Delivery Pipeline in Bluemix.

Also as mentioned at start, one can also explore static code analysis via Kiuwan integration with DevOps.

Virtual Machines on IBM Bluemix

Hey, I am back, and I have just moved to a new team focusing on Bluemix, which is IBM’s PaaS. Bluemix enables organizations and developers to quickly and easily create, deploy, and manage applications on the cloud.

In this blog we explore VMs in Bluemix. IBM Bluemix provides Virtual Machines infrastructure as a beta feature that is currently available in the US-South region. You can use this infrastructure to create virtual machines running in public and private (on-premises) OpenStack clouds. To know more, one can go through the following document.

I have been working with customers who are evaluating this feature, and many of them have very similar questions and difficulties getting around. Hence I am putting up this blog, which details some of the common scenarios most folks were playing around with, to help others get started:

  1. Creation of VM Image
  2. Accessing the image created through ssh
  3. Adding a new user to that image and logging in as the new user
  4. Pushing artifacts to new VM using WinSCP
  5. UI access to VM images

1.  Creation of VM Image

(i) From Bluemix Dashboard , click on Virtual Machines and then Create VM Image


(ii) On the Create Virtual Machine screen there are multiple flavors to choose from. I decided to create an Ubuntu VM, so I select Ubuntu from ‘Image to launch’, add a VM group name, and then the security key.


To access this VM instance after it has been created, one needs an SSH key. There are multiple ways to create an SSH key; you could follow the steps in the Bluemix VM Image documentation. I did not use PuTTYgen; I used the Windows ssh-keygen command-line alternative found here.

On my Windows command prompt I ran the command below, and it prompted me for a passphrase. Enter any passphrase and remember it, as it will be used while logging in later.

ssh-keygen -t rsa -f cloud.key

This command generates a pair of keys:

  • a private key that you keep: cloud.key
  • a public key: cloud.key.pub

So I imported the public key by clicking on Add New Key, gave it a name, copied the contents of cloud.key.pub into the ‘Public Key to Import’ section, and I am now ready to click on the Create button.

(iii) The image is created in a minute or so and a Public as well as Private IP is assigned as shown below


2. Accessing the image created through ssh

(i) On the command prompt (in the location where the private SSH key is kept), I type ssh -i &lt;keyname&gt; ibmcloud@&lt;public IP&gt; (keyname is the private key generated in step 1, ibmcloud is the default user on all IBM VM images on Bluemix, and the public IP is the IP starting with 129 as shown in the screenshot above):

ssh -i cloud.key ibmcloud@129.xx.xx.xx

It asks for the passphrase; enter the passphrase supplied in Section 1, and voila, we are in.

3. Adding a new user to that image and logging in as the new user

Let’s add a new user now..

(i) To get root access

$sudo bash

(ii) To add a new user, run adduser &lt;username&gt;; it will ask you to set a password as well as a few other details:

adduser smith

(iii) Edit the sshd_config file to allow the newly added user to log in via SSH. Edit the AllowUsers line in sshd_config, add the user created above, and save the file:

vi /etc/ssh/sshd_config


(iv) Let’s restart the SSH service:

service ssh restart

(v) Now we are ready to log in as the new user. Open a new command prompt and enter ssh username@&lt;public IP&gt;; it will prompt for the password supplied while creating the user.

ssh smith@129.xx.xx.xx

We are now logged in as the new user.

4. Pushing artifacts to the new VM using WinSCP

We may need to transfer files to the VM, and WinSCP comes in handy here.

(i) The newly added user smith can easily log in through WinSCP by entering userid/password.


(ii) ibmcloud can also log in through WinSCP. Enter username ibmcloud, leave the password blank, click on the Advanced button, and on the SSH tab, under Authentication, enter the .ppk key.


If you created the key using OpenSSH, it needs to be converted into .ppk format; one can use PuTTYgen to do the conversion. (In PuTTYgen click on Conversions -> Import key, import the private cloud.key generated in step 1, and then save it as a private key in .ppk format.)

5. UI access to VM Image

Finally, our last scenario in this blog: UI access. There are multiple ways one could do this; here I am exploring xfce4.

(i) Login into the VM using the ssh key

ssh -i cloud.key ibmcloud@129.x.x.x

(ii) Login as superuser

sudo bash

(iii) Install the required packages. By default, most Linux server installations do not come with a graphical desktop environment, so you need to install one you can work with. In this example, we will install XFCE4:

apt-get install gnome-core xfce4 firefox
apt-get install vnc4server
apt-get install xfonts-base

(iv) To make sure that vnc server is installed, type this command:

dpkg -l | grep vnc

Output is as follows:


(v) Now my goal is to give VNC access to the user smith whom I created in Section 3:

su - smith

The output would be something like this; one needs to set a password, which will be used to access the VNC server.


(vi) I back up the xstartup file before making changes, in case I need it later, and then edit it:

cp ~/.vnc/xstartup ~/.vnc/xstartup.bak
vi ~/.vnc/xstartup

And I edit the file as below :


(vii) Next I stop the VNC server instance that is running on port 5901

vncserver -kill :1

(viii) Now, to easily control the VNC server, let us set it up as an Ubuntu service. Create the file below and paste the following contents into it:

vi /etc/init.d/vncserver

#!/bin/bash
unset VNCSERVERARGS
VNCSERVERS=""
[ -f /etc/vncserver/vncservers.conf ] && . /etc/vncserver/vncservers.conf
prog=$"VNC server"
start() {
 . /lib/lsb/init-functions
 REQ_USER=$2
 echo -n $"Starting $prog: "
 ulimit -S -c 0 >/dev/null 2>&1
 for display in ${VNCSERVERS}; do
  export USER="${display##*:}"
  if test -z "${REQ_USER}" -o "${REQ_USER}" == ${USER} ; then
   echo -n "${display} "
   DISP="${display%%:*}"
   export VNCUSERARGS="${VNCSERVERARGS[${DISP}]}"
   su ${USER} -c "cd ~${USER} && [ -f .vnc/passwd ] && vncserver :${DISP} ${VNCUSERARGS}"
  fi
 done
}
stop() {
 . /lib/lsb/init-functions
 REQ_USER=$2
 echo -n $"Shutting down VNCServer: "
 for display in ${VNCSERVERS}; do
  export USER="${display##*:}"
  if test -z "${REQ_USER}" -o "${REQ_USER}" == ${USER} ; then
   echo -n "${display} "
   su ${USER} -c "vncserver -kill :${display%%:*}" >/dev/null 2>&1
  fi
 done
 echo -e "\n"
 echo "VNCServer Stopped"
}
case "$1" in
start)
 start $@
 ;;
stop)
 stop $@
 ;;
restart|reload)
 stop $@
 sleep 3
 start $@
 ;;
condrestart)
 if [ -f /var/lock/subsys/vncserver ]; then
  stop $@
  sleep 3
  start $@
 fi
 ;;
status)
 status Xvnc
 ;;
*)
 echo $"Usage: $0 {start|stop|restart|condrestart|status}"
 exit 1
esac

(ix) To ensure that the VNC server will be able to use this new startup file properly, we’ll need to grant executable privileges to it:

chmod +x /etc/init.d/vncserver

(x) Now it’s time to start this service, which starts a new VNC server instance:

service vncserver start

(xi) In VNC you can define a separate access session for each particular user. For this we will use the VNC configuration file:

mkdir -p /etc/vncserver
vi /etc/vncserver/vncservers.conf

(xii) To configure VNC for the user “smith”, insert the following lines into the /etc/vncserver/vncservers.conf file:

VNCSERVERS="1:smith"
VNCSERVERARGS[1]="-geometry 1024x768"

Here display :1 maps to port 5901 (5900 plus the display number), and 1024×768 is the resolution for the VNC client; you can choose a resolution of your own.

(xiii) Next I will add it into boot startups:

update-rc.d vncserver defaults 99

(xiv) Finally reboot your server.


(xv) To test the VNC server, since I am on Windows, I downloaded VNC Viewer.

Now I am going to connect to the VNC server through my VNC client, and I am on my last step. (I added an entry for the hostname of the VM in the hosts file on my local machine, mapping it to its IP.)


and there you go… we have finally logged into the image!


Oh yes, I disabled the firewall in between (though I am not sure that was needed):

sudo ufw disable

Release Management Woes?

Release Management is the process of managing software releases from the development stage through to release. Releases can be at different levels: enterprise, line-of-business, or application.

Release Collaboration: Release managers need to bring together expertise from multiple application teams, testers, and operations. They coordinate work across departments, often with inconsistent process flows. Let us look at the various phases.

Release Planning is a manual, labor-intensive process. There has been widespread adoption of tools for defect management, testing, deployment, and so on, but release management in many organizations is still done using Excel. Yes, Excel is simple, and you don't really need to teach anyone how to use it. But as the number of applications participating in a release grows, the size of the spreadsheet grows too. I have seen customers with release plans in Excel of more than two thousand lines; now that is one mammoth plan! For an enterprise-level release with multiple teams participating, it gets extremely difficult to gather inputs from the different stakeholders. Also, while planning an enterprise release, release managers face the challenge that no standard process is followed across lines of business, and hence there is limited visibility into dependencies, which can lead to sub-optimal plans. Gaps and errors in plans cause unwanted production issues. The cost of planning the release is very high: multiple meetings are held to sort out and seek clarification on dependencies, a lot of email is exchanged, and the chance of things falling apart remains high. Additionally, the release manager needs to account for enterprise blackouts, holidays, and other releases, and referring to different tools is painful.

Release Implementation: the day of the release finally arrives (it is called the release weekend), and the office is bustling. Hundreds of people need to report for duty and be available round the clock, and there is a conference call running throughout, with people providing constant updates. How does an executor know when to start their task? Should they keep waiting for an email from the Duty Manager? How do they identify tasks which must run sequentially? Does the Duty Manager have real-time visibility into which tasks have not yet started, which tasks are running late, and their actual status? Many a time the Duty Manager gets updates for various tasks over email from the people executing them, and has to collate status from the various teams and send high-level status reports to stakeholders at predefined points. There is a lack of coordination and a lot of waste. Poor integration between systems also leads to rekeying of data and lack of visibility into release status. The key point is that real-time status of the release is missing. Another key aspect: troubleshooting issues becomes difficult because it is unclear which application versions are deployed to which environments.

Finally the release weekend is over, and the teams are faced with the task of Post-Release Tracking. Teams struggle with the inability to apply learning to future releases. For example, while running a task during release implementation, additional steps were required: how are those captured and leveraged in future releases?

We do have a solution to address these typical release management problems as part of our IBM DevOps solution. IBM UrbanCode Release is an intelligent collaborative release management solution that replaces error-prone manual spreadsheets and streamlines release activities for application and infrastructure changes. The key capabilities in UrbanCode Release are:

  1. Release orchestration – Changes across multiple applications across Release pipeline.
  2. Deployment Plans and Templates – Creation of deployment plan is easier and collaborative experience.
  3. Release Gates – Automatic promotion rules to speed up changes through pipeline
  4. Calendar View – Ability to schedule events such as holidays, blackout dates, and enterprise release dates.
  5. Environment Management – Easy identification of available environments and reservation
  6. Pipeline Visualization – Makes it easy to see which versions of each application are in each environment.
  7. Federated Dashboard – provides overview of progress across multiple Releases
  8. Impact Analysis – To view relationship between Application, changes(actual Work Items) and initiatives (projects) related to Release. (Integration with Rational Team Concert and JIRA)
  9. Integration with UrbanCode Deploy – Triggers automated tests in UrbanCode Deploy
  10. Real time status for long running production deployments

To know more about UrbanCode Release, here are some recommended videos.

Deploying a .Net application to IIS using UrbanCode Deploy via msdeploy

I want to showcase a sample .NET application deployment using IBM UrbanCode Deploy. I decided to pick the Employee-Info-Starter-Kit. Employee Info Starter Kit is an open-source ASP.NET project template that uses a single database table, ‘Employee’; this seemed like a perfect example. In this blog entry, I walk through a scenario where this .NET application gets developed and then deployed to IIS using the msdeploy plugin, and I also showcase how SQL scripts get deployed using the JDBC driver plugin of UrbanCode.

Local setup :

  • Visual Studio 2012
  • SQL Server 2014
  • IIS 7

Target Machine :

  • IIS 7
  • SQL Server 2008 R2

Server :


Section1: Sample application development in Visual Studio

1. I followed the instructions here to do the installation.

2. Clicking on Eisk.Web.csproj opens the project in Visual Studio, after which I made a couple of changes as listed below:

(i) I edited the connectionString line in web.config as below (basically changed Data Source to point to my local SQL Server (SMITH\SQLEXPRESS) and set Integrated Security=True, as I was using Windows authentication):

<add name="DatabaseContext" connectionString="metadata=&quot;res://*/App_Logic.Entity Model.DatabaseContext.csdl|res://*/App_Logic.Entity Model.DatabaseContext.ssdl|res://*/App_Logic.Entity Model.DatabaseContext.msl&quot;;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=SMITH\SQLEXPRESS;Initial Catalog=EmployeeInfo_SK_5_0;Integrated Security=True&quot;" providerName="System.Data.EntityClient" />

(ii) I also created a Setup.sql file in Visual Studio under C# -> App_Data -> SQL -> Extras with the content below.

-- =============================================
-- Script Template
-- =============================================

USE [EmployeeInfo_SK_5_0];

IF NOT EXISTS (SELECT * FROM master.dbo.syslogins WHERE loginname = 'NT AUTHORITY\NETWORK SERVICE')
    CREATE LOGIN [NT AUTHORITY\NETWORK SERVICE] FROM WINDOWS;
ELSE
    PRINT 'Login exists...no need to re-create!';

IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE (type = 'S' OR type = 'U') AND name = 'NT AUTHORITY\NETWORK SERVICE')
BEGIN
    CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE];
    EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE';
END
ELSE
    PRINT 'User exists...no need to re-create!';

(iii) Also, when running certain SQL files through UrbanCode Deploy, I got the error “Error Executing SQL Scripts: Incorrect syntax near ‘»’”. This character was not visible when I opened the files in Visual Studio or even Notepad; it is due to encoding. I opened each SQL file in Visual Studio, clicked on ‘File’ -> ‘Advanced Save Options’, and changed the encoding to Western European (Windows) - Codepage 1252. This solved the problem.

3. After making all the above changes, I select my project Eisk.Web and press Ctrl+F5.


I now see that my application is built as above and launched in browser


4. I change the ‘Your database server address’ field from localhost to smith\sqlexpress, which is my database address, and click on Test Connection.

The message “Connection passed. Click the Create Database button to create database” is displayed.


5. I click on ‘Create Database’ .

The message “Database created successfully. Click the ‘Install Schema and Data’ button to install database” is displayed.


6. I click on ‘Install Schema and Data’.

The message “Congratulations! Database installation successful. Click here to start using Employee Info Starter Kit” is displayed.


7. On clicking ‘here’, we now have the application up with data as below.


Now that I know it works fine, I want to create a deployment package so that I can deploy it on another machine.

Section2: Creating a Deployment Package from Visual Studio

1.  I select Eisk.Web which is my project and right click on  it and choose the Publish option

2. In the Publish Web window, under ‘Select a publish target’, I choose Custom and enter the Profile Name Profile1. I enter the details below under Connections:

  • Publish Method: Web Deploy Package
  • Package Location: C:\DemoUCD\DemoMicrosoft\package\
  • Site/Application: Default Web Site/EiskApp

Click Next

3. Under Settings I enter:

  • Configuration: Release – Any CPU
  • DatabaseContext : click on  button with “…” .

On the Destination Connection String

  • Enter Server Name : SMITH\SQLEXPRESS and click on Refresh
  • Click on Use Windows Authentication radio button
  • Under Select or enter a database name enter EmployeeInfo_SK_5_0
  • Click on Test Connection to validate if required.
  • If everything fine ,click on OK.
  • The DatabaseContext  is automatically set to SMITH\SQLEXPRESS;Initial Catalog=EmployeeInfo_SK_5_0;Integrated Security=True
  • Ensure that Use this connection string at runtime (update destination web.config) is Checked.
  • I kept ‘Update database’ unchecked. (One could have this option checked as well, but I wanted to showcase the SQL-JDBC plugin of IBM UrbanCode Deploy being used to run the SQL scripts.)

Click Next and click Publish

The output directory contains the files:



Section 3: Deployment using IBM UrbanCode Deploy

1. I create a component named Employee-component in UrbanCode Deploy, and for this scenario I set Source Config Type: FileSystem with path C:\DemoUCD\DemoMicrosoft\package, and import all files from the package folder. (In UrbanCode we can pull artifacts from an SCM or build tool as well.)


2. I create a Component Process as below :




The steps are as follows:

(i) Step – Download Artifacts  : This downloads artifacts on target machine


(ii) Step – Unzip  : This step unzips the sql files from package into folder temp in working directory on target.



(iii) Step – Move Directory  : This step moves the sql files from SQL folder located somewhere deep inside temp folder to SQL folder in working directory on target.

(iv) Step – Execute SQL Scripts: Executes the SQL script to create the database. The connection string and driver jar location are defined in properties on the Resource Group (Step 3 of this section). Since I am using Windows authentication, for both User and Password I type none.



(v) Step – Execute SQL Scripts: Executes the SQL scripts to create the schema, create the data, and also do setup. The connection string and driver jar location are defined in properties on the Resource Group (Step 3). Since I am using Windows authentication, for both User and Password I type none.


(vi) Step – Update XML with XPath: Updates the XML file which carries the connection parameters, using the variables defined in the Resource Group (Step 3). I used the following XPath Tester site to frame my XPath values.


(vii) Step – msdeploy: Runs msdeploy with the Parameters XML file which was updated in the previous step.



3. I created a top-level resource called Employee, added the agent running on my target machine to it, and added the component Employee-component to it.


Now I want to define some target-specific variables in the Resource Group Employee as below.

In the configuration of the Resource Group, select Resource properties and enter the properties as shown below.


Name: Data Source
Value: C:\Program Files\IIS\Microsoft Web Deploy V3\

Name: SQL Location
Value: temp\Content\C_C\DemoUCD\DemoMicrosoft\example\Employee Info Starter Kit\C#\obj\Release\Package\PackageTmp\App_Data\SQL

Name: (connection-string property)
Value: metadata="res://*/App_Logic.Entity Model.DatabaseContext.csdl|res://*/App_Logic.Entity Model.DatabaseContext.ssdl|res://*/App_Logic.Entity Model.DatabaseContext.msl";provider=System.Data.SqlClient;provider connection string="Data Source=PSM-TEST-01\SYSARCH;Initial Catalog=EmployeeInfo_SK_5_0;Integrated Security=True"

Name: (JDBC driver jar property)
Value: C:\Program Files\Microsoft SQL Server JDBC Driver 3.0\sqljdbc_3.0\enu\sqljdbc4.jar

4. Create an Application called DemoApp:Employee and add the component Employee-component to the Application. Also create an application process as below.

5. Create a new Environment DEV for the Application and add the Resource Employee to the Environment.

6. We are now ready to run. Execute the application process on the Environment; the results are:


You can now open IIS, refresh the Default Web Site, and see the EiskApp deployed.

Select EiskApp -> Manage Application -> Browse as shown below.



There you go… the app is deployed on the target machine and is accessible as shown below.



Some environment issues which I stumbled across and solutions I applied:

1.Problem :While  running  ‘Execute SQL Script’ step I initially got error like below

Error Executing SQL Scripts: The TCP/IP connection to the host PSM-TEST-01, port 1433 has failed. Error: “Connection refused: connect. Verify the connection properties, check that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port, and that no firewall is blocking TCP connections to the port.”.
Solution: Navigate to Microsoft SQL Server 2008 R2 -> SQL Server Configuration Manager -> SQL Server Network Configuration -> Protocols. Ensure TCP/IP is enabled and, under IPAll, set TCP Port to 1433. Restart SQL Server for the changes to take effect.
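Before re-running the step, a quick connectivity check can confirm that SQL Server is now listening. This is a hypothetical helper, not part of the UCD process; the hostname and port are from my setup, placeholders for yours:

```python
import socket

# Hypothetical check: verify that SQL Server accepts TCP connections
# before re-running 'Execute SQL Script'. Host and port are placeholders.
def is_port_open(host, port, timeout=2.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(is_port_open("PSM-TEST-01", 1433))
```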

2. Problem: While running 'Execute SQL Script', I initially got an error like the one below.

[sql] Jan 21, 2015 1:48:45 PM <clinit>
[sql] WARNING: Failed to load the sqljdbc_auth.dll cause : sqljdbc_auth (Not found in java.library.path)
Error Executing SQL Scripts: This driver is not configured for integrated authentication. ClientConnectionId:6df9b998-0cd0-498d-bde4-8dc8c0ef7ac7

Solution: I copied sqljdbc_auth.dll from the location where I installed my JDBC driver (C:\Program Files\Microsoft JDBC Driver 4.1 for SQL Server\sqljdbc_4.1\enu\auth\x64) to C:\Windows.
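To sanity-check this kind of fix, you can look for the DLL in the directories the JVM searches (on Windows, java.library.path is typically derived from PATH plus the Windows system directories). A hypothetical sketch, with a placeholder directory list:

```python
import os

# Hypothetical sketch: search a list of directories (stand-ins for the
# locations on java.library.path) for sqljdbc_auth.dll, to confirm the
# JDBC driver will be able to find it.
def find_dll(dll_name, directories):
    for d in directories:
        candidate = os.path.join(d, dll_name)
        if os.path.isfile(candidate):
            return candidate
    return None

print(find_dll("sqljdbc_auth.dll", [r"C:\Windows", r"C:\temp"]))
```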

3. Problem  & Solution:

4. Problem :

Solution : I added the Setup.sql file  mentioned in Section1 , Step 2 , sub-step(ii)


Part3: IBM UrbanCode Deploy & Oracle WebLogic Server Security Management

In Part1 and Part2 of this series we explored the Oracle Application Deployment Server and Oracle Resource Management plugins respectively; in this third blog we explore the Oracle WebLogic Security Management plugin. This plug-in provides steps to work with security configurations in WebLogic Server. You can perform the following tasks:

The goal here is to explore these various steps available in this plugin.

Before leveraging the plugin, I explored the following topics to understand how the same tasks are done through the WebLogic Admin Console: Configure Realm, Configure Role Mapping Providers, Define Global Roles, Define Users and Groups.

The Oracle WebLogic Security Management plugin comes with some example files to help users get started; extract the plugin into a folder and navigate to the extras folder, where you will find files you can leverage.

I created a folder structure C:\DemoUCD\DemoWebLogic\security as below, which I would then supply to my UrbanCode Deploy component.


I place the example wlsMetadata.xml unmodified into the security folder. The rest of the files are modified as below and placed in the security folder created above.

(Note: The wlsMetadata.xml in the extras folder of the Oracle WebLogic Security Management plugin was corrupt, so I picked the file from the extras folder of the Oracle Resource Management plugin.)
#Security Realm:
#Default Authenticator
#For many users, default role-mapper parameters will be sufficient.
#However, the plugin requires a value. An empty file (or this) will do.

<?xml version="1.0" encoding="UTF-8"?>
<Realm name="myrealm" RoleMapper="XACMLRoleMapper">
    <operation name="createRole">
        <param resourceId="" name="@rolename1@"/>
    </operation>
</Realm>

<?xml version="1.0" encoding="UTF-8"?>
<Realm name="@Realm@" AuthenticationProvider="@AuthenticationProviderName@">
  <operation name="createUser">
    <param name="@User1@" password="@UserPassword@" description="username1"/>
  </operation>
  <operation name="createGroup">
    <param name="@Group1@" description="group1 desc"/>
  </operation>
  <operation name="addMemberToGroup">
    <param group="@Group1@" member="@User1@"/>
  </operation>
</Realm>

I referred to the following WebLogic Server Configuration MBean documentation when creating the above property files. If you noticed, some of the properties have '@' before and after; these are values I replace at runtime using the 'Replace Token' step in UrbanCode Deploy. I define these properties in the Component under Environment Properties, and the Replace Token step then uses those values for replacement.
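To illustrate what the 'Replace Token' step does conceptually, here is a minimal sketch (the property names and values are made-up examples, not the actual UCD implementation):

```python
# Illustrative sketch of what the 'Replace Token' step does: each @token@
# in a file is replaced by a value from the component's environment
# properties. Names and values here are made-up examples.
def replace_tokens(text, properties, prefix="@", suffix="@"):
    for name, value in properties.items():
        text = text.replace(prefix + name + suffix, value)
    return text

template = '<param name="@User1@" password="@UserPassword@"/>'
props = {"User1": "testuser", "UserPassword": "secret"}
print(replace_tokens(template, props))
```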

So let’s get started in UrbanCode deploy now ..

I already have WebLogic setup as mentioned in Part1

1. I am going to use the same Resource Group WebLogic which I created in Part1 and extended further in Part2.

2. Create a component Component-Weblogic-Security setting the Source Configuration Type as File System and Base Path as C:\DemoUCD\DemoWebLogic\security


3. The artifacts are imported under Version 1 in Component-Weblogic-Security

4. Add a couple of  Environment Properties in Component-Weblogic-Security and set values as shown below


5. Create a Component Process  called TestSecurityFlow with steps as shown below


(WebLogic Server needs to be restarted after a Security Realm is added or removed, so before creating Roles, Users, and Groups, I restart my WebLogic Server.)

6. The details to be filled in for each step are as follows:

Download Artifacts :


Replace Token :



Create or Update Realm :


Create Authentication :



Create Role Mapper :


Stop WebLogic Server :


Start WebLogic Server :


Sleep for 1 Minute :


Create Roles :


Create User and Groups :



7. Add component Component-Weblogic-Security to the Resource Group WebLogic which I created in Part1 and extended further in Part2.


8. Add component Component-Weblogic-Security to the application DemoApp:Weblogic which we created earlier in Part1.

9. Create an Application Process TestSecurity to install Component-Weblogic-Security, selecting the Component Process TestSecurityFlow as follows:

10. Before we run the process, I would like to check that the Environment Properties in Component-Weblogic-Security defined in Step 4 are available when we navigate to application DemoApp:Weblogic -> Environments -> DEV (I am reusing the environment DEV created in Part1). One could edit the properties here if required.

11. I am all set to run: navigate to application DemoApp:Weblogic, select the DEV environment, and run the Application Process TestSecurity, selecting the latest version. The process runs successfully as shown below.


We can now go to the WebLogic Server console and check that the resources are created successfully, as below.





Role Mapper


Roles :


Users :




Hope this 3-part series helps you get started on your DevOps journey with UrbanCode Deploy and WebLogic!

Part2: IBM UrbanCode Deploy & Oracle WebLogic Server Resource Management

In Part1 I talked about the Application Deployment Server plugin; here in Part2 we discuss the next plugin, Resource Management. This plug-in provides steps to support automated deployment of various WebLogic Server resources.

The plugin has various steps which are listed in this link

My goal here is to perform the activities mentioned in this Oracle blog in an automated fashion, using IBM UrbanCode Deploy and some of the steps in this plugin. My main purpose is to showcase the usage of the plugin.

I already have WebLogic setup as mentioned in Part1.

The Resource Management plugin comes with some example files to help you get started; extract the plugin into a folder and navigate to the extras folder, where you will find the following:

  • wlsMetadata.xml, the Java beans configuration file provided as an example; it supplies the steps and connection information for accessing the WebLogic server.
  • A JMX_Properties folder containing a couple of property files. You can create JMX property files that define the objects, or use the example files provided in the plug-in.

I created a folder structure C:\DemoUCD\DemoWebLogic\resources as below, which I would then supply to my UrbanCode Deploy component.


I place the example wlsMetadata.xml unmodified into the resources folder. Inside JMX_Properties I have placed the files as below.
Notes:This server has some notes from the JMX Framework.
Notes:This note comes from the UrbanCode plugin.
Notes:Created from UCD
Notes:This is a note from the jmx framework.
Notes:Sample note here.

I referred to the following WebLogic Server Configuration MBean documentation when creating the above property files. If you noticed, some of the properties have '@' before and after; these are values I replace at runtime using the 'Replace Token' step in UrbanCode Deploy.

So let’s get started in UrbanCode deploy now ..

1. I am going to use the same Resource Group WebLogic which I created in Part1 of this series. I define a couple of properties in the Resource Group which I use commonly in my component steps:

  • jmx-classpath: C:\Oracle\Middleware\Oracle_Home\wlserver\server\lib\wljmxclient.jar     (path to wljmxclient.jar file)
  • weblogic.admin: weblogic
  • weblogic.hostname: <hostname of your weblogic server>
  • weblogic.port : 7001


2. Create a component Component-Weblogic-Resource  setting the Source Configuration Type as File System and Base Path as C:\DemoUCD\DemoWebLogic\resources


3. The artifacts are imported under Version 1 in Component-Weblogic-Resource

4. I also add a couple of Environment Properties in Component-Weblogic-Resource and set values as shown below.


5. Create a Component Process  called Test Flow with steps as shown below


The details to be filled in for each step are as follows:

Download Artifacts


Replace Token


JMS Server






Connection Factory


6. Add component Component-Weblogic-Resource to the Resource Group WebLogic which we discussed in Step 1.


7. Add component Component-Weblogic-Resource to application DemoApp:Weblogic which we had created earlier in blog Part1.

8. Create an Application Process Test to install Component-Weblogic-Resource, selecting the Component Process Test Flow as follows:


9. Before we run the process, I would like to check that the Environment Properties in Component-Weblogic-Resource defined in Step 4 are available when we navigate to application DemoApp:Weblogic -> Environments -> DEV (I am reusing the environment DEV created in Part1). One could edit the properties here if required.


10. I am all set to run: navigate to application DemoApp:Weblogic, select the DEV environment, and run the Application Process Test, selecting the latest version. The process runs successfully as shown below.


We can now go to the WebLogic Server console and check that the resources are created successfully, as below.

1.JMS Server




3. JMS Subdeployment


4. ConnectionFactory and  Queue


In Part3 of this series we explore the next WebLogic plugin  Oracle WebLogic Security Management.

Part1: IBM UrbanCode Deploy & Oracle WebLogic Application Deployment Server plugin

I have been thinking about writing on the usage of the IBM UrbanCode Deploy Oracle WebLogic plugins for a while now, and finally it's here this New Year! Oracle WebLogic Server is an application server for building and deploying enterprise Java EE applications.

UrbanCode Deploy has a couple of plugins to automate activities carried out on WebLogic Server. In Part1 of this series I talk about the Application Deployment Server plugin; Part2 covers the Resource Management plugin, and Part3 covers Oracle WebLogic Security Management.

The WebLogic Application Server plug-in includes steps that manage Oracle WebLogic Server, including tasks related to installing and maintaining applications. The process steps in the plugin are as follows:

I was a newbie to WebLogic, so before I could try out the WebLogic UrbanCode plugin I first needed a basic understanding of WebLogic. I explored a couple of sites/videos to get an idea, and here is what I suggest going through: WebLogic Architecture and Installing WebLogic. I created domains, managed servers, etc. as required, following the instructions.

  • To start WebLogic : C:\Oracle\Middleware\Oracle_Home\user_projects\domains\base_domain\startWebLogic.cmd
  • I then run : C:\Oracle\Middleware\Oracle_Home\wlserver\server\bin\
  • Finally I start Node Manager :  C:\Oracle\Middleware\Oracle_Home\user_projects\domains\base_domain\bin\startNodemanager

Once the WebLogic server is started, you can access the console at http://localhost:7001/console/login/LoginForm.jsp and navigate to base_domain -> Environment -> Servers.

I have a Managed Server MS4 running on port 7004  in WebLogic as shown below which I would be using for deployments


Now that the server is up and running, I followed the instructions on this site to get acquainted with deploying a war file on WebLogic Server.

Installing the application :

Next, my goal was to achieve the same process through UrbanCode Deploy, so let's get started. For this demo scenario, I am using the Source Configuration Type File System (Versioned).

I downloaded the sample benefits.war file (Version1 and Version2) and placed the versions on the file system at C:\DemoUCD\DemoWebLogic\benefits under folders 1.0 and 2.0 respectively, as shown below.


On the UrbanCode Deploy front I do the following:

1. Create a component Component-Weblogic-WAR to house the war file, with the Source Config Type set to File System (Versioned) and Import Versions Automatically checked.


The two versions were successfully imported into UrbanCode Deploy.

2. Create a Component Process Deploy for Component-Weblogic-WAR with two steps: a Download Artifacts step (with default settings) and a Deploy step (picked from the WebLogic plugin).

For the Deploy step, enter the following:

  • Deployment Name: benefits
  • AdminURL: t3://<hostnameofweblogicserver:port>
  • User Name: weblogic (or the username you set while installing WebLogic)
  • Password: the password for your WebLogic server
  • Target: the name of your target server
  • Source: benefits.war
  • Deployment Classpath: leave it as is; we shall create a property in the Resource later.

The details are shown in screenshot below :
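For reference, the same deployment this step performs can also be driven from the command line with WebLogic's weblogic.Deployer tool. The sketch below merely assembles that command; the weblogic.jar path, hostname, and password are placeholders from my setup:

```python
# Sketch only: assemble the equivalent weblogic.Deployer command line.
# The weblogic.jar path, hostname, and password are placeholders.
def deployer_command(admin_host, port, user, password, name, target, source):
    return [
        "java", "-cp",
        r"C:\Oracle\Middleware\Oracle_Home\wlserver\server\lib\weblogic.jar",
        "weblogic.Deployer",
        "-adminurl", "t3://%s:%d" % (admin_host, port),
        "-username", user,
        "-password", password,
        "-deploy",
        "-name", name,
        "-targets", target,
        "-source", source,
    ]

cmd = deployer_command("localhost", 7001, "weblogic", "<password>",
                       "benefits", "MS4", "benefits.war")
print(" ".join(cmd))
```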


3. Do the rest of the usual UrbanCode Deploy setup: create the application DemoApp:Weblogic, add Component-Weblogic-WAR to it, and create an Application Process Deploy that deploys Component-Weblogic-WAR using the Component Process Deploy.

4. Create a Resource Group WebLogic and add the agent along with Component-Weblogic-WAR, as shown below.


5. In the Resource Group WebLogic, create a property deployer-classpath and set it to the path of weblogic.jar (C:\Oracle\Middleware\Oracle_Home\wlserver\server\lib\weblogic.jar), as shown in the screenshot below.


6. In the application DemoApp:Weblogic, create an environment DEV and add the Resource Group WebLogic to it.

7. We are now ready to deploy. While deploying, select Version 1 of Component-Weblogic-WAR. The deployment is successful, as shown below:


8. Log in to http://localhost:7001/console/login/LoginForm.jsp and verify by navigating to base_domain -> Deployments. As shown below, benefits is deployed on server MS4 and is in the Active state.


9. Navigate to and access the application at http://localhost:7004/benefits/


Updating the Application:

Now we will update the application using Version 2 of the file. In Version 2, the web application's deployment descriptors have been modified to use a different URL context path.

Run the application deployment again, this time selecting Version 2 of Component-Weblogic-WAR. The process completes successfully.


The application is now accessible at http://localhost:7004/ (it is no longer accessible at http://localhost:7004/benefits/).

Stopping and Deleting the Application :

Navigate to the component Component-Weblogic-WAR and create a Component Process called undeploy as follows:


Also create an Application Process Undeploy, selecting the component Component-Weblogic-WAR and the undeploy Component Process, and run this process.


Voilà... the application stops and is undeployed successfully, and is no longer available on the WebLogic Server.


We have successfully explored WebLogic Application Deployment Plugin. Hope you found this useful !

I have used the following versions of products to demo the above scenario :

  • IBM UrbanCode Deploy Server v 6.1.1
  • Oracle WebLogic Application Deployment Plugin version 4.596018
  • Oracle WebLogic Server 12c (12.1.3)

Deploying to WebSphere Portal Server using IBM UrbanCode Deploy

IBM UrbanCode Deploy has a WebSphere Portal plugin which I wanted to evaluate. This plugin provides steps that enable you to automate the deployment of artifacts and the running of commands. The steps this plugin provides are as follows:
* Deploy Portal WAR
* Deploy Theme
* Execute Config Engine command
* Install PAA
* Invoke XMLAccess script
The plug-in works with WebSphere Portal versions 6.1, 6.1.5, 7.0, and 8.0. Both ND and stand-alone topologies are supported.
I am a newbie to Portal Server and needed a use-case to try this out. I picked up the flow depicted in this developerWorks article to get started; the article also has sample files (a sample.war as well as a sample.xml), so I had artifacts ready for deployment.

So let me walk you through the process I followed.

Part 1: Initial Setup:

1. I have installed the UrbanCode Deploy server, and I have an UrbanCode Deploy agent on the system that hosts my Portal Server.
2. On my UrbanCode Deploy server, I have imported the WebSphere Portal plug-in and the Application Deployment for WebSphere® plug-in, which are required for the workflow.

Part 2: Creating the component and application:

1. In UrbanCode Deploy, create a new component by clicking Components -> New Component; select the template Middleware Configuration for WebSphere and a Source Config Type as appropriate. I created a component "Component-WAS" which imports the artifacts from the file system. Ideally you need one component for each resource to deploy, but here I have placed them all together.

You can select Import Versions Automatically, or you can run a manual import.


2. Add a new application by navigating to Applications and clicking Create New Application, then associate the component created in Step 1 with the application. I created an application named "Demo-WAS" and added the component "Component-WAS" to it.

Part 3: Create process diagrams for the component and application

1. Create a component process by navigating to Component -> <component-name> -> Processes and clicking "Create New Process".
2. In the process designer, design a workflow like the one below for deploying the theme war file, as mentioned in this developerWorks article.



The configuration for the key steps is as follows:


3. Create an application process by clicking on the Processes tab of the application.
4. Add a deploy process, and drag and drop the Install Component step onto the process editor.
Select the component created earlier and choose the process created for that component.
For example, you would choose the component process to deploy the theme, created in the earlier section, and then save the process.

Part 4: Setup required for WebSphere auto-discovery and auto-configure :

If you have installed WebSphere in a non-default location, you need to add wsadmin.path in UrbanCode Deploy; otherwise you can skip the first three steps. In my case, I had installed WebSphere Portal Server in C:\Program Files (x86)\IBM\WebSphere, hence I had to add this variable as shown in Steps 1-3 below.

In UrbanCode Deploy :

1. Navigate to Resources -> Agents -> <agent-name> (the agent installed on the host where the Portal Server resides) -> Configuration -> Agent Properties.
2. Add a property named wsadmin.path.
3. Set wsadmin.path to the fully qualified path to the wsadmin script (including the script name).


Next we need to add resource group

4. Navigate to Resources -> Resources, click "Create Top-Level Group", enter a name, and save.
5. Hover over the row for the resource group added in the step above, click Actions, and select Add Agent.
6. Wait 10 to 30 seconds, then click Refresh. A twisty now appears next to the agent. When you expand it, there is a sub-resource cell, WebSphereCell.

In case you don't see the twisty and the sub-resource cell, check the UrbanCode Deploy discovery logs for your agent in the following location:

<server install dir>/logs/autoDiscovery/

Mine were at C:\Program Files\ibm-ucd\server\logs\autoDiscovery\<agent-name> WebSphere Discovery. Logs for a successful discovery appear as shown in the screenshot below.

**Auto-discovery steps are initiated when an agent resource is added to the resource tree.


7. Hover over the WebSphereCell row, then click Edit. Enter values for the following properties:

  •  WebSphere Profile Path (value I supplied: C:\Program Files (x86)\IBM\WebSphere\wp_profile\bin\)
  •  WebSphere User (value I supplied: admin)
  •  WebSphere Password (value I supplied: the admin's password, of course 🙂)
  •  Leave the WebSphere Cell Name blank and click Save.


8. Next we need to set the auto-configure options for WebSphereCell.
Hover over the row for WebSphereCell, click Actions, then click Auto Configure.
Click "No auto configure for resource".
Check the WebSphere Topology Discovery box, click OK, and then click Save.

9. Wait 30-60 seconds, then click Refresh. A twisty now appears next to WebSphereCell. Expand it and make sure the resource tree matches your WebSphere Application Server topology.
On the WebSphereCell entry, click Edit and check that the Cell Name has been filled in and is correct.

In case auto-configure is not working for you, check the logs at

<server install dir>/logs/autoConfigure/

Mine were at C:\Program Files\ibm-ucd\server\logs\autoConfigure\WebSphereCell\WebSphere Topology Discovery.
If auto-configure has worked successfully, the logs will look something like this:


10. Add the component which was created initially to the resource tree as follows:


Part 5: Create environment

1. Navigate to Application -> <application-name>, click Create New Environment, enter an environment name, and save.
2. Go to the environment created in the step above, click Add Resources, and add the resource group which was added in Part 4.

Part 6: Ready to run


Once the process runs, you can try out the new theme (follow the last section, "Testing the new theme", in the developerWorks article).

Part 7: Additional use-case: deploying a war file to the Portal Server

If you want to use the deploy war file step in the Portal plugin and you are running on a Windows machine, you will need to tweak the plugin so that it can both install a new portlet and update an existing one.

1. File to be changed: deploy.groovy

Change 1:
Replace this line :

webapp.appendNode("url", "file://localhost" + workDirString + war)//TODO for windows

with the following

if (isWindows) {
    // replaceAll returns a new string, so assign the result back
    workDirString = workDirString.replaceAll("\\\\", "/")
    webapp.appendNode("url", "file://localhost/" + workDirString + war)
} else {
    webapp.appendNode("url", "file://localhost" + workDirString + war)
}


Change 2: I replaced host = System.getenv("HOSTNAME") with the name of the host machine.


Change 3: I changed def result = outputLines[2] to def result = outputLines[4].

Hope this helps if you are trying out the WebSphere Portal plugin.

Part2: Upgrading the IBM UrbanCode plugin with additional functionality

After the successful creation of an IBM UrbanCode plugin to create a database in Postgres in the article Part1: Creating a simple IBM UrbanCode Deploy Plugin "Postgres-Create Database", I wanted to extend the functionality of this plugin to cater to additional Postgres commands:

  • Drop database
  • Backup database
  • Restore database

Here is how this is achieved.
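Under the covers, each new step wraps a standard Postgres command-line tool. As a rough sketch of the commands the Groovy scripts will invoke (database name, user, and file names are placeholders, and the exact flags in the real scripts may differ):

```python
# Rough sketch of the standard Postgres command-line tools the new plugin
# steps would wrap; database name, user, and file names are placeholders.
def drop_cmd(db, user):
    return ["dropdb", "-U", user, db]

def backup_cmd(db, user, outfile):
    # custom-format archive (-Fc), restorable with pg_restore
    return ["pg_dump", "-U", user, "-Fc", "-f", outfile, db]

def restore_cmd(db, user, infile):
    return ["pg_restore", "-U", user, "-d", db, infile]

print(backup_cmd("mydb", "postgres", "mydb.dump"))
```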

Part 1:  Update the plugin.xml

(i) I updated the version in the plugin.xml created previously to version 2 (line 5), as shown in the image below.


(ii) I added steps to drop a database in plugin.xml, as shown below.


(iii) I added steps to back up the database in plugin.xml, as shown below.


(iv) I added steps to restore a database in plugin.xml, as shown below.


Part 2: Create Groovy scripts for the new steps

Next, I created Groovy scripts for the additional steps added to plugin.xml in Part 1 above.

(i) Created dropdb.groovy script


(ii) Created backupdb.groovy script


(iii)  Created restoredb.groovy script


Part 3: Update the upgrade.xml

Next, I added lines 6-11 to the upgrade.xml file to migrate the existing Postgres plugin to the new version and expose the new commands added in Part 1.


Part 4: Package the files and import into IBM Urbancode Deploy

With all the required files ready, I packaged them in a zip file and imported it into IBM UrbanCode Deploy as described in my previous article.
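Packaging is just zipping the plugin files at the root of the archive. A small sketch of that step (the file names are examples mirroring this plugin's layout):

```python
import os
import zipfile

# Sketch of the packaging step: zip the plugin files at the root of the
# archive for import into UrbanCode Deploy. File names are examples.
def package_plugin(zip_path, files):
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            zf.write(f, arcname=os.path.basename(f))
    return zip_path

# Example call (assumes the files exist in the current directory):
# package_plugin("postgres-plugin-v2.zip",
#                ["plugin.xml", "upgrade.xml", "info.xml",
#                 "dropdb.groovy", "backupdb.groovy", "restoredb.groovy"])
```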


Now these commands are available within the Process Designer, ready to be used, as shown below.


and the version is upgraded to 2.

Upgraded Version
