Monday, December 30, 2013

Release Testing: Verifying that the stage files are signed and verifiable

When testing a release candidate, you need to verify that the stage files are signed and that the signatures are verifiable (MD5, SHA, and ASC verification). This can be a tedious task if not automated.

This guide uses the Apache Airavata project to demonstrate the examples.

  • Create a directory and download all the files you need to test (.zip, .tar.gz, .asc, .sha, .md5, etc.) into it. ex: all the files from the Apache Airavata 0.11-RC3 staging area.
  • SHA signature validation: create a bash script in the same folder and copy the code below
     for file in `find . -type f -not -name '*.asc' -not -name '*.md5' -not -name '*.sh' -not -name '*.sha'`
     do
           echo "testing : ${file}.sha"
           gpg --print-md SHA512 ${file} | diff - "${file}.sha"
     done
  • This code will verify the SHA signatures.
  • MD5 validation: use the same loop, with
           echo "testing : ${file}.md5"
           gpg --print-md MD5 ${file} | diff - "${file}.md5"
  • ASC validation: verify each detached signature against its file
          gpg --verify "${file}.asc" "${file}"
  • You can have all these validations in one bash script if required. 

    Saturday, December 7, 2013

    Apache Airavata: How to start contributing

    Since Apache Airavata is a mature project, you need to understand what is happening underneath before diving into contributing (this guide is for beginners).

    I suggest the following steps to get a better understanding of what is happening.

    • Subscribe to the mailing lists (the dev and user lists).
    • Download the Apache Airavata binaries (or build from source).
      Then try the 5- and 10-minute tutorials to get a better understanding of what Airavata does.
    • Then read the Airavata wiki.
      The wiki is written by the contributors of Airavata. It's the one place where you can get the most information on Apache Airavata.
      • First, I would recommend reading about the components of Airavata.
        This takes Airavata apart and describes what each part does individually as a component. Airavata is the integration of all these components.
      • Next, the Airavata developer guide is essential, as it provides information directed specifically at developers. There you can get some understanding of how the components are integrated.
    • The following research paper provides a deep insight into the Airavata core:
      • Yi Huang, Aleksander Slominski, Chathura Herath, and
        Dennis Gannon, "WS-Messenger: A Web Services-based Messaging System for Service-Oriented Grid Computing"

        More related research papers are listed here

    Monday, September 9, 2013

    Learning curve generator for Learning Models in Python and scikit-learn

    This particular program draws the learning curve for the Gaussian Naive Bayes model. But the function is generic: it can generate the learning curve for any model once the model and the data are provided.
    I have used the scikit-learn library and the "digits" data set for the calculations.
    It uses a simple RMS error to draw the plot.
    Hope this helps.
    Note: %pylab inline will only work if you are using IPython. If you are not, import the numpy and matplotlib.pyplot modules yourself.
    %pylab inline
    from sklearn.naive_bayes import GaussianNB
    from sklearn.datasets import load_digits
    from sklearn import cross_validation
    #loading the digits dataset
    digits = load_digits()
    #separating data sets for cross validation
    data_train, data_test, target_train, target_test = cross_validation.train_test_split(
   , digits.target, test_size=0.20, random_state=42)
    #assigning the Gaussian Naive Bayes model
    clf = GaussianNB()
    #compute the rms error
    def compute_error(x, y, model):
        yfit = model.predict(x)
        return np.sqrt(np.mean((y - yfit) ** 2))
    def drawLearningCurve(model):
        sizes = np.linspace(2, 200, 50).astype(int)
        train_error = np.zeros(sizes.shape)
        crossval_error = np.zeros(sizes.shape)
        for i, size in enumerate(sizes):
            #fit the model on the first `size` training samples
  [:size, :], target_train[:size])
            #compute the validation error on the held-out set
            crossval_error[i] = compute_error(data_test, target_test, model)
            #compute the training error on the samples used for fitting
            train_error[i] = compute_error(data_train[:size, :], target_train[:size], model)
        #draw the plot
        fig, ax = plt.subplots()
        ax.plot(sizes, crossval_error, lw=2, label='cross validation error')
        ax.plot(sizes, train_error, lw=2, label='training error')
        ax.set_xlabel('training set size')
        ax.set_ylabel('rms error')
        ax.legend(loc=0)
        ax.set_title('Learning Curve')
    drawLearningCurve(clf)

    Tuesday, August 13, 2013

    Binary Search Tree with In Order Traversal implementation in Java

    This is a complete class you can use out of the box in your applications.
    The Node class can be modified to suit your needs.

    class BinaryTreeSearch {
        public enum State {
            Visited, Unvisited, Visiting;
        }

        //this is the Node used in the tree
        static class Node {
            private int data;
            private Node left;
            private Node right;

            public Node(int data) {
       = data;
                left = null;
                right = null;
            }
            public void setLeft(Node left) {
                this.left = left;
            }
            public void setRight(Node right) {
                this.right = right;
            }
            public Node getLeft() {
                return this.left;
            }
            public Node getRight() {
                return this.right;
            }
            public int getData() {
            }
            public boolean equals(Node n) {
                if ( == n.getData()) return true;
                return false;
            }
        }

        public static void main(String[] args) {
            BinaryTreeSearch bts = new BinaryTreeSearch();
        }

        //execute the test case
        public void run() {
            Node root = new Node(10);
            insert(root, new Node(20));
            insert(root, new Node(5));
            insert(root, new Node(4));
            insert(root, new Node(5));
            insert(root, new Node(15));
            inOrderTraverse(root);
            System.out.println("\n" + binarySearch(root, new Node(10)));
        }

        //insert a node into the binary search tree
        public void insert(Node root, Node n) {
            if (root == null || n == null) return;
            if (root.getData() > n.getData()) {
                if (root.getLeft() == null) {
                    root.setLeft(n);
                    System.out.println("Added node to left of " + root.getData() + " of value " + n.getData());
                } else {
                    insert(root.getLeft(), n);
                }
            } else if (root.getData() < n.getData()) {
                if (root.getRight() == null) {
                    root.setRight(n);
                    System.out.println("Added node to Right of " + root.getData() + " of value " + n.getData());
                } else {
                    insert(root.getRight(), n);
                }
            }
            //duplicate values are ignored
        }

        //in-order traversal
        public void inOrderTraverse(Node root) {
            if (root != null) {
                inOrderTraverse(root.getLeft());
                System.out.print("  " + root.getData());
                inOrderTraverse(root.getRight());
            }
        }

        //binary search
        public boolean binarySearch(Node root, Node n) {
            if (root == null || n == null) {
                return false;
            }
            System.out.println("  Testing out " + root.getData() + " for value " + n.getData());
            if (root.getData() > n.getData()) {
                return binarySearch(root.getLeft(), n);
            } else if (root.getData() < n.getData()) {
                return binarySearch(root.getRight(), n);
            }
            return true;
        }
    }