Kohei Nozaki's blog 

Entries tagged [ant]

Using DBUnit from Ant


Posted on Wednesday Sep 23, 2015 at 12:02PM in Technology


Using DBUnit from JUnit test cases is a common use case, but sometimes I need to use DBUnit in a standalone manner to accomplish tasks such as exporting data from a database to an XML file, for example to create test data from an existing table. In such cases, I find the DBUnit Ant task more convenient than launching DBUnit from a Maven plugin or writing a standalone Java application against the DBUnit API. In this entry, I introduce a complete working example of an Ant script that uses the DBUnit Ant task.

Environment

Complete JAR file list in my ~/.ant/lib:

$ ls -l ~/.ant/lib
total 1236
-rw-rw-r--. 1 kyle kyle 560755 Sep 23 10:56 dbunit-2.5.1.jar
-rw-rw-r--. 1 kyle kyle 660126 Sep 23 10:54 postgresql-9.4-1203.jdbc42.jar
-rw-rw-r--. 1 kyle kyle  32127 Sep 23 11:15 slf4j-api-1.7.12.jar
-rw-rw-r--. 1 kyle kyle   7892 Sep 23 11:16 slf4j-jdk14-1.7.12.jar
$

Exporting example

build.xml

This script exports the result of a SQL query to a flat XML file. The format can be specified with the dbunit.format property, which defaults to flat. You can add additional query or table elements inside the dbunit element to include multiple datasets in a single XML file. JDBC connection information such as credentials and the URL is expected to be stored in a separate file named dbunit.properties, placed in the same directory as this file.

<project name="dbunit-export" basedir=".">
    <taskdef name="dbunit" classname="org.dbunit.ant.DbUnitTask"/>

    <loadproperties srcFile="dbunit.properties"/>

    <fail unless="dbunit.driver"/>
    <fail unless="dbunit.url"/>
    <fail unless="dbunit.userid"/>
    <fail unless="dbunit.password"/>
    <fail unless="dbunit.datatypeFactory"/>

    <property name="dbunit.format" value="flat"/> <!-- Possible values are "flat", "xml", "csv", "dtd", "xls". Defaults to "flat" -->

    <target name="export">
        <fail unless="dbunit.sql"/>
        <fail unless="dbunit.dest"/>
        <fail unless="dbunit.format"/>
        <fail unless="dbunit.query"/>

        <dbunit driver="${dbunit.driver}" url="${dbunit.url}" userid="${dbunit.userid}" password="${dbunit.password}">
            <dbconfig>
                <property name="datatypeFactory" value="${dbunit.datatypeFactory}" />
            </dbconfig>
            <export dest="${dbunit.dest}" format="${dbunit.format}">
                <query name="${dbunit.query}" sql="${dbunit.sql}"/>
            </export>
        </dbunit>
    </target>
</project>

dbunit.properties

Set each property to suit your environment. See this page to check which datatypeFactory implementations are available.

dbunit.driver=org.postgresql.Driver
dbunit.url=jdbc:postgresql://localhost:5432/mydb
dbunit.userid=someone
dbunit.password=somepass
dbunit.datatypeFactory=org.dbunit.ext.postgresql.PostgresqlDataTypeFactory

How to run

The following command exports an XML file from a SQL query.

$ ant export "-Ddbunit.sql=select * from job_instance where jobinstanceid >= 399341 order by jobinstanceid" -Ddbunit.query=job_instance -Ddbunit.dest=job_instance.xml
Buildfile: /home/kyle/dbunit-ant-example/build.xml

export:
   [dbunit] Executing export:
   [dbunit]       in format: flat to datafile: /home/kyle/dbunit-ant-example/job_instance.xml
Successfully wrote file '/home/kyle/dbunit-ant-example/job_instance.xml'

BUILD SUCCESSFUL
Total time: 0 seconds

This produces a flat XML dataset named job_instance.xml that looks something like this:

<?xml version='1.0' encoding='UTF-8'?>
<dataset>
  <job_instance jobinstanceid="399341" jobname="somejob1" applicationname="someapp"/>
  <job_instance jobinstanceid="399342" jobname="somejob2" applicationname="someapp"/>
  <job_instance jobinstanceid="399343" jobname="somejob3" applicationname="someapp"/>
...
  <job_instance jobinstanceid="400004" jobname="somejob4" applicationname="someapp"/>
</dataset>

Exporting / importing an entire database

<project name="mydbunit" basedir=".">
    <taskdef name="dbunit" classname="org.dbunit.ant.DbUnitTask"/>

    <loadproperties srcFile="dbunit.properties"/>

    <fail unless="dbunit.driver"/>
    <fail unless="dbunit.url"/>
    <fail unless="dbunit.userid"/>
    <fail unless="dbunit.password"/>
    <fail unless="dbunit.datatypeFactory"/>

    <property name="dbunit.format" value="flat"/> <!-- Possible values are "flat", "xml", "csv", "dtd", "xls". Defaults to "flat" -->

    <target name="export">
        <fail unless="dbunit.dest"/>
        <fail unless="dbunit.format"/>

        <dbunit driver="${dbunit.driver}" url="${dbunit.url}" userid="${dbunit.userid}" password="${dbunit.password}">
            <dbconfig>
                <property name="datatypeFactory" value="${dbunit.datatypeFactory}" />
                <!-- <property name="escapePattern" value="`?`" /> -->
            </dbconfig>
            <export dest="${dbunit.dest}" format="${dbunit.format}" ordered="true"/>
        </dbunit>
    </target>

    <target name="import">
        <fail unless="dbunit.src"/>
        <fail unless="dbunit.format"/>

        <dbunit driver="${dbunit.driver}" url="${dbunit.url}" userid="${dbunit.userid}" password="${dbunit.password}">
            <dbconfig>
                <property name="datatypeFactory" value="${dbunit.datatypeFactory}" />
                <!-- <property name="escapePattern" value="`?`" /> -->
            </dbconfig>
            <operation type="CLEAN_INSERT" src="${dbunit.src}" format="${dbunit.format}" ordered="true"/>
        </dbunit>
    </target>

</project>

To export:

$ ant export -Ddbunit.dest=mydb.xml

To import:

$ ant import -Ddbunit.src=mydb.xml

Note that you need the commons-collections3 JAR file on your classpath.

Other tasks

More operations are available through the DBUnit Ant task, such as:

  • Data comparison

  • INSERT or UPDATE based on the file
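These can be sketched with targets like the following; the target names and the dbunit.expected property are assumptions for illustration, not part of the scripts above:

```xml
<!-- Sketch: compare live data against an expected dataset file.
     The dbunit.expected property is assumed; the rest reuses the
     properties defined in the scripts above. -->
<target name="compare">
    <dbunit driver="${dbunit.driver}" url="${dbunit.url}"
            userid="${dbunit.userid}" password="${dbunit.password}">
        <compare src="${dbunit.expected}" format="${dbunit.format}">
            <query name="${dbunit.query}" sql="${dbunit.sql}"/>
        </compare>
    </dbunit>
</target>

<!-- Sketch: INSERT rows from a dataset file without cleaning the table
     first (use type="UPDATE" to update existing rows instead). -->
<target name="insert">
    <dbunit driver="${dbunit.driver}" url="${dbunit.url}"
            userid="${dbunit.userid}" password="${dbunit.password}">
        <operation type="INSERT" src="${dbunit.src}" format="${dbunit.format}"/>
    </dbunit>
</target>
```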


Deploying an application to WildFly with Ant + Cargo


Posted on Wednesday Mar 04, 2015 at 04:49PM in WildFly


I created an Ant script which deploys an application to WildFly through an ssh tunnel. It creates the tunnel with Ant's sshsession task, and the script works well as a Jenkins job.
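The tunnel part can be sketched roughly as follows, assuming the WildFly management port (9990) and hypothetical host, user, and key file; the Cargo deployment step itself is elided:

```xml
<!-- Sketch: open an ssh tunnel that stays up for the nested tasks.
     Host, username, keyfile and ports are assumptions for illustration. -->
<sshsession host="remote.example.com" username="deployer" trust="true"
            keyfile="${user.home}/.ssh/id_rsa">
    <localtunnel lport="9990" rhost="localhost" rport="9990"/>
    <sequential>
        <!-- run the Cargo deployment against localhost:9990 here -->
    </sequential>
</sshsession>
```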


Derby database backup script


Posted on Friday Feb 20, 2015 at 02:49PM in Technology


As with Roller and my virtual machine on VMware Fusion, I wrote another Ant script, this time backing up an Apache Derby database to automate backups of my Apache James server's data. The script also has a simple purge function, implemented as a target named purge. The intended environment is as follows:

  • Linux server

  • Accepts connection via ssh

  • Has the ij command available, a simple CLI JDBC frontend shipped with Derby

  • Requires the Derby instance to listen on a port

It works as follows:

  1. Invoke SYSCS_UTIL.SYSCS_FREEZE_DATABASE() with ij to freeze the database

  2. Create a tarball of the database

  3. Invoke SYSCS_UTIL.SYSCS_UNFREEZE_DATABASE() with ij to unfreeze the database

  4. Download the tarball

  5. Delete the tarball
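The steps above can be sketched roughly as the target below. All hosts, users, key files, and paths are assumptions, freeze.sql and unfreeze.sql are assumed ij script files containing the connect and SYSCS_UTIL calls, and error handling (unfreezing on failure) is omitted:

```xml
<!-- Sketch of the backup flow; every host, user, key and path is assumed. -->
<target name="backup">
    <!-- 1. Freeze the database with ij -->
    <sshexec host="derby.example.com" username="backup" trust="true"
             keyfile="${user.home}/.ssh/id_rsa" command="ij freeze.sql"/>
    <!-- 2. Create a tarball of the database directory -->
    <sshexec host="derby.example.com" username="backup" trust="true"
             keyfile="${user.home}/.ssh/id_rsa"
             command="tar czf /tmp/mydb.tar.gz -C /var/derby mydb"/>
    <!-- 3. Unfreeze the database -->
    <sshexec host="derby.example.com" username="backup" trust="true"
             keyfile="${user.home}/.ssh/id_rsa" command="ij unfreeze.sql"/>
    <!-- 4. Download the tarball -->
    <scp file="backup@derby.example.com:/tmp/mydb.tar.gz" todir="backups"
         trust="true" keyfile="${user.home}/.ssh/id_rsa"/>
    <!-- 5. Delete the tarball on the server -->
    <sshexec host="derby.example.com" username="backup" trust="true"
             keyfile="${user.home}/.ssh/id_rsa" command="rm /tmp/mydb.tar.gz"/>
</target>
```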

One drawback is that, due to the lack of streaming download in the sshexec task, it needs extra free space on the server. The script can be obtained from my GitHub repository.


Automating backup of Roller


Posted on Sunday Feb 01, 2015 at 11:17AM in Technology


I wrote another Ant script to back up the Apache Roller database and data directory. It is intended for PostgreSQL on a Linux server; see my GitHub repository for more information.

At first I implemented it with the sshexec task, using its output attribute to download the backup stream to a local file, but the file ended up corrupted. As far as I found out, the sshexec task collects the output stream in a ByteArrayOutputStream and then converts the byte array to a String. I think that is unfortunate, since the stream could simply be redirected to a local file. So I changed the implementation to dump the data to a temporary file on the remote server and then download it with the scp task. I know this is inefficient and consumes as much free space as the data itself. Implementing stream redirection in the sshexec task, or changing the script to use the ssh command directly, would be a solution.
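The workaround boils down to something like this sketch; the database name, hosts, users, key files, and paths are all assumptions:

```xml
<!-- Sketch of the dump-then-download workaround; all names are assumed. -->
<target name="backup-roller">
    <!-- Dump to a temporary file on the remote server instead of streaming -->
    <sshexec host="roller.example.com" username="backup" trust="true"
             keyfile="${user.home}/.ssh/id_rsa"
             command="pg_dump rollerdb &gt; /tmp/rollerdb.sql"/>
    <!-- Download the dump with scp, then clean up -->
    <scp file="backup@roller.example.com:/tmp/rollerdb.sql" todir="backups"
         trust="true" keyfile="${user.home}/.ssh/id_rsa"/>
    <sshexec host="roller.example.com" username="backup" trust="true"
             keyfile="${user.home}/.ssh/id_rsa" command="rm /tmp/rollerdb.sql"/>
</target>
```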


Automating daily cold backup of a virtual machine


Posted on Saturday Jan 31, 2015 at 01:24PM in Technology


I have a virtual machine which runs on VMware Fusion 6 and executes some batch jobs every weekday. I wrote a backup script for it. It does some tedious work such as mounting and unmounting the backup volume and stopping and restarting the VM, then simply copies the .vmware directory into the mounted directory. A purge function is also available. See my GitHub repository for details.

First, I wrote the script with tmutil, which manipulates Time Machine from the CLI, but it excluded virtual machines even though the exclusion list was empty, so I stopped using it. The script simply copies the files with Ant's copy task instead.
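The stop-copy-restart core can be sketched as below, driving VMware Fusion's vmrun CLI from Ant's exec task; the vmrun path, the .vmx path, and the directories are assumptions:

```xml
<!-- Sketch: stop the VM, copy its directory, restart it.
     The vmrun location and all paths are assumed for illustration. -->
<target name="backup-vm">
    <property name="vmrun"
              value="/Applications/VMware Fusion.app/Contents/Library/vmrun"/>
    <exec executable="${vmrun}" failonerror="true">
        <arg value="stop"/>
        <arg value="/vms/batch.vmwarevm/batch.vmx"/>
        <arg value="soft"/>
    </exec>
    <copy todir="/Volumes/backup/batch.vmwarevm">
        <fileset dir="/vms/batch.vmwarevm"/>
    </copy>
    <exec executable="${vmrun}" failonerror="true">
        <arg value="start"/>
        <arg value="/vms/batch.vmwarevm/batch.vmx"/>
        <arg value="nogui"/>
    </exec>
</target>
```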

As for snapshots, I excluded them from the options because VMware does not recommend them for production use ("VMware does not recommend running production virtual machines off snapshots on a permanent basis"). Additionally, they say that AutoProtect should only be used in testing environments. See http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1014509