
18 May 2019

Thoughts on Type Variance


In Java, arrays are covariant (an array of a subtype can be assigned to an array reference of its supertype):

Integer[] arr = {1, 2, 3};
Object[] objArr = arr; // allowed because of being covariant
objArr[0] = "Hello"; // runtime error: ArrayStoreException

But this comes at a cost: it might fail at runtime (as in the last line).

In Kotlin, arrays are invariant:

val arr = arrayOf(1, 2, 3)
val objArr: Array<Any> = arr // compilation error

So, invariance doesn't allow List<Integer> to substitute for List<Object>, nor vice versa.

However, covariance allows List<Integer> to substitute for List<Object>, but not vice versa.
And contravariance allows List<Object> to substitute for List<Integer>, but not vice versa.


In Kotlin, Lists (immutable lists) are covariant, so the following works:
val list = listOf(1, 2, 3)
val objList: List<Any> = list

Because Lists are immutable (read-only), we can't fail as in the Java array case.
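In Java generics you opt in to covariance explicitly with an upper-bounded wildcard. A minimal sketch (the class and method names are mine, just for illustration):

import java.util.List;

public class CovarianceDemo {

    // List<? extends Number> accepts a List of Number or any of its subtypes (covariant use)
    static double sum(List<? extends Number> numbers) {
        double total = 0;
        for (Number n : numbers) {
            total += n.doubleValue();
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> ints = List.of(1, 2, 3);
        System.out.println(sum(ints)); // List<Integer> substitutes for List<? extends Number>
        // Writing through the wildcard view (e.g. numbers.add(1)) would not compile,
        // which is exactly what keeps this safe, unlike the array case above.
    }
}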

In contravariance, the supertype can substitute for the subtype (which might seem illogical at first),
but look at this (fictional example):

operateOnIntegers(Integer i) {
i.setValue(50)
}

Number n = 30
operateOnIntegers(n)

Because operateOnIntegers expects a narrower type than the one passed (it expects Integer but was passed a Number),
it will never write a value into it that exceeds the narrower type's limits.
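A concrete Java sketch of the same idea using a lower-bounded wildcard (names are mine, for illustration only): anything that can consume any Number can certainly consume an Integer, so a Consumer<Number> can stand in wherever a consumer of Integer is needed.

import java.util.function.Consumer;

public class ContravarianceDemo {

    // Consumer<? super Integer> accepts a consumer of Integer or any of its supertypes (contravariant use)
    static void feedIntegers(Consumer<? super Integer> consumer) {
        consumer.accept(1);
        consumer.accept(2);
    }

    public static void main(String[] args) {
        Consumer<Number> printNumber = n -> System.out.println("number: " + n);
        feedIntegers(printNumber); // Consumer<Number> substitutes for Consumer<? super Integer>

        Consumer<Object> printObject = o -> System.out.println("object: " + o);
        feedIntegers(printObject); // Object is also a supertype of Integer
    }
}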



02 March 2019

Spring Security multiple authentication providers


In your WebSecurityConfigurerAdapter you will need to register more than one authentication provider:

@Override
protected void configure(AuthenticationManagerBuilder auth) throws Exception {
    auth.authenticationProvider(new MyFirstAuthenticationProvider(userRepository, bCryptPasswordEncoder()));
    auth.authenticationProvider(new MySecondAuthenticationProvider(userRepository, bCryptPasswordEncoder()));
}

Then create MyFirstAuthenticationProvider and MySecondAuthenticationProvider like this:

public class MyFirstAuthenticationProvider extends DaoAuthenticationProvider {

    public MyFirstAuthenticationProvider(UserRepository userRepository,
                                              BCryptPasswordEncoder bCryptPasswordEncoder) {

        super.setPasswordEncoder(bCryptPasswordEncoder);
        super.setUserDetailsService(......);
    }

    @Override
    public boolean supports(Class<?> authentication) {
        return MyFirstAuthenticationToken.class.isAssignableFrom(authentication);
    }
}


public class MyFirstAuthenticationToken extends UsernamePasswordAuthenticationToken {

    public MyFirstAuthenticationToken(UserEntity principal, Object credentials,
                                                   Collection<? extends GrantedAuthority> authorities) {
        super(principal, credentials, authorities);
    }
}
And the same for MySecondAuthenticationProvider.

You will need to use the authentication providers/tokens in the authentication/authorization filters, as sketched below.
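A minimal sketch of such a filter. The login URL, the request parameter names, and the assumption that MyFirstAuthenticationToken also exposes a simple principal/credentials constructor are mine; the point is that wrapping the credentials in MyFirstAuthenticationToken is what makes the AuthenticationManager route the request to MyFirstAuthenticationProvider via its supports() method.

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.springframework.security.authentication.AuthenticationManager;
import org.springframework.security.core.Authentication;
import org.springframework.security.core.AuthenticationException;
import org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter;
import org.springframework.security.web.util.matcher.AntPathRequestMatcher;

public class MyFirstAuthenticationFilter extends AbstractAuthenticationProcessingFilter {

    public MyFirstAuthenticationFilter(AuthenticationManager authenticationManager) {
        super(new AntPathRequestMatcher("/first/login", "POST")); // hypothetical login URL
        setAuthenticationManager(authenticationManager);
    }

    @Override
    public Authentication attemptAuthentication(HttpServletRequest request, HttpServletResponse response)
            throws AuthenticationException {
        String username = request.getParameter("username");
        String password = request.getParameter("password");
        // The token type decides which provider handles it (see supports() above).
        // Assumes MyFirstAuthenticationToken has a (principal, credentials) constructor.
        return getAuthenticationManager().authenticate(
                new MyFirstAuthenticationToken(username, password));
    }
}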


07 December 2017

Very important notes about Spring @Transactional

Spring @Transactional is ignored in the following cases:

1. When the caller method calls the @Transactional-annotated method from within the same class

2. When the annotated method is not public

Also, @Transactional by default doesn't roll back for checked exceptions.


class A {
     void caller() {
            doInTransactionMethod(); // @Transactional is ignored (self-invocation)
     }

    @Transactional // by default rolls back only for RuntimeExceptions
    public <return type> doInTransactionMethod(<params>) { // should be public as well
    }
}
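A hedged sketch of two common workarounds for these rules: calling the transactional method through the Spring proxy (self-injection), and opting in to rollback for checked exceptions via rollbackFor. Adjust to your own beans; this is not the only way to do it.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
class A {

    @Autowired
    private A self; // the injected Spring proxy (self-injection), so the annotation is honored

    void caller() throws Exception {
        self.doInTransactionMethod(); // goes through the proxy: a transaction is started
    }

    @Transactional(rollbackFor = Exception.class) // also roll back on checked exceptions
    public void doInTransactionMethod() throws Exception {
        // ...
    }
}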

The problem is, I keep forgetting these 3 simple rules, so I am writing them down here to try not to forget them.

Also, here is a tweet that talks about the public-method limitation: https://twitter.com/mohewedy/status/1099781513888100352

18 November 2017

Goodbye Eclipse, welcome IntelliJ IDEA

I used IntelliJ IDEA about 4 years ago when I was working with Android, and then switched back to Eclipse. It felt like going back to the stone age.

What stopped me from continuing with IntelliJ is that I use IBM Jazz at work, which is not well supported, and everybody uses Eclipse, so I decided to follow the crowd.

Eclipse is a nice Java IDE, but it has some behaviours that I don't like, which drove me to migrate to IntelliJ IDEA.

Recently I wanted to create a JAX-RS REST application, but Eclipse insisted that it is an EJB project and kept adding the ejbModule source structure. At that point I felt that was it: Eclipse, I am breaking up with you :P

I am using the community edition of IntelliJ because I don't need the "Enterprise" features of an IDE (like creating a new Servlet or EJB, or deploying to JBoss or Tomcat); I just use shell scripts to do the deployment myself, and they work on both *nix and M$ Windows platforms :)

Still, I use Eclipse just for the powerful plugin written for Jazz source code management (until I figure out how to use Jazz from inside IntelliJ).

And for green-field projects, I use Spring Boot of course (when I have the option :) )

03 November 2017

Deploy your maven project to the public repository

1. Update your maven pom.xml to look like https://github.com/mhewedy/sftp-utils/blob/master/pom.xml by adding the following:


<name>sftp-utils</name>
<description>SFTP Utils</description>
<url>https://github.com/mhewedy/sftp-utils</url>

<licenses>
    <license>
        <name>Apache License Version 2.0</name>
        <url>http://www.apache.org/licenses/LICENSE-2.0.txt</url>
    </license>
</licenses>

<developers>
    <developer>
        <name>Muhammad Hewedy(mhewedy)</name>
        <email>mhewedy@gmail.com</email>
        <organization>mhewedy</organization>
        <organizationUrl>https://github.com/mhewedy</organizationUrl>
    </developer>
</developers>

<scm>
    <connection>scm:git:git@github.com:mhewedy/sftp-utils.git</connection>
    <developerConnection>scm:git:git@github.com:mhewedy/sftp-utils.git</developerConnection>
    <url>git@github.com:mhewedy/sftp-utils.git</url>
</scm>


And

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-javadoc-plugin</artifactId>
    <executions>
        <execution>
            <id>attach-javadocs</id>
            <goals>
                <goal>jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-source-plugin</artifactId>
    <executions>
        <execution>
            <id>attach-sources</id>
            <goals>
                <goal>jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-gpg-plugin</artifactId>
    <executions>
        <execution>
            <id>sign-artifacts</id>
            <phase>verify</phase>
            <goals>
                <goal>sign</goal>
            </goals>
        </execution>
    </executions>
</plugin>

<plugin>
    <groupId>org.sonatype.plugins</groupId>
    <artifactId>nexus-staging-maven-plugin</artifactId>
    <version>1.6.3</version>
    <extensions>true</extensions>
    <configuration>
        <serverId>ossrh</serverId>
        <nexusUrl>https://oss.sonatype.org/</nexusUrl>
        <autoReleaseAfterClose>true</autoReleaseAfterClose>
    </configuration>
</plugin>

And change the values according to your project.
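The ossrh serverId above is typically matched by credentials in your ~/.m2/settings.xml. A hedged sketch (the username/password placeholders stand for your Sonatype account, they are not real values):

<settings>
    <servers>
        <server>
            <id>ossrh</id>
            <username>your-sonatype-username</username>
            <password>your-sonatype-password</password>
        </server>
    </servers>
</settings>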

Then create a ticket like this: https://issues.sonatype.org/browse/OSSRH-35670

Then, after it is resolved as Fixed, execute from the project directory:

mvn clean deploy

and check https://oss.sonatype.org/content/groups/staging/ and https://oss.sonatype.org/content/groups/public for the artifacts, for example:
https://oss.sonatype.org/content/groups/public/com/github/mhewedy/sftp-utils/


02 March 2017

Using spwrap with Spring boot

spwrap is a tiny framework that makes calling stored procedures from Java code an easy and fun task.

I've written a couple of posts about it.

Today, I am gonna show you how to use it in a Spring Boot project.

The main point I want to emphasize here is to register the DAO interfaces as Spring beans so that they can be reused across your application:

@Configuration
public class Config {

    @Autowired
    private DataSource dataSource;

    @Bean
    public DAO dao(){
        return new DAO.Builder(dataSource)
                .config(new spwrap.Config().useStatusFields(false))
                .build();
    }

    @Bean
    public CustomerDAO customerDAO(DAO dao){
        return dao.create(CustomerDAO.class);
    }
}
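With CustomerDAO registered as a bean, it can be injected anywhere in the application. A hedged usage sketch (the service class is mine; createCustomer is the method from the CustomerDAO interface shown in the spwrap introduction post further down):

import org.springframework.stereotype.Service;

@Service
public class CustomerService {

    private final CustomerDAO customerDAO; // injected because it is registered as a bean above

    public CustomerService(CustomerDAO customerDAO) {
        this.customerDAO = customerDAO;
    }

    public void register(String firstName, String lastName) {
        customerDAO.createCustomer(firstName, lastName); // ends up calling the stored procedure
    }
}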
See the github page https://github.com/mhewedy/spwrap-examples/tree/master/spring-boot-mysql for the complete project.

Learn more: https://github.com/mhewedy/spwrap/wiki/Using-with-Spring-Boot-other-DI-frameworks
Thanks to stored-procedure-proxy at: https://github.com/marschall/stored-procedure-proxy/wiki/Object-Lifetime

28 February 2017

Using spwrap to simplify calling stored procedures from java

spwrap is a tiny framework to simplify the call to stored procedures from Java code.

In this post, I'll implement the Columbian coffee example from the Java Tutorial at Oracle.

This example uses a MySQL database to store coffee and supplier data. See the install script at github.

The script installs 2 tables (suppliers and coffees), fills them with data, and creates 3 stored procedures:
show_suppliers: lists all coffee names with supplier names (takes no parameters and returns a result set)
get_supplier_of_coffee: gets the supplier name of a coffee (takes 1 input parameter and returns 1 output parameter)
raise_price: raises the price of a coffee (takes 3 input parameters and returns 1 output parameter)
Note: the original raise_price stored procedure takes 2 input parameters and 1 in/out parameter, but spwrap doesn't support INOUT parameters, so I split it into 1 input and 1 output parameter.

We will use spwrap to simplify the call to these 3 stored procedures.

First, we need to create a Java interface to represent these 3 stored procedures:
public interface ColumbianDAO {

    @StoredProc("SHOW_SUPPLIERS")
    List<SupplierCoffee> showSuppliers();

    @Scalar(VARCHAR)
    @StoredProc("GET_SUPPLIER_OF_COFFEE")
    String getSupplierOfCoffee(@Param(VARCHAR) String coffeeName);

    @Scalar(NUMERIC)
    @StoredProc("RAISE_PRICE")
    BigDecimal raisePrice(@Param(VARCHAR)String coffeeName,
                          @Param(FLOAT)float maximumPercentage,
                          @Param(NUMERIC) BigDecimal newPrice);
}
The interface contains 3 methods to represent the 3 stored procedures; the @StoredProc annotation is used to mark a method as a stored procedure.

The @Scalar annotation represents the return type of the stored procedure so that the output parameter is mapped correctly to the method return type.

For stored procedures that return their result as a result set, you need to provide a result set mapper to map the result set to your domain object (SupplierCoffee); here's the mapper implementation:
public class SupplierCoffee implements ResultSetMapper<SupplierCoffee> {
    private String supplierName, coffeeName;

    @Override
    public SupplierCoffee map(Result<?> result) {
        // convert the result into SupplierCoffee        
        SupplierCoffee supplierCoffee = new SupplierCoffee();
        supplierCoffee.supplierName = result.getString(1);
        supplierCoffee.coffeeName = result.getString(2);
        return supplierCoffee;
    }

    @Override
    public String toString() {
        return "SupplierCoffee{" +
                "supplierName='" + supplierName + '\'' +
                ", coffeeName='" + coffeeName + '\'' +
                '}';
    }
}
And now you can call the stored procedures using the following code:
DAO dao = new DAO.Builder("jdbc:mysql://localhost:3306/columbian", "root", "")
        .config(new Config().useStatusFields(false))
        .build();

ColumbianDAO columbianDAO = dao.create(ColumbianDAO.class);

List<SupplierCoffee> supplierCoffees = columbianDAO.showSuppliers();
supplierCoffees.forEach(System.out::println);

String coffee = "Colombian";
String supplier = columbianDAO.getSupplierOfCoffee(coffee);
System.out.printf("Supplier of the coffee '%s' is '%s'\n", coffee, supplier);

BigDecimal newPrice = columbianDAO.raisePrice(coffee, 0.10f, BigDecimal.valueOf(19.99));
System.out.printf("new price of '%s' is '%s'\n", coffee, newPrice);
Download the complete source code of the example at github.

If you want to know more about spwrap, visit the project page at github. If you like the project, please star it at github :)

22 February 2017

Using the Spock Framework to unroll dozens of test cases from a few of them

As part of working on spwrap, I am writing unit and integration tests for this tiny framework.

I've used the Spock framework to write some test cases on a project at my current company, and I find it a very handy and complete framework.

Because Spock is written in Groovy, it provides very dynamic features that are hard to achieve in other Java-based testing frameworks; at the very least, the syntax of other testing/mocking frameworks is not as good as Spock's.

Besides providing basic testing functionality, Spock offers what they call "interaction-based testing" (a.k.a. mocking) and one amazing feature called "data-driven testing".

In this post I'll talk about how I used both of them to write 8 test cases and get more than 150 unit tests generated.

One of the basic spwrap features is to let the user call stored procedures that return result sets and output parameters, and it is the user's responsibility to extract the data from these JDBC interfaces.

This mapping is done in mappers, where the user has to implement one of two interfaces: either ResultSetMapper or TypedOutputParamMapper.

Here is an example of a class that implements both interfaces:


public class Customer implements TypedOutputParamMapper<Customer>, ResultSetMapper<Customer> {

    private Integer id;
    private String firstName, lastName;

    @Override
    public Customer map(Result<?> result) {
        if (result.isResultSet()) {// for ResultSetMapper
            return new Customer(result.getInt(1), result.getString(2), result.getString(3));
        } else { // for TypedOutputParamMapper
            return new Customer(null, result.getString(1), result.getString(2));
        }
    }

The map function above has one parameter of type Result, which is a wrapper for both java.sql.ResultSet and java.sql.CallableStatement.

The Result class has two subclasses (ResultSetWrapper and CallableStatementWrapper) that delegate the call to ResultSet and CallableStatement respectively and re-throw SQLException as non-checked CallException.

Each of the wrapper classes (ResultSetWrapper and CallableStatementWrapper) has about 40 methods like getString, getBoolean, getByte, getShort, getInt, getLong, getFloat, etc.

So, I would need to write about 40 * 2 (1 success path and 1 failure path) * 2 (2 classes to test) ~= 160 test methods.

So let's see how we accomplish this using Spock:


def callableStatementWrapper
def callableStatementMock = Mock(CallableStatement)

@Shared METHOD_NAMES = ["getString", "getBoolean", "getByte", "getShort", "getInt",
                        "getLong", "getFloat", "getDouble", "getBytes", "getDate", "getTime",
                        "getTimestamp", "getObject", "getBigDecimal", "getRef", "getBlob", "getClob",
                        "getArray", "getURL"];
void setup() {
    callableStatementWrapper = new CallableStatementWrapper(callableStatementMock, 1)
}

def "calling #methodName(int) on CallableStatementWrapper calls the same method name on CallableStatement" (String methodName){
    when:
        callableStatementWrapper."$methodName"(1)
    then:
        1 * callableStatementMock./get.*/(_)
    where:
        methodName << METHOD_NAMES
}

The first line is just a definition of a reference that will hold the object we need to test, which is a CallableStatementWrapper.

The second line mocks java.sql.CallableStatement into a variable named callableStatementMock.

Then we have a static (shared) field holding an array of strings; these are the method names on CallableStatementWrapper that we need to test.

The setup method instantiates the CallableStatementWrapper using the callableStatementMock mocked object.

The test method does 3 important things:

1. In the when block we call callableStatementWrapper."$methodName"(1), i.e. some dynamic method name that we will supply later in the where block.

2. In the then block we say: expect 1 call on the mocked object to a getXXX method that takes any parameter (see interaction-based testing for more details).

3. In the where block, we substitute the methodName used in step 1 (the when block) with each method name from the static METHOD_NAMES array.

When we run this test, and because the class is annotated with @Unroll, we get about 20 test cases run for us.

See the test cases on github for CallableStatementWrapper and ResultSetWrapper.

Hope you find Spock and spwrap helpful!




19 February 2017

spwrap: Stored Procedure call wrapper

spwrap is a stored procedure caller; it lets you simply execute stored procedures from Java code.
Example:

public interface CustomerDAO {

    @StoredProc("create_customer")
    void createCustomer(@Param(VARCHAR) String firstName, @Param(VARCHAR) String lastName);

    @StoredProc("get_customer")
    Customer getCustomer(@Param(INTEGER) Integer id);   

    @StoredProc("list_customers")
    List<Customer> listCustomers();
}
public class Customer implements TypedOutputParamMapper<Customer>, ResultSetMapper<Customer> {

    private Integer id;
    private String firstName, lastName;

    public Customer() {
    }

    public Customer(Integer id, String firstName, String lastName) {
        super();
        this.id = id;
        this.firstName = firstName;
        this.lastName = lastName;
    }

    public Integer id() {
        return id;
    }

    public String firstName() {
        return firstName;
    }

    public String lastName() {
        return lastName;
    }

    @Override
    public Customer map(Result<?> result) {
        if (result.isResultSet()) {// for ResultSetMapper
            return new Customer(result.getInt(1), result.getString(2), result.getString(3));
        } else { // for TypedOutputParamMapper
            return new Customer(null, result.getString(1), result.getString(2));
        }
    }

    // for TypedOutputParamMapper
    @Override
    public List<Integer> getTypes() {
        return Arrays.asList(VARCHAR, VARCHAR);
    }
}
DAO dao = new DAO.Builder(dataSource).build();
CustomerDAO customerDao = dao.create(CustomerDAO.class);

customerDao.createCustomer("Abdullah", "Muhammad");
Customer abdullah = customerDao.getCustomer(0);
// ......
Learn more at github: https://github.com/mhewedy/spwrap 

10 January 2017

Easily switch between Java versions on Mac

Add the following alias definitions to your rc file (bashrc, zshrc or whatever):

alias java8='export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)'
alias java9='export JAVA_HOME=$(/usr/libexec/java_home -v 9)'


09 January 2017

CompletableFuture cheat sheet

I've divided the CompletableFuture methods into groups to make them easy for me to remember.

Besides the static factory methods that create a CompletableFuture instance, here are the groups:

method          parameter expression         similar to
// basic
run             () -> {}
accept          x -> {}
apply           x -> y                       map
compose         x -> f2<y>                   flatMap
// both
run             x, f2<?>, () -> {}
accept          x, f2<y>, (x, y) -> {}
combine         x, f2<y>, (x, y) -> z
// either
run             x, f2<?>, () -> {}
accept          x, f2<x>, (x) -> {}
apply           x, f2<x>, (x) -> y
// exceptions
exceptionally   x, (ex) -> x
whenComplete    x, (x, ex) -> {}
handle          x, (x, ex) -> y

We have 4 groups here: basic, both, either and exceptions, and each method above can have 3 versions (the main one, one that runs async, and a third that runs async with a custom user-provided Executor).

Preface

In functional programming, we have 4 kinds of functional interfaces (that mainly represent functions):

void fn()     => in Java called Runnable  (implementations of java.lang.Runnable)
void fn(T t)  => in Java called Consumers (implementations of java.util.function.Consumer)
U fn(T t)     => in Java called Functions (implementations of java.util.function.Function)
U fn()        => in Java called Suppliers (implementations of java.util.function.Supplier)

The last kind (Suppliers) is sometimes represented by java.util.concurrent.Callable, but Callable throws a checked Exception, so Supplier is more suitable.

With a completable future, the user can stream over the result and apply different kinds of operations to it (much as if you were using java.util.stream.Stream; this is why I think CompletableFuture should have some relation to the Stream interface, and I wrote a little more about this here), and these operations take one of the first 3 kinds of functional interfaces above.

Basic Group

The first group contains the basic methods, in which the parameter operates on the output of the completable future instance. For example, the apply method (thenApply) has the following signature:

<U> CompletableFuture<U> thenApply(Function<? super T,? extends U> fn)

This method applies the function passed as a parameter to the result of type T and returns a new object of type U (hence it is a typical mapping function).

Note, the thenApply function has three versions: the basic one, which runs on the same thread as the completable future that has just completed, and 2 more, one Async and the other Async with an Executor.

Similar to thenApply are thenRun (3 versions as well) and thenAccept (3 versions as well), but thenRun's parameter is a Runnable, which means it expects no input (the result of the completable future's execution) and returns no output.
thenAccept expects the completable future result as input but returns no output.

The last method in this group is thenCompose, which is more or less a flatMap function: it takes the result of the completable future as input and returns a CompletableFuture of another type (3 versions of this method as well):

public <U> CompletableFuture<U> thenCompose(Function<? super T,? extends CompletionStage<U>> fn)

Note, CompletableFuture implements CompletionStage.

A good use case for thenCompose is when we have a mapping function that returns a CompletableFuture: if we used the regular map function (thenApply), the output would be CompletableFuture<CompletableFuture<U>> (see the resources for an example).
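A small sketch of the difference (the priceOf lookup is a hypothetical async call, used only for illustration):

import java.util.concurrent.CompletableFuture;

public class BasicGroupDemo {

    // hypothetical async lookup, only for illustration
    static CompletableFuture<Integer> priceOf(String item) {
        return CompletableFuture.supplyAsync(() -> item.length() * 10);
    }

    public static void main(String[] args) {
        // thenApply: plain mapping, T -> U
        CompletableFuture<Integer> length =
                CompletableFuture.supplyAsync(() -> "coffee").thenApply(String::length);

        // thenApply with a CompletableFuture-returning mapper nests the futures...
        CompletableFuture<CompletableFuture<Integer>> nested =
                CompletableFuture.supplyAsync(() -> "coffee").thenApply(BasicGroupDemo::priceOf);

        // ...while thenCompose flattens them (the flatMap of CompletableFuture)
        CompletableFuture<Integer> price =
                CompletableFuture.supplyAsync(() -> "coffee").thenCompose(BasicGroupDemo::priceOf);

        System.out.println(length.join() + " " + price.join()); // nested is left unused on purpose
    }
}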

In this group we saw 4 operations that utilize the first 3 kinds of functional interfaces, and the other groups follow the same pattern.

Both Group

The basic group contains the main idea behind all of CompletableFuture's methods; if you understand it, it will be easy to understand the other groups.

The both group is all about executing the current completable future and another completable future (which comes as the first parameter to the method), and then doing one of 3 things with the results:

Ignore the results and return no output (run) {method runAfterBoth, with 3 versions}
Take the results and return no output (accept) {method thenAcceptBoth, with 3 versions}
Take the results and return a new output of a different type (combine, similar to apply in the basic group) {method thenCombine, with 3 versions}

For example, here's the signature of the thenCombine method:

<U,V> CompletableFuture<V> thenCombine(CompletionStage<? extends U> other, BiFunction<? super T,? super U,? extends V> fn)


The method combines the result of the current CompletableFuture with the result of the other CompletableFuture and sends them to the function, which takes T (the type of the current completable future) and U (the type of the other completable future) and returns a new type V.

Example usage for clarification:

CompletableFuture<String> current = .....
CompletableFuture<Integer> other = .....

current.thenCombine(other, (String s, Integer i) ->  0.99f);

Either Group

The either group is pretty much like the both group, but the function is called as soon as either one of the two futures completes.

The thing to note here is that the counterpart of combine is named apply. Why?

apply executes on the result of whichever future completes first, while combine takes the 2 results and returns a new result of a new type.

public <U> CompletableFuture<U> applyToEither(CompletionStage<? extends T> other, Function<? super T,U> fn)

So, the current completable future and the other completable future should be of the same generic type (T).
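A small sketch (the two futures here are trivial placeholders): both must produce the same type, and the function maps whichever result arrives first.

import java.util.concurrent.CompletableFuture;

public class EitherGroupDemo {
    public static void main(String[] args) {
        // both futures must share the same generic type (String here)
        CompletableFuture<String> primary = CompletableFuture.supplyAsync(() -> "from primary");
        CompletableFuture<String> mirror = CompletableFuture.supplyAsync(() -> "from mirror");

        // applyToEither maps whichever result arrives first
        CompletableFuture<Integer> firstLength = primary.applyToEither(mirror, String::length);
        System.out.println(firstLength.join());
    }
}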

Exceptions Group

The exceptions group contains operations that expect the completable future might complete with an exception and deal with it:

exceptionally: registers a function that says what to do if an exception is thrown (to return some fallback value)
whenComplete: registers a consumer of the result (might be null) or the exception (might be null); the two are mutually exclusive
handle: same as whenComplete but registers a function instead of a consumer (to return some value of a new type)

Simple example of handle:

CompletableFuture.supplyAsync(() -> 10).handle((x, ex)-> "hello" );

Note: although whenComplete takes a consumer, the future it returns still completes with the original result or the original exception.
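A small sketch of exceptionally and handle (the thrown exception is artificial, just for illustration):

import java.util.concurrent.CompletableFuture;

public class ExceptionsGroupDemo {
    public static void main(String[] args) {
        // exceptionally supplies a fallback value when the stage completes exceptionally
        CompletableFuture<Integer> recovered = CompletableFuture
                .<Integer>supplyAsync(() -> { throw new IllegalStateException("boom"); })
                .exceptionally(ex -> -1);

        // handle sees both the (possibly null) result and the (possibly null) exception
        CompletableFuture<String> described = recovered
                .handle((value, ex) -> ex == null ? "value=" + value : "failed: " + ex);

        System.out.println(recovered.join()); // -1
        System.out.println(described.join()); // value=-1
    }
}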


resources:

https://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletionStage.html
http://www.nurkiewicz.com/2013/05/java-8-definitive-guide-to.html




12 October 2016

Using Java 8 Optional to reduce null checking


I am a big fan of the new Streams API in Java 8, and I hope it gets used everywhere in the language.

For example, instead of the CompletableFuture interface having a method named thenApply, I wish the name were "map"; and for a method like thenAccept, I wish the name were "forEach".

The point is, the thenApply method in the CompletableFuture interface is more or less a mapping operation that takes the output of the step before it as input and generates a new output.

Also, thenAccept is a kind of consumer operation, like forEach, which will take the output of the previous step as input and consume it without generating any output.

And if we talk about the Optional object in Java 8, although it doesn't have many functional operations, it has enough to allow us to use it to reduce null checking.

The Optional object has map and filter (besides its regular methods such as the orElseXXX methods).

Here's one example of its usage:

Suppose we have this object structure returned from some webservice:

static class Result {
    Departement departement;
}

static class Departement {
    List<Employee> employees;
}

static class Employee {
    String name;
}

Then suppose our webservice returns the result as in one of these 4 functions:

static Result simulateWsResult1() {
    Employee e1 = new Employee();
    e1.name = "Ali";
    Employee e2 = new Employee();
    e2.name = "Muhammad";

    Departement dept = new Departement();
    dept.employees = Arrays.asList(e1, e2);

    Result result = new Result();
    result.departement = dept;

    return result;
}

static Result simulateWsResult2() {
    Employee e1 = new Employee();
    Employee e2 = new Employee();

    Departement dept = new Departement();
    dept.employees = Arrays.asList(e1, e2);

    Result result = new Result();
    result.departement = dept;

    return result;
}

static Result simulateWsResult3() {
    Departement dept = new Departement();

    Result result = new Result();
    result.departement = dept;

    return result;
}

static Result simulateWsResult4() {
    Result result = new Result();

    return result;
}

Then, the main method looks like:

public static void main(String[] args) {
    System.out.println(getEmployeeNames(simulateWsResult1()));
    System.out.println(getEmployeeNames(simulateWsResult2()));
    System.out.println(getEmployeeNames(simulateWsResult3()));
    System.out.println(getEmployeeNames(simulateWsResult4()));
}

And here's the parsing method: getEmployeeNames

static List<String> getEmployeeNames(Result r) {
    return Optional.ofNullable(r)
            .map(result -> result.departement)
            .map(dept -> dept.employees)
            .map(empList -> empList.stream())
            .map(empStream -> empStream.filter(e -> e.name != null).map(e -> e.name))
            .orElse(Stream.empty())
            .collect(Collectors.toList());
}

Complete code here.

Thanks.

30 September 2016

Gradle is awesome

Gradle, the build tool for Java and Android (and other languages), is really powerful.

I am brand new to gradle; this is the first time I've tried it, and I find it really powerful.

I am working, for fun, on a spring boot + angular project, and I decided to go with gradle because I like Groovy (the language gradle's DSL is based on; gradle itself is written in Java as well as Groovy).

However, Eclipse support for gradle is not as good as for Maven, but I started the spring boot project using gradle as the build tool and everything is fine.

Now I need to build the spring project as well as the angular project (which is based on angular-seed, which uses npm and bower).

I found a gradle plugin for this task (gradle-node-plugin); this plugin allows you to run:

gradle npm-install

So the npm install command will run, which is what I need, but I need it to run as part of the gradle build command.

First, the npm-install task by default runs against the package.json file found in src/main/java, so I had to write my own task to make it use the spring boot conventions (src/main/resources/static).

And thanks to the author of this plugin, who made it easy to extend, I wrote:

task my_npm_install(type: NpmTask) {
    description = "Installs dependencies from package.json"
    workingDir = file("${project.projectDir}/src/main/resources/static")
    args = ['install']
}

Here I am defining a new task (an enhanced task) of type NpmTask (defined in gradle-node-plugin) and then setting some properties (defined by the parent task type, NpmTask), so it can do the job.

So now I can run gradle my_npm_install and gradle will run npm install against the correct package.json file.

What remains is to have this task run when I run gradle build (the build task).

Thanks to the amazing task dependency feature provided by gradle, I can refer to a task (provided by the java plugin, I think) and make it depend on another task (the one I wrote).

Like this: build.dependsOn(my_npm_install)

Then when I run gradle build, the build task will run my_npm_install. Here's (part of) the output of the gradle build command:

:check
:nodeSetup SKIPPED
:my_npm_install
:build
.......
......

gradle runs the my_npm_install task before the build task.

Again, gradle is much more powerful and flexible than maven, and has a better syntax as well.
 

05 September 2016

shell script to import certificates into java cacerts

I am not the original author; I just made some small enhancements.



#! /bin/bash

if [ $# -eq 0 ]; then
    echo -e "usage: $0 <host>\nexample: $0 abc.com"
    exit -1
fi

KEYTOOL=../../bin/keytool
HOST=$1
PORT=443
KEYSTOREFILE=cacerts
KEYSTOREFILE_BKUP=$KEYSTOREFILE.`date '+%Y%m%d%H%M'`.'original'
KEYSTOREPASS=changeit

if [ ! -f $KEYSTOREFILE ]; then
    echo -e "You must run this script from the directory jdk/jre/lib/security"
    exit -1
fi

#backup the cacerts file
echo -e "\n\n**** BAKCING UP THE $KEYSTOREFILE TO $KEYSTOREFILE_BKUP ****\n\n"
cp $KEYSTOREFILE $KEYSTOREFILE_BKUP


# get the SSL certificate
echo -e "\n\n**** SAVING THE CERTIFCATE TO ${HOST}.cert ****\n\n"
openssl s_client -connect ${HOST}:${PORT} </dev/null \
    | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > ${HOST}.cert

echo -e "\n\n**** USING keytool AT $KEYTOOL ****\n\n"

# create a keystore and import certificate
echo -e "\n\n**** IMPORTING THE CERTIFICATE... ****\n\n"
"$KEYTOOL" -import -noprompt -trustcacerts \
    -alias ${HOST} -file ${HOST}.cert \
    -keystore ${KEYSTOREFILE} -storepass ${KEYSTOREPASS}

echo -e "\n\n**** PRINTING THE CERTIFICATE AFTER IMPORTED ... ****\n\n"
# verify we've got it.
"$KEYTOOL" -list -v -keystore ${KEYSTOREFILE} -storepass ${KEYSTOREPASS} -alias ${HOST} | grep --color=always $HOST

31 August 2016

callback on thread completion

The code illustrates how a callback can be invoked on thread completion:


import java.util.concurrent.Callable;
import java.util.function.BiConsumer;

class Executor {

  public <T> void execute(Callable<T> task, BiConsumer<T, ? super Exception> callback) {
     CallableThread<T> thread = new CallableThread<>(task, callback);
     thread.start();
  }

  class CallableThread<T> extends Thread {

     Callable<T> task;
     BiConsumer<T, ? super Exception> callback;

     public CallableThread(Callable<T> task, BiConsumer<T, ? super Exception> callback) {
        this.task = task;
        this.callback = callback;
     }

     public void run() {
        System.out.println("running task on thread : " + Thread.currentThread().getName());
        try {
            T t = task.call();
            callback.accept(t, null);
        } catch (Exception ex) {
            callback.accept(null, ex);
        }
     }
   }
 }

Caller:
System.out.println("running task on thread : " + Thread.currentThread().getName());

new Executor().execute(() -> "HELLO WORKD", (result, ex) -> {
    System.out.println("result: " + result);
    System.out.println("exception: " + ex);
});

System.out.println("finished running task on thread : " + Thread.currentThread().getName());


Output:
running task on thread : main
finished running task on thread : main
running task on thread : Thread-0
result: HELLO WORKD
exception: null

As the output shows, the Executor submits the Callable for execution on a new thread, and on completion it invokes the callback functional interface.