
Friday, March 06, 2020

Fly safe within limits with Flyway in a Spring Boot application . . .

Flyway seems more popular than Liquibase in the Java world. Coming back to Java after a few years of joy with Grails and its much more flexible database migration solution, the grails database-migration plugin (which has Liquibase under the covers), I certainly felt a little limited flying with Java and Flyway in my first couple of hours of exploring it.

Liquibase offers more flexibility through a ledger, a change-log XML file in which you define the order of your migration scripts. The Grails database-migration plugin enhances migration scripts, typically written in SQL, with added Groovy DSL support. Also, the change-log file can be in Groovy instead of XML. XML was once hot and is legacy now (except for Maven, where it's still modern). The Grails database-migration plugin offers the full power of dealing with database migrations, including full support for generating a base-level or starting migration script, incremental change scripts, a rollback mechanism etc. The documentation is also top notch.

With Flyway, you do not have that flexibility of controlling the order of migration scripts through a change-log-like ledger file. You have to follow version-embedded filename (SQL or Java) conventions. It is highly recommended to follow timestamp-based filename versioning. I am yet to explore its Java way of dealing with complex migrations, but I am sure it is not going to be as pleasing as working with database migrations in Grails projects with the expressive nature of Groovy code.
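Flyway's default convention for SQL scripts is V&lt;version&gt;__&lt;description&gt;.sql. With timestamp-based versions, a pair of hypothetical migration scripts (names made up for illustration) would look like:

```
V20200306091500__create_base_tables.sql
V20200309143000__add_index_to_my_domain.sql
```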

There are tons of articles comparing Flyway and Liquibase. This post is not a comparison, but an exploration of Flyway and JPA capabilities, approached with a Grails database-migration plugin mindset, in a Java-based Spring Boot project with JPA.

Environment: Java 13, Spring Boot 2.2.4.RELEASE, PostgreSQL 12.1, Maven 3.6.2 on macOS High Sierra 10.13.6

Generate BASE DDL

It is tempting to start hand-coding Flyway SQL scripts once your initial domain model is ready with JPA annotations. This is highly error prone and, in the process of initializing the DB with the schema and getting it validated against the model, disconnects your JPA-powered domain model from the DB. One way to keep them in sync is to generate DDL scripts from the model.

I prefer to have DDL scripts generated rather than hand-coded. JPA has this feature and Hibernate offers a decent implementation. This will give you a jumpstart with db migration scripts. You can take the generated script, copy and paste it into a Flyway migration script file, and polish it further. This way, your model gets verified through the generated script taken into the Flyway script and applied to the DB, and any discrepancies between the model and the DB can be avoided later in the game.

In order to get the DDL script generated, you need to make some run-time configuration changes for your local environment (the environment for which you need the DDL generated). There are three ways to do this (at least, these are the ways I've explored).

Option-1: Make changes to your environment properties/yml file as shown below:

bootstrap-local.yml
spring:
  jpa:
    properties:
      hibernate:
        # generating DDL - add me; Hibernate 5.1.0 onwards, the default end-of-SQL-statement delimiter is none in generated DDLs
        hbm2ddl.delimiter: ';'
      # generating DDL - add me
      javax:
        persistence:
          schema-generation:
            scripts:
              action: create
              create-target: create.sql
  flyway:
    # generating DDL - make sure I am turned off
    enabled: false

Run your app with the above changes, and you will have a create.sql file generated in the directory you run your app from. Examine and make any necessary changes to the generated DDL before copying it into the Flyway base SQL script.

Revert the changes done to your environment properties/yml file and bring up the application. Flyway should be flying happily taking the base DDL script file and applying it to your database.

Option-2: Set those properties on the maven command line (*fine print: for some reason, this option doesn't work consistently for me, and I am not at all happy with the Spring Boot Maven Plugin's documentation; you need to rely on extensive and tireless searching to find out how to get this done :( )

Alternatively, you can simply override those run-time config properties for your local env on the maven command line and get the DDL generated. This way you don't have to temporarily change your local run-time config file every time you need to generate DDL, and revert it afterwards. An example of running the maven wrapper command on the root project, when you have a spring-boot project (my-service-api) as one of the modules, is shown below:

./mvnw -pl my-service-api clean install spring-boot:run -Dspring-boot.run.profiles=local -DskipTests \
  -Dspring-boot.run.arguments=\
--spring.flyway.enabled=false,\
--spring.jpa.properties.javax.persistence.schema-generation.scripts.action=create,\
--spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=create.sql,\
--spring.jpa.properties.hibernate.hbm2ddl.delimiter=\;

In the above command, we basically have overridden four run-time config properties earlier shown in yml file for DDL generation:
  1) disabled Flyway
  2) specified schema-generation type
  3) specified the DDL file name to be generated
  4) specified the delimiter, the end-of-statement character for SQL statements generated in the DDL file.

All backslashes (\) are just shell line-breakers except the very last one to escape the end of statement delimiter character (;) in the generated DDL script.

If you are lucky, you will have a create.sql file generated in the directory you ran this command from. Examine the generated DDL before copying it into the Flyway base SQL script.

Simply bring up your application. Flyway should be flying happily taking the base DDL script file and applying it to your database.

Option-3 (My preferred option): Run with your runnable jar

Have a runnable jar created (typically under the target directory in your module). Simply bring up the application by passing all the properties to override on the command line. This way, you stay away from Maven and all the issues it brings along with it. An example is shown below:

For action create:
java --enable-preview -Dspring.profiles.active=local -jar <path/to/your/jar-file/executable/jar-file.jar> \
  --spring.flyway.enabled=false \
  --spring.jpa.properties.javax.persistence.schema-generation.scripts.action=create \
  --spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=create.sql \
  --spring.jpa.properties.hibernate.hbm2ddl.delimiter=\;

For action update:
java --enable-preview -Dspring.profiles.active=local -jar <path/to/your/jar-file/executable/jar-file.jar> \
  --spring.flyway.enabled=false \
  --spring.jpa.properties.javax.persistence.schema-generation.scripts.action=update \
  --spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=update.sql \
  --spring.jpa.properties.hibernate.hbm2ddl.delimiter=\;


Again, all backslashes (\) are just shell line-breakers except the very last one to escape the end of statement delimiter character (;) in the generated DDL script.

If you want to run it from IntelliJ instead of the command-line, set up a Run Configuration as shown below:


Incremental DDL changes

Once you have the base DDL Flyway script applied, as you progress with development, there will be changes to the domain model as it evolves. As and when your domain model goes through changes, you need to put corresponding Flyway SQL migration scripts in place.

I'VE NOT FOUND A WAY TO GET THIS DONE!

NOTE: Though I have not found an action like update authoritatively documented anywhere, I just tried it and it does work, generating something, though not very useful. All I tried was changing action from create to update and create-target from create.sql to update.sql.

If you have a previously generated create.sql/update.sql file hanging around and use the same file for incremental changes, it simply gets appended with the resulting incremental DDL statements. That is definitely not what you want. So, make sure that you delete it or use a different name.

Once you have the incremental DDL script, examine it, and copy it into a new Flyway script file. Bring up the app to have Flyway flying again, taking the newly added script and applying it to the database.
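For example, a polished incremental script might end up looking like this hypothetical one (filename and column invented purely for illustration):

```sql
-- V20200310101500__add_status_to_my_domain.sql
ALTER TABLE my_domain ADD COLUMN status varchar(20);
```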

Leverage JPA Annotations as much as you can in order to generate your DDL accurately

A good database schema design should have all data constraints applied. These constraints include primary key constraints, foreign key constraints, unique constraints etc. JPA offers annotations that can be leveraged in generating constraint-creation DDL commands as well.

PRIMARY KEY Constraint
public class MyDomain {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(nullable = false, updatable = false)
    private Long id;
    ...
}

The above JPA annotation generates the following DDL script:

CREATE TABLE my_domain (
    id SERIAL PRIMARY KEY,
    ...
);

When the type is SERIAL, PostgreSQL generates a table-specific sequence my_domain_id_seq, and with the IDENTITY generation strategy, this sequence is used both by the database and by JPA.
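Under the covers, per PostgreSQL's documentation, a SERIAL column is roughly shorthand for the following (shown here for the id column above):

```sql
CREATE SEQUENCE my_domain_id_seq;
CREATE TABLE my_domain (
    id integer NOT NULL DEFAULT nextval('my_domain_id_seq')
);
ALTER SEQUENCE my_domain_id_seq OWNED BY my_domain.id;
```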

UNIQUE KEY Constraint
@Table(
    uniqueConstraints = @UniqueConstraint(
        columnNames = {"prop1", "prop2"},
        name = "my_domain_p1_p2_uk"
    )
)
public class MyDomain {
    ...
    String prop1;
    String prop2;
}

The above JPA annotation generates the following DDL script:
ALTER TABLE my_domain ADD CONSTRAINT my_domain_p1_p2_uk UNIQUE (prop1, prop2);

FOREIGN KEY Constraint
public class MyDomain {
    ...
    @ManyToOne(fetch = FetchType.EAGER, optional = false)
    @JoinColumn(
        name = "my_prop_type_id",
        foreignKey = @ForeignKey(name = "my_domain_mpt_fk"),
        nullable = false,
        insertable = false,
        updatable = false
    )
    private MyPropType myPropType;
    ...
}

The above JPA annotation generates the following DDL script:
ALTER TABLE my_domain ADD CONSTRAINT my_domain_mpt_fk FOREIGN KEY (my_prop_type_id) REFERENCES my_prop_type;

TIPS

Get that missing Semicolon back

Without explicitly setting the property spring.jpa.properties.hibernate.hbm2ddl.delimiter=;, the generated DDL statements will not end with a semicolon. If you set it on the command line instead of in the env-specific application yml/properties file, make sure to escape the ; with \ as shown below:
spring.jpa.properties.hibernate.hbm2ddl.delimiter=\;
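Alternatively (my own note, not from any plugin documentation), single-quoting the whole argument in the shell works just as well as escaping the semicolon:

```shell
# single quotes keep the shell from treating ';' as a command separator,
# so no backslash escape is needed
arg='--spring.jpa.properties.hibernate.hbm2ddl.delimiter=;'
echo "$arg"
```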

Turn Flyway on/off

Flyway can be turned on/off by setting the property spring.flyway.enabled=true/false. It can be set either in application yml/properties files or on the command line when mvn/mvnw is run. I am not happy with overriding it on the maven command line, as it takes up my time with stupid errors that I do not want to break my head over anymore; use this option at your own discretion :)

Happy Coding!
Have a limited but safe flight with Flyway and Maven in a Spring Boot application!!

Monday, January 20, 2020

The Forgotten Maven . . .

It's been more than a decade since I used Maven, or Maven used me. I was quite impressed with its dependency management and convention-over-configuration features the first time I tried it and introduced it in a project. But it was not a long-lasting impression. Those were the days when Ant was the de-facto standard build system for Java projects, XML/XSD/XSLT were pleasing every developer, and there wasn't any other build system with dependency management capability.

Then I moved to a team where Ant + Ivy was the chosen standard in the Maven era. Huh... what a pain to go back to Ant and start writing build scripts in XML from scratch, with Ivy as the dependency manager and your own conventions & configurations for a project! Later, I was privileged at the same place to revamp the tech-stack for one of the new projects, changing from AccuRev to Git, Java to Java + Groovy, Struts to Spring MVC, WebLogic to Tomcat, no CI/CD to a full Jenkins CI/CD pipeline, and Ant + Ivy to Gradle. The last one, Ant + Ivy to Gradle, was a joyful great leap forward. I was very impressed with the depth of Gradle documentation. I started advocating it, popularizing the phrase borrowed from the Gradle docs: "Next Generation Build System". Then I chose and moved onto Grails projects. For about 5 years, I was living in a very happy world of Groovy, Grails and Gradle. My vision got better with no visual noise and clutter ;)

Back to the Future

Now I am back to the Java, Spring Boot tech space where Maven is the chosen standard build system. Whenever I open a pom.xml file in the IDE, my eyes suddenly get blurry and my fingers start to slide on the trackpad, making the screen scroll up and down. "Welcome back!", I say to myself, as it was my choice to move back to Java ;)

Lately, I was going through the steps for building and running an existing multi-module project (multi-project in Gradle). A step describing how to install a specific version of maven gave me pause. Also, to run maven goals (equivalent to gradle tasks) of a specific module (project in Gradle) in a multi-module project, I had to cd into it to run its goals. My immediate reaction was to google and explore these two features: 1) Multi-module builds 2) Maven wrapper. Going forward, I would apply these two to every maven-based project.

Environment: Java 13, Spring Boot 2.2.3, Maven 3.6.2 on MacOS High Sierra 10.13.6

1. Maven Wrapper (similar to Gradle wrapper)

Working with Gradle-based Grails projects, I am used to the Gradle wrapper, which is the preferred way to build and run your project without having Gradle, or a specific version of Gradle, installed. I was happy to find that something similar now exists in the Maven world. If one exists, why not use it? I started using it. Conceptually, it is very similar to the Gradle wrapper.

2. Multi-module maven build (similar to multi-project gradle build)

One of the pain points I had with a multi-module project was finding out how to run maven goals in the context of a specific module (project) from the root project directory. Finding a way to do this took a little more time than expected, even with stackoverflow around to readily help. If a maven expert reads this and says, "Hey stupid, this is so obvious and maven users already know how to do this even in their sleep.", I am not ashamed to take it with a smile ;)

Using Maven Wrapper in a Multi-module maven Project

In a multi-module maven project, apply the maven wrapper at the root project level. It generates a couple of command scripts (mvnw, mvnw.cmd) and a .mvn/wrapper dir used by the wrapper scripts to fetch, install and run maven if it is not present on your system.

With the following multi-project structure (my-service is the root project, my-service-api is a Spring Boot project):

.
└── my-service
    ├── my-service-api
    │   └── pom.xml
    ├── my-service-lib
    │   └── pom.xml
    └── pom.xml


Run the following wrapper plugin goal from the root project dir my-service (of course, to run this you need to have maven installed) to add maven wrapper support to your multi-module project:
mvn -N io.takari:maven:wrapper

The above command generates the wrapper-specific files and directory shown below:
.
└── my-service
    ├── .mvn
    │   └── wrapper
    │       ├── MavenWrapperDownloader.java
    │       ├── maven-wrapper.jar
    │       └── maven-wrapper.properties
    ├── mvnw
    ├── mvnw.cmd
    ├── my-service-api
    │   └── pom.xml
    ├── my-service-lib
    │   └── pom.xml
    └── pom.xml

All files added by the wrapper plugin should also be checked in and live in the repo along with the project files. Now, when new developers check out the project, they don't need to install maven to build and run it on their systems. They simply run maven goals from the root project as before, but instead of running the mvn command, which requires maven to exist, they happily run the ./mvnw wrapper script.

Maven multi-project builds can be run from either the root project or from a specific module. When run from the root, it builds all sub-modules. When run from a specific module, it just builds that module. The root level pom.xml defines modules and other common configurations available for all modules. A module specific pom.xml defines build configuration for that specific module.
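For instance, the root pom.xml of the hypothetical my-service project above would declare its modules along these lines (a sketch, not a complete pom):

```xml
<project>
    ...
    <!-- the root/aggregator pom uses pom packaging -->
    <packaging>pom</packaging>
    <modules>
        <module>my-service-api</module>
        <module>my-service-lib</module>
    </modules>
    ...
</project>
```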

Whether you have maven installed (without wrapper support), or wrapper support with maven not installed, you can run multi-module builds in 3 ways:
    1. Build from the root, which builds all modules (sub-projects).
    2. Build from the root, but a specific module's specific goal.
    3. Build from a specific module, cd into a specific module and run that module's specific goal.

1. Build from the root project

// from the root project
$ cd my-service

// with maven (installed)
my-service$ mvn clean install

// with maven wrapper (maven not installed)
my-service$ ./mvnw clean install

2. Build a specific module from the root project

// from the root project
$ cd my-service

// with maven (installed)
my-service$ mvn -pl my-service-api clean install spring-boot:run -Dspring-boot.run.profiles=local

// with maven wrapper (maven not installed)
my-service$ ./mvnw -pl my-service-api clean install spring-boot:run -Dspring-boot.run.profiles=local

-pl <sub-module-name-to-run-the-goals-for> is the key command option to know for this.
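A couple of variations of -pl worth knowing (module names from the hypothetical project above; these require a maven project to run against):

```
// build two modules in one run, from the root project
./mvnw -pl my-service-api,my-service-lib clean install

// select a module by :artifactId instead of its relative path;
// -am (--also-make) also builds the modules it depends on
./mvnw -pl :my-service-api -am clean install
```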

3. Build specific module's goal from the module

// from the module
$ cd my-service-api

// with maven (installed)
my-service-api$ mvn clean install spring-boot:run -Dspring-boot.run.profiles=local

// with maven wrapper (maven not installed)
my-service-api$ ../mvnw clean install spring-boot:run -Dspring-boot.run.profiles=local

TIP

mvn -h or mvn --help lists all command line options.

Here is how the help looks for -pl option:
-pl,--projects <arg>    Comma-delimited list of specified reactor projects
                        to build instead of all projects. A project can be
                        specified by [groupId]:artifactId or by its relative path

When I first looked at this help before searching further, I got fooled and was put off by the buzzword "reactor projects", until I found a how-to on stackoverflow :(

1. Newer versions of Maven - Wrapper support

With newer versions of Maven, adding the wrapper to a maven project is easy; just run the following commands from the project root folder:
mvn wrapper:wrapper    // add wrapper support to the project
./mvnw -v              // check maven version used by wrapper
./mvnw validate        // validate the project

To upgrade the maven wrapper to a newer version of maven, just run the following command and check in all modified files into your source repo:
./mvnw wrapper:wrapper -Dmaven=3.8.6    // upgrade maven wrapper to a newer version of maven
./mvnw -v                               // check maven version used by wrapper

2. Show Maven Version when maven commands are run using Maven wrapper

Add a file named maven.config under the .mvn directory of your project containing --show-version, and you will have both the Maven version and the Maven home directory displayed at the beginning of the build output.
# show Maven version, Maven home, Java version, and OS details
--show-version

3. Show Java Version when maven commands are run using Maven wrapper

Add a file named jvm.config under the .mvn directory of your project containing --show-version, and you will have the Java version displayed at the beginning of the build output.

Also, the following config entries in the above-mentioned jvm.config file are useful to show the date-time and thread name on build console log lines as well.

-Dorg.slf4j.simpleLogger.showDateTime=true
-Dorg.slf4j.simpleLogger.dateTimeFormat=HH:mm:ss
-Dorg.slf4j.simpleLogger.showThreadName=true
--show-version

Summary

After happily living in Groovy/Gradle/Grails world for about a decade, whether I like it or not, I am back to Java/Maven and I am using maven again. Oops, maven started using me again ;)


Sunday, July 15, 2018

Add Custom Scope to a Grails 2 Application . . .

Grails services are Spring-managed singleton beans by default. Singleton is one of the five different scopes (singleton, prototype, request, session, and globalSession) that the Spring framework offers for managed beans. Prior to Spring 2.5, there were only 2 standard scopes: singleton and prototype. Spring 2.5 added 3 additional scopes: request, session, and globalSession for use in web-based applications. Grails adds two more scopes to the mix: flow and flash.
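In a Grails 2 service, any of these scopes can be picked through the static scope convention; for instance (a hypothetical service, not from the app discussed below):

```groovy
class ReportService {
    // one new instance every time the bean is requested, instead of the default singleton
    static scope = 'prototype'
}
```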

Sometimes, you might run into a situation where none of these scopes meets your requirements. For instance, when you have a multi-tenant or multi-client application, what you may need is a client scope: a separate bean instance for each client. Recently, I ran into this situation. Addressing an existing performance issue drove me into it, making a use case for a custom scope.

We have a Grails 2.5.4 multi-client application with clientId included in URL mappings for end-points like /clients/$clientId/resource. There has been a performance issue with one of the end-points backed by a service. The service has a heavy-weight method which builds and caches client data, once for each client, with an expiration time set to expire the cache a few hours after it is built. It takes a few minutes to build this data due to its nature and some data-rule complexities. Once the data is built and cached, the method serves it from the cache really fast. One of the performance improvements identified upfront was to limit concurrent calls to that method. Obviously, there is no point in allowing concurrent builds for the same client. The solution to put in place was to allow one-and-only-one concurrent call for any given client, but allow concurrent calls across different clients, one per client. Making the method synchronized is an easy way to limit concurrent calls, but it only solves half of the problem: the service is a singleton bean, and a synchronized method uses this as the object lock. With that, concurrent calls get executed serially across all clients. But we need to allow concurrent executions for different clients, just not for the same client. It is still possible to achieve this with a synchronized method, but only when there is one service instance per client. This opened the need for a custom scope. Since Spring 2.0, the concept of scoping beans is extensible, and that is the way to add a custom scope.
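As an aside, the same one-lock-per-client behavior could also be sketched inside a singleton service with a lock map, without a custom scope. This is my own hypothetical sketch (class and method names made up), not the approach this post takes:

```java
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: per-client lock objects inside a singleton service
class ClientDataService {
    private final ConcurrentHashMap<String, Object> locks = new ConcurrentHashMap<>();

    String buildClientData(String clientId) {
        // computeIfAbsent hands every client its own lock object, so builds for
        // different clients run concurrently while builds for the same client serialize
        synchronized (locks.computeIfAbsent(clientId, k -> new Object())) {
            return expensiveBuild(clientId);
        }
    }

    // stand-in for the heavy, minutes-long cache build
    private String expensiveBuild(String clientId) {
        return "data-for-" + clientId;
    }
}
```

The custom-scope route below is still preferable here, since the service also carries per-client cached state, not just the lock.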

Spring maintains a cache of all scoped beans in its container. Every scope except prototype has its own in-memory cache of bean instances, which is initialized and populated when the application starts and is maintained by the Spring container. All Spring-managed beans either get instantiated or get proxies created during application startup. Obviously, prototype scoped beans don't need any kind of cache, as a new instance gets created every time the bean is requested or injected. A Grails service, being a singleton bean, is also stateless, and hence its methods can be executed by multiple concurrent threads.

Spring has good documentation of all these details, and the API is well-documented as well. For creating a custom scope, all we need to do is: 1) create a custom Scope object by implementing the org.springframework.beans.factory.config.Scope interface, 2) register the custom scope with the Spring container, and 3) scope the required beans at this custom scope. Sounds simple; in a Grails application it should be even simpler. Let's get through the step-by-step implementation details of adding a new custom scope, client scope, to a Grails 2 application.

Environment: Grails 2.5.4, Spring 4.1.8, Java 8 on MacOS High Sierra 10.13.5

Step-1 Implement custom Scope

This is straightforward. Just implement the interface and provide implementations for all essential methods. Remember, the implementation should also maintain its own cache of beans scoped at this custom scope. Also, make sure that any needed scoped context (in this case, clientId) is available and accessible in this implementation whenever a reference to a bean scoped at this custom scope is needed.

src/groovy/com/giri/grails/scope/ClientScope.groovy
package com.giri.grails.scope

import grails.plugin.springsecurity.SpringSecurityService
import grails.util.Holders
import groovy.util.logging.Log4j
import org.springframework.beans.factory.ObjectFactory
import org.springframework.beans.factory.config.Scope

/**
 * Custom scope bean for client-scoped services registered in resources.groovy.
 *
 * All services (Spring beans) that need client-scope should define a static scope property like:
 * static scope = ClientScope.SCOPE_NAME
 *
 * @see resources.groovy
 *
 * @author gpottepalem
 * Created on July 15, 2018
 */
@Log4j
class ClientScope implements Scope {
    static final String SCOPE_NAME = 'clientScope'

    /**
     * Client scoped bean store.
     * A synchronized multi-thread-safe map of various beans scoped with {@link ClientScope#SCOPE_NAME}
     * Spring framework depends on this store for maintaining beans defined with this custom client-scope.
     * e.g. Two clients with one common service and two different services
     * [ client-1 : [
     *       'ClientService' : clientServiceObjectRef,
     *       'OtherService'  : otherServiceObjectRef
     *   ],
     *   client-2 : [
     *       'ClientService'    : clientServiceObjectRef,
     *       'SomeOtherService' : someOtherServiceObjectRef
     *   ]
     * ]
     */
    private Map<Integer, Map<String, Object>> clientScopedBeansMap = [:].asSynchronized()

    /**
     * Helper method, returns client-scoped beans for a given client.
     * @param clientId the client id
     * @return A map of client-scoped beans for the given client.
     */
    private Map<String, Object> getClientScopedBeans(Integer clientId) {
        if (!clientScopedBeansMap[clientId]) {
            clientScopedBeansMap[clientId] = [:].asSynchronized()
            log.debug "No client scoped beans found for client:$clientId, just created new map"
        }
        return clientScopedBeansMap[clientId]
    }

    /**
     * Helper method, returns clientId taking it from the authenticated user.
     * @return clientId of the current user logged in
     */
    private Integer getClientId() {
        (Holders.grailsApplication.mainContext.getBean('springSecurityService') as SpringSecurityService)
            .authentication?.clientId
    }

    @Override
    Object get(String name, ObjectFactory<?> objectFactory) {
        synchronized (this) {
            Integer clientId = getClientId()
            Map<String, Object> clientScopedBeans = getClientScopedBeans(clientId)
            if (!clientScopedBeans[name]) {
                clientScopedBeans[name] = objectFactory.object
                log.debug "Added new instance: ${clientScopedBeans[name]} for bean: $name for client:$clientId to the bean store"
            }
            return clientScopedBeans[name]
        }
    }

    @Override
    Object remove(String name) {
        Map<String, Object> scopedBeans = getClientScopedBeans(getClientId())
        return scopedBeans.remove(name)
    }

    @Override
    void registerDestructionCallback(String s, Runnable runnable) {
        // nothing to register
    }

    @Override
    Object resolveContextualObject(String s) {
        return null
    }

    @Override
    String getConversationId() {
        return SCOPE_NAME
    }
}

Step-2 Register custom scope

Register custom scope in:
grails-app/conf/spring/resources.groovy
import com.giri.grails.scope.ClientScope
import org.springframework.beans.factory.config.CustomScopeConfigurer

beans = {
    ...
    // Custom scope: per-client
    clientScope(ClientScope)

    // register all custom scopes
    customScopeConfigurer(CustomScopeConfigurer) {
        scopes = [(ClientScope.SCOPE_NAME) : ref('clientScope')].asImmutable()
    }
    ...
}

Step-3 Scope a client-specific service with custom scope

Say we have a service ClientService that we need to scope at clientScope. Just define a static scope property set to this custom scope like:
grails-app/services/com/giri/ClientService.groovy
package com.giri

import com.giri.grails.scope.ClientScope

class ClientService {
    static scope = ClientScope.SCOPE_NAME

    def clientDataBuilderService // DI

    // delegates to clientDataBuilderService
    synchronized Map buildClientData(String clientId) {
        clientDataBuilderService.buildData(clientId)
    }
    ...
}

Step-4 Custom-scoped service - Dependency Injection

Grails supports Spring dependency injection by convention: a property name that matches the class name of a Spring-managed bean gets injected automatically. Unlike in plain Spring applications, you don't need the @Autowired annotation on a property or a setter method. But for custom scoped beans, the actual scoped bean gets instantiated lazily, when the client context (in this case, clientId) is available in the application (taken from the request, session, security authentication etc.). This context is not known at the start of the application. So, without creating proxies, dependency injection may not be possible. This requires the scoped bean to be programmatically resolved, rather than auto-wired by Grails convention. The following is a way to get a handle to the scoped bean instance. The assumption here is that the context needed for creating a scoped bean for a specific client (clientId) is available in the security context, and that all end-points are secured.

grails-app/controllers/com/giri/ClientController.groovy
package com.giri

import grails.converters.JSON
import grails.util.Holders

class ClientController {
    ...
    def index(String clientId) {
        ClientService clientService = Holders.grailsApplication.mainContext.getBean('clientService', ClientService)
        render clientService.buildClientData(clientId) as JSON
    }
    ...
}

TIP: Unit Testing

With the scoped bean not dependency-injected by the Grails naming convention for DI, and referred to via the Grails ApplicationContext instead, we run into a limitation in unit tests: the actual scoped bean instance is needed, which could otherwise have been mocked if it were injected. Typically, Grails doesn't load bean definitions in unit tests, as the complete ApplicationContext is not needed. The following are two options Grails offers to get around this and have bean definitions loaded and the ApplicationContext available in unit tests:

Option-1
test/unit/com/giri/ClientControllerSpec.groovy
package com.giri

import grails.test.mixin.TestFor
import spock.lang.Specification

@TestFor(ClientController)
class ClientControllerSpec extends Specification {
    // loads beans defined in resources.groovy; beans are available in applicationContext
    static loadExternalBeans = true

    ClientService clientService

    def setup() {
        //controller.clientService = Mock(ClientService) //doesn't work
        clientService = applicationContext.getBean('clientService')
        clientService.clientDataBuilderService = Mock(ClientDataBuilderService)
    }

    void "test index"() {
        when:
        controller.index('client-1')

        then:
        (1.._) * clientService.clientDataBuilderService.buildData('client-1') >> ['abcd' : 1234]

        and:
        response.json == [ "abcd": 1234 ]
    }
}

Option-2
package com.giri

import grails.test.mixin.TestFor
import spock.lang.Specification

@TestFor(ClientController)
class ClientControllerSpec extends Specification {
    ClientService clientService

    def setup() {
        //controller.clientService = Mock(ClientService) //doesn't work
        //define the bean to get it into applicationContext, as it is not injected, to mock it out
        defineBeans {
            clientService(ClientService)
        }
        clientService = applicationContext.getBean('clientService')
        clientService.clientDataBuilderService = Mock(ClientDataBuilderService)
    }

    void "test index"() {
        when:
        controller.index('client-1')

        then:
        (1.._) * clientService.clientDataBuilderService.buildData('client-1') >> ['abcd' : 1234]

        and:
        response.json == [ "abcd": 1234 ]
    }
}

Summary

To add a custom scope to a Grails application, all you need to know is some Spring Framework details and Grails' integration with Spring. There is still one improvement that can be made to this solution: getting scoped beans injected by following the Grails convention. This particular use-case requires the client context (clientId) to be available for scoped-bean cache maintenance. That makes the case for proxies to be generated for these custom-scoped beans. A proxy bean needs to be generated and injected for the scoped bean at the start of the application. The proxy bean should be able to retrieve the actual target bean from the scoped cache and delegate method calls to that target object. This might simply need some additional Spring configuration, I guess. I left it out for now, to explore later.
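For the record, that improvement might look something like the following untested sketch in resources.groovy, using Spring's ScopedProxyFactoryBean (bean names are made up; I have not verified this in the app):

```groovy
import com.giri.grails.scope.ClientScope
import org.springframework.aop.scope.ScopedProxyFactoryBean

beans = {
    // the real bean, scoped at the custom client scope
    clientServiceTarget(com.giri.ClientService) { bean ->
        bean.scope = ClientScope.SCOPE_NAME
    }
    // singleton proxy injected by convention; resolves the client-scoped
    // target from the scope's cache on every method call
    clientService(ScopedProxyFactoryBean) {
        targetBeanName = 'clientServiceTarget'
        proxyTargetClass = true
    }
}
```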

References

Tuesday, September 05, 2017

Create a Secured Restful API App with Grails 3 and PostgreSQL - Step by Step: Part 5 of 5

Part 5: Assure REST & Publish your API

At the end of my last post, we had a RESTful application with an end-point fully implemented and secured. However, we had not written any unit/integration test-specs. In this post we will write an integration test-spec and mix it with REST Assured and Spring REST Docs to not only test the end-point but also generate and publish API documents.

Importance of API Documentation

API is the middleware now. By following standard principles, the behavior of an API can be consistent and predictable. Like any piece of code, APIs must be tested by all means, from writing test-cases to assuring their quality through QA. Ideally, when the application is assembled, API docs must also be generated, bundled into the artifact, and delivered together for deployment. That makes API docs a "living source of documents" and a common source of reference.

Spring REST Docs

There are a few popular frameworks/tools to generate API documentation. Each one takes its own approach and comes with its own benefits and drawbacks compared with the others. However, Spring IO has a project called Spring REST Docs that takes a uniquely different approach. Its approach is centered around testing and is combined with hand-written Asciidoctor templates to produce high-quality and maintainable API documentation. This approach definitely stands out as it promotes testing to its fullest.

With this test-centric approach, the API document is not only accurate and complete but also up-to-date: a living reference. API documents generated this way are always as accurate as the code-base is. Also, as the Spring framework is central to Grails, it is a natural fit for Grails applications.

Having said all that, let's add Spring REST Docs to Grails 3 project and assure it with REST Assured.

Environment: Grails 3.2.11, Java 1.8, Apache Tomcat 8.0.20, IntelliJ IDEA Ultimate 2017.2 on Mac OS X 10.11.5 (El Capitan)

Step 0: Upgrade application from Grails 3.1.6 to 3.2.11

When I started this multi-part series, Grails was at 3.1.6 and now it has advanced to 3.3.x. Just to catch up, I've upgraded this app from 3.1.6 to 3.2.11 (the latest on the 3.2.x branch). It was an easy upgrade as it is a simple RESTful application. All I had to do was bring the gradle.properties and build.gradle files up-to-date with 3.2.11.

Step 1: Add Spring REST Docs to the Project (build configuration)

At the end of my last post, we had a secured resource (Artist) and we tested its RESTful API for CRUD operations. That is a good enough resource for taking to the next level of generating & publishing its API. Spring REST Docs' Getting Started has a link to sample applications for reference. REST Assured Grails is the best bet and is the basis for us. As a first step, let's add Spring REST Docs support to the project as shown and described below:

Add Asciidoctor plugin.
build.gradle
plugins {
    ...
    id 'org.asciidoctor.convert' version '1.5.3'
}

Run the gradle tasks command and notice that the asciidoctor task gets added by the plugin.
$ ./gradlew tasks
...
Documentation tasks
-------------------
asciidoctor - Converts AsciiDoc files and copies the output files and related resources to the build directory.
groovydoc - Generates Groovydoc API documentation for the main source code.
...
javadoc - Generates Javadoc API documentation for the main source code.

Spring REST Docs' Build Configuration section has steps for Gradle build configuration. I will do this slightly differently, extending the build script for REST Docs support by separating the additional build configuration into its own build file, leveraging Gradle's script plugin concept. This way it is cleaner and brings some modularity to the build script.

Create a new build script file restdocs.gradle under the project's gradle dir and reference it in the main build.gradle file at the very bottom as shown below:
build.gradle
apply from: 'gradle/restdocs.gradle'

Let's populate restdocs.gradle as shown below. I will add comments into the build script to explain certain code blocks.
gradle/restdocs.gradle
buildscript {
    repositories {
        maven { url 'https://repo.spring.io/libs-snapshot' }
    }
}

repositories {
    maven { url 'https://repo.spring.io/libs-snapshot' }
}

//add extra user-defined properties to the project through ext block
ext {
    snippetsDir = file('build/docs/generated-snippets') //output dir of generated REST API doc snippets
    restDocsVersion = '2.0.0.BUILD-SNAPSHOT'
    restAssuredVersion = '2.9.0'
}

dependencies {
    testCompile "org.springframework.restdocs:spring-restdocs-core:$restDocsVersion"
    testCompile "org.springframework.restdocs:spring-restdocs-restassured:$restDocsVersion"
    testCompile "org.springframework.restdocs:spring-restdocs-asciidoctor:$restDocsVersion"
}

Now, just run the grails clean command. We will have spring-restdocs-core and spring-restdocs-restassured downloaded from the configured repositories.

Let's keep expanding this script.
//task to clean generated REST API docs snippets dir
task cleanSnippetsDir(type: Delete) {
    delete fileTree(dir: snippetsDir)
}

Run ./gradlew tasks and notice that there is a new task added under Other tasks like:
Other tasks
-----------
cleanIdeaWorkspace
cleanSnippetsDir
console

Configure test task as shown below:
test {
    dependsOn cleanSnippetsDir
    outputs.dir snippetsDir
}

Now run ./gradlew test -m or ./gradlew test --dry-run, which runs Gradle's test task in dry-run mode. It disables all tasks and shows the order in which they would be executed. In this case, we can now see our new task cleanSnippetsDir in the list, running before the test-case classes get compiled.

Remember, we got the asciidoctor task by adding the Gradle plugin as the very first step. We will customize it and specify that it depends on the integrationTest task. With this dependency, every time we run this task, it will have the integration tests run first. We want this kind of dependency because the approach REST Docs brings is to have REST API docs generated from the integration test cases. So, we need integration tests to run before docs are generated.

Having said that, let's customize that task as follows:
//Configure asciidoctor task provided by the Gradle asciidoctor plugin - https://github.com/asciidoctor/asciidoctor-gradle-plugin
asciidoctor {
    doFirst {
        //just print outputDir for reference during execution phase
        println "Running asciidoctor task. Check generated REST docs under: ${outputDir}"
    }
    dependsOn integrationTest
    logDocuments = true
    sourceDir = file('src/docs')
    inputs.dir snippetsDir
    separateOutputDirs = false
    attributes 'snippets': snippetsDir //configure snippets attribute for .adoc files
}

Step 2: Run tests and make them pass: grails test-app

Grails test-app runs both unit tests and integration tests.
I have not written any test specifications so far, but as part of creating domain objects using the grails create-domain-class command, I have a few Spock Specification unit-tests created, each with a default feature method "test something"(). All these default generated specifications are expected to fail to start with. I want to keep these tests around for the future but want to make them pass. An easy way is to annotate all those methods with Groovy's @NotYetImplemented annotation. It reverses the net result, making a test pass when it actually fails. That makes sense for an unimplemented test; once the test is actually implemented, it fails, forcing us to remove the annotation.

Spock's @PendingFeature is similar but was added only in Spock 1.1, and Grails 3.2.x comes with Spock 1.0. For now, we are all good with that wonderful annotation provided by Groovy. With this, we have all unit-tests passing.
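As an illustration (the spec class name and body here are made up, not from the project), flipping a generated-but-unimplemented feature method to passing looks like this; in the Groovy version bundled with Grails 3.2.x the annotation lives in the groovy.transform package:

```groovy
import groovy.transform.NotYetImplemented
import spock.lang.Specification

class ArtistSpec extends Specification {

    @NotYetImplemented  // inverts the result: a failing body makes the test pass
    void "test something"() {
        expect:
        false  // placeholder; fails on its own, so the annotation reports success
    }
}
```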

It's time now to write an integration test specification for our RESTful controller, ArtistController. Instead of writing a typical integration test-case, let's mix it with REST Assured and Spring REST Docs and get both testing and API-docs generation done in this phase.

Step 3: Assure REST by writing integration specification for RESTful Controller with a mix of REST Docs and a touch of REST assured.

Step 3a: Configure REST Assured testing framework (set up your test specification to generate documentation snippets)

The Spring REST Docs documentation has outlined these steps. Here is the gist of it:

The configuration of REST Assured is nothing but a request spec (RequestSpecification) built with RequestSpecBuilder, adding the documentation configuration as a filter to it.

Configure the REST Assured documentation output directory by declaring a restDocumentation field initialized with an instance of JUnitRestDocumentation and annotated with JUnit's @Rule annotation. This rule gets executed before and after each feature method. A custom output directory can be specified by passing a constructor argument. We specify a custom dir because, in the build file, we set the snippetsDir property slightly differently ('build/docs/generated-snippets') from the default ('build/generated-snippets').

Next, set up the RequestSpecification by adding a filter configured with the restDocumentation field initialized above as a JUnit Rule.

Here is how our test spec looks after this configuration:
src/integration-test/groovy/com/giri/ApiDocumentationArtistSpec
package com.giri

import geb.spock.GebSpec
import grails.plugins.rest.client.RestBuilder
import grails.test.mixin.integration.Integration
import grails.transaction.Rollback
import io.restassured.builder.RequestSpecBuilder
import io.restassured.specification.RequestSpecification
import org.junit.Rule
import org.springframework.restdocs.JUnitRestDocumentation

import static org.springframework.http.HttpStatus.*
import static org.springframework.restdocs.restassured3.RestAssuredRestDocumentation.documentationConfiguration

@Integration
@Rollback
class ApiDocumentationArtistSpec extends GebSpec {

    @Rule
    protected JUnitRestDocumentation restDocumentation = new JUnitRestDocumentation('build/docs/generated-snippets')

    private RequestSpecification documentationSpec

    def setup() {
        //set documentation specification
        this.documentationSpec = new RequestSpecBuilder()
            .addFilter(documentationConfiguration(this.restDocumentation))
            .build()
    }
    ...

Step 3b: Spockify, test RESTful end-point and get documentation snippets generated

With the above configuration, let's write a feature method to test a GET request to the /api/artists end-point. The following is a feature method added to the above specification, along with a static constant holding the relative end-point URL and an injected application port property. The port is required to override the default port (8080) of the REST Assured testing framework. Note that Grails starts the application on a random available port each time integration tests are run.

static final String ARTISTS_ENDPOINT = '/api/artists'

@Value('${local.server.port}')
protected int port
...
void "test and document GET request (index action) of end-point: /api/artists"() {
    given: ""
    RequestSpecification requestSpecification = RestAssured.given(this.documentationSpec)
        .accept(MediaType.APPLICATION_JSON_VALUE)
        .filter(
            RestAssuredRestDocumentation.document(
                'artists-list-example'
            )
        )

    when:
    def response = requestSpecification
        .when()
        .port(port)
        .get(ARTISTS_ENDPOINT)

    then:
    response.then()
        .assertThat()
        .statusCode(HttpStatus.OK.value())
}

The feature method name describes its intent. In this step, we are only testing the GET request of an end-point. We will add support for the document intent of this feature method in a later step.

With this, if you run grails dev test-app or grails -Dgrails.env=development test-app, the test will pass. Also, we will have the following six documentation snippets generated under build/docs/generated-snippets/artists-list-example directory:
 curl-request.adoc
 http-request.adoc
 http-response.adoc
 httpie-request.adoc
 request-body.adoc
 response-body.adoc

These are the snippet files to be included in the final API documentation. To see the contents, check http-response.adoc; it will contain the actual response received, as follows:
----
HTTP/1.1 200 OK
X-Application-Context: application:development:0
Content-Type: application/json;charset=UTF-8
Transfer-Encoding: chunked
Date: Tue, 29 Aug 2017 22:12:13 GMT
Content-Length: 148

[{"id":"90ff9ac4-b1c0-4495-94d5-1550f463561a","dateCreated":"08/29/2017","firstName":"Giridhar","lastName":"Pottepalem","lastUpdated":"08/29/2017"}]
----

Step 3c: Create asciidoctor (.adoc) source templates
Create the src/docs dir and create api-guide.adoc and artists.adoc files to start with. The api-guide.adoc is the main Asciidoctor template, which includes artists.adoc. The artists.adoc is the Asciidoctor template for the artists end-point.

Shown below is a portion of api-guide.adoc
= giri-api RESTful API Guide
Giridhar Pottepalem
:doctype: book
:icons: font
:source-highlighter: highlightjs
:toc: left
:toclevels: 4
:sectlinks:

[[overview]]
= Overview

[[overview-http-verbs]]
== HTTP Methods

giri-api API follows standard HTTP and REST conventions as closely as possible in its exposure of resources as end-points and use of HTTP methods (verbs).
...

[[resources]]
= Resources

include::artists.adoc[]

And portions of artists.adoc are shown below for creating an Artist (POST request/save action):
[[resources-artists]]
== Artists

An Artist is a resource which represents an Artist.

[[resources-artists-create]]
=== Creating an Artist

A `POST` request is used to create a new Artist.

TIP: An Artist can be created only by an Admin user (with role `ROLE_ADMIN`)

IMPORTANT: Once a new Artist is created...

==== Request structure
include::{snippets}/artists-create-example/request-fields.adoc[]

==== Example request
include::{snippets}/artists-create-example/curl-request.adoc[]

==== Response structure
include::{snippets}/artists-create-example/response-fields.adoc[]

==== Example response
include::{snippets}/artists-create-example/http-response.adoc[]
The include directives reference the generated snippets that get included in the final HTML5 doc.

Step 3d: Generate API doc
Now, let's run the asciidoctor Gradle task that was added in Step 1, as shown below:
./gradlew asciidoctor //runs in test env
./gradlew -Dgrails.env=development asciidoctor //runs in dev env

This task runs all integration test specifications because we configured it to depend on the integrationTest task. Once it runs successfully with no failing tests, it converts our Asciidoctor API templates to an HTML5 doc, populating it with the generated snippets we referenced in artists.adoc.

Now let's enhance our specification feature method to document the request and response payload structure. Let's take the case of the /api/artists end-point and a GET request. There is no request payload for this request, so we will simply add the response payload specification as shown below:
void "Test and document show Artist request (GET request, show action) to end-point: /api/artists"() {
    given: "Pick an artist to show"
    Artist artist = Artist.first()

    and: "user logs in by a POST request to end-point: /api/login"
    String accessToken = authenticateUser('me', 'password')

    and: "documentation specification for showing an Artist"
    RequestSpecification requestSpecification = RestAssured.given(this.documentationSpec)
        .accept(MediaType.APPLICATION_JSON_VALUE)
        .filter(
            RestAssuredRestDocumentation.document(
                'artists-retrieve-specific-example',
                PayloadDocumentation.responseFields(
                    PayloadDocumentation.fieldWithPath('id').type(JsonFieldType.STRING).description('Artist id'),
                    PayloadDocumentation.fieldWithPath('firstName').type(JsonFieldType.STRING).description('Artist first name'),
                    PayloadDocumentation.fieldWithPath('lastName').type(JsonFieldType.STRING).description('Artist last name'),
                    PayloadDocumentation.fieldWithPath('dateCreated').type(JsonFieldType.STRING).description("Date Created (format:MM/dd/yyyy)"),
                    PayloadDocumentation.fieldWithPath('lastUpdated').type(JsonFieldType.STRING).description("Last Updated Date (format:MM/dd/yyyy)")
                )
            )
        )

    when: "GET request is sent"
    def response = requestSpecification
        .header("X-Auth-Token", "${accessToken}")
        .when()
        .port(this.port)
        .get("${ARTISTS_ENDPOINT}/${artist.id}")
    def responseJson = new JsonSlurper().parseText(response.body().asString())

    then: "The response is correct"
    response.then()
        .assertThat()
        .statusCode(HttpStatus.OK.value())

    and: "response contains the id of Artist asked for"
    responseJson.id
}

Similarly, we can write a test spec to test and document the POST method (creating an Artist) as shown below. Remember, I have secured this method to role ROLE_ADMIN. So, it requires the admin to be authenticated first to get a security token and then pass that token in subsequent secured requests like POST. The following is the complete test specification with a helper method added to authenticate the user:

/**
 * Helper method, authenticates the given user and returns the security token.
 *
 * @param username the user id
 * @param password the password
 * @return security token once successfully authenticated
 */
protected String authenticateUser(String username, String password) {
    String authResponse = RestAssured.given()
        .accept(MediaType.APPLICATION_JSON_VALUE)
        .contentType(MediaType.APPLICATION_JSON_VALUE)
        .body(""" {"username" : "$username", "password" : "$password"} """)
        .when()
        .port(this.port)
        .post(LOGIN_ENDPOINT)
        .body()
        .asString()

    return new JsonSlurper().parseText(authResponse).'access_token'
}

void "Test and document create Artist request (POST request, save action) to end-point: /api/artists"() {
    given: "current number of Artists"
    int nArtists = Artist.count()

    and: "admin logs in by a POST request to end-point: /api/login"
    String accessToken = authenticateUser('admin', 'admin')

    and: "documentation specification for creating an Artist"
    RequestSpecification requestSpecification = RestAssured.given(this.documentationSpec)
        .accept(MediaType.APPLICATION_JSON_VALUE)
        .contentType(MediaType.APPLICATION_JSON_VALUE)
        .filter(
            RestAssuredRestDocumentation.document(
                'artists-create-example',
                PayloadDocumentation.requestFields(
                    PayloadDocumentation.fieldWithPath('firstName').description('Artist first name'),
                    PayloadDocumentation.fieldWithPath('lastName').description('Artist last name')
                ),
                PayloadDocumentation.responseFields(
                    PayloadDocumentation.fieldWithPath('id').type(JsonFieldType.STRING).description('Artist id'),
                    PayloadDocumentation.fieldWithPath('firstName').type(JsonFieldType.STRING).description('Artist first name'),
                    PayloadDocumentation.fieldWithPath('lastName').type(JsonFieldType.STRING).description('Artist last name'),
                    PayloadDocumentation.fieldWithPath('dateCreated').type(JsonFieldType.STRING).description("Date Created (format:MM/dd/yyyy)"),
                    PayloadDocumentation.fieldWithPath('lastUpdated').type(JsonFieldType.STRING).description("Last Updated Date (format:MM/dd/yyyy)")
                )
            )
        )

    when: "POST request is sent with valid data"
    def response = requestSpecification
        .header("X-Auth-Token", "${accessToken}")
        .body("""{ "firstName" : "Bhuvan", "lastName" : "Pottepalem" }""")
        .when()
        .port(this.port)
        .post(ARTISTS_ENDPOINT)
    def responseJson = new JsonSlurper().parseText(response.body().asString())

    then: "The response is correct"
    response.then()
        .assertThat()
        .statusCode(HttpStatus.CREATED.value())

    and: "response contains the id of Artist created"
    responseJson.id

    and: "Number of Artists in the system goes up by one"
    Artist.count() == nArtists + 1
}

Now, simply run
./gradlew asciidoctor

We will have API docs generated under build/asciidoc dir. Open api-guide.html in a browser to see how nicely the generated API doc looks.

TIP: The beauty of the Spring REST Docs framework is that it compares the actual request/response fields with the PayloadDocumentation field descriptions and fails the test if any fields are missed or mismatched. This ensures that the API documentation stays up-to-date with the implementation.

Step 4: Publish API
Now, we have fully integrated REST Assured and Spring REST Docs into the integrationTest phase with an added asciidoctor Gradle task. The result is an up-to-date API document generated for our RESTful service. The API document is the source of reference for clients using this service, so it needs to be made available. One way to achieve this is to bundle the generated HTML5 API docs with the application's deployable war or executable jar and have its own end-point to serve it.

Spring Boot (the framework that underpins Grails 3) can be leveraged to achieve this. By default, Boot serves static content placed under /static or /public on the class path or in the root of the application context. Here is the link for reference: Spring Boot Static Content.
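For the executable-jar case (not covered in this post), a similar hook on Gradle's standard processResources task might look like the sketch below. This is an assumption along the same lines as the war customization that follows, not something verified against this project:

```groovy
// sketch: copy generated API docs into the jar's static-content location
processResources {
    dependsOn asciidoctor
    from ("${asciidoctor.outputDir}") {
        into 'public/docs'   // Spring Boot serves classpath:/public as static content
    }
}
```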

Step 4a: Bundle API documentation into deployable artifact
We will enhance our build script (restdocs.gradle) and customize the war task that comes with the Gradle War plugin a little to achieve this. Below is the code snippet, which is self-explanatory:
/*
 * Bundles generated API docs into the war file.
 * Spring Boot serves static content under /public or /static or /resources or /META-INF/resources.
 * Hooks into the war task, adds an asciidoctor task dependency, and copies generated REST docs
 * appropriately for bundling into the war file.
 */
def publicDocsDir = 'WEB-INF/classes/public/docs'
war {
    dependsOn asciidoctor
    from ("${asciidoctor.outputDir}") {
        into publicDocsDir
    }
}

We basically made the war task depend on the asciidoctor task and added a step to copy the generated HTML5 API docs into the WEB-INF/classes/public/docs dir of the generated war file.

Now, run grails war to generate deployable war artifact:
grails war

You can explode the generated war (giri-api-0.1.war) and see that the generated API docs are bundled into it; e.g. jar tvf build/libs/giri-api-0.1.war | grep html will list the following:
 59738 Mon Sep 04 07:16:34 EDT 2017 WEB-INF/classes/public/docs/api-guide.html
 47974 Mon Sep 04 07:16:34 EDT 2017 WEB-INF/classes/public/docs/artists.html

Step 4b: Make API documentation available from it's own end-point
Deploy the generated war file onto a locally running Tomcat and point your browser at: http://localhost:8080/giri-api-0.1/static/docs/api-guide.html

This will result in an Access Denied error. We need to open up security to serve the API docs.

Let's change application.groovy and add /static/docs/** to both grails.plugin.springsecurity.controllerAnnotations.staticRules and the filter-chain map as shown below:
grails.plugin.springsecurity.controllerAnnotations.staticRules = [
    ...
    [pattern: '/static/docs/**', access: ['permitAll']]
]

def filterChainChainMaps = [
    ...
    [pattern: '/static/docs/**', filters: statelessFilters],
    ...
]

Create a war file, undeploy the previously deployed war, and deploy the latest one.

Now, http://localhost:8080/giri-api-0.1/static/docs/api-guide.html (API docs) should be served and displayed by the app.

The test specification can easily be enhanced along these lines to test and document the rest of the service methods available for the /api/artists end-point: show, update and delete, through the HTTP methods GET (a specific resource by id), PUT and DELETE respectively.
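For instance, a delete feature method might look like the following sketch. The snippet identifier artists-delete-example and the expected NO_CONTENT status are my assumptions, not taken from the project source:

```groovy
void "Test and document delete Artist request (DELETE request, delete action) to end-point: /api/artists"() {
    given: "an existing Artist and an authenticated admin"
    Artist artist = Artist.first()
    String accessToken = authenticateUser('admin', 'admin')

    and: "documentation specification for deleting an Artist"
    RequestSpecification requestSpecification = RestAssured.given(this.documentationSpec)
        .filter(RestAssuredRestDocumentation.document('artists-delete-example'))

    when: "DELETE request is sent"
    def response = requestSpecification
        .header("X-Auth-Token", "${accessToken}")
        .when()
        .port(this.port)
        .delete("${ARTISTS_ENDPOINT}/${artist.id}")

    then: "the Artist is gone"
    response.then()
        .assertThat()
        .statusCode(HttpStatus.NO_CONTENT.value())  // assumed status for a delete action
}
```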

The complete source code is hosted on GitHub at https://github.com/gpottepalem/giri-api for reference.

References

Sunday, November 20, 2016

Upgrading Grails-2 application to Grails-3: Spring Security Core Plugin differences . . .

I recently upgraded one of our applications from Grails 2.2.1 with Spring Security core plugin 1.2.7.3 on Java 1.6 to Grails 3.2.1 with Spring Security core plugin 3.1.1 on Java 1.8. By following the recommended path, detailed well enough in the Grails 3 documentation, I got the following done before reaching the point of successfully running the application:
  • Upgraded one of our in-house plugins: ZipCityState
  • Reorganized Grails artifacts and other files as per Grails-3 app directory structure
  • Rewrote build and other configurations
  • Fixed several code compilation errors and issues resulting from changed package names of several Grails framework classes and some classes that were deprecated and removed
  • Upgraded static resources like images, javascript and stylesheets from the resources plugin to the asset-pipeline plugin by re-organizing those files and creating appropriate asset-pipeline directives to mimic the resources plugin's modules defined in AppResources.groovy
Once all the above were done, I had to make the following security-related changes for the application to successfully run, display and log in:

Static Rules

Static rules are now a List of Maps, not just a Map. I covered this in my previous post; check it out.
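For reference, a sketch of the shape change (the /api/** pattern and role here are illustrative, not from the actual application):

```groovy
// Grails-2 plugin: a Map of URL pattern -> access rules
grails.plugin.springsecurity.controllerAnnotations.staticRules = [
    '/api/**': ['ROLE_ADMIN']
]

// Grails-3 plugin: a List of Maps with pattern/access keys
grails.plugin.springsecurity.controllerAnnotations.staticRules = [
    [pattern: '/api/**', access: ['ROLE_ADMIN']]
]
```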

Authentication

Change the username and password form fields in the login page (auth.gsp) from j_username and j_password to username and password.

If you have used UsernamePasswordAuthenticationFilter.SPRING_SECURITY_LAST_USERNAME_KEY somewhere in your code, you need to change that to SpringSecurityUtils.SPRING_SECURITY_LAST_USERNAME_KEY

If you have any pre-authentication checks written by extending DefaultPreAuthenticationChecks, note that the Hibernate session does not seem to be created and attached to the current thread at that point.

If you run into an exception like the following, you may need to use either the withTransaction or withSession method on the domain object to get past it.

org.springframework.dao.DataAccessResourceFailureException: Could not obtain current Hibernate Session; nested exception is org.hibernate.HibernateException: No Session found for current thread.
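A sketch of what the withTransaction approach might look like; the User domain class, the check body, and the import for DefaultPreAuthenticationChecks (whatever your Grails-2 code already extends) are placeholders, not from the actual application:

```groovy
import org.springframework.security.core.userdetails.UserDetails

// assumes DefaultPreAuthenticationChecks is on the classpath, as in the Grails-2 code
class MyPreAuthenticationChecks extends DefaultPreAuthenticationChecks {
    @Override
    void check(UserDetails userDetails) {
        // wrap domain access so a Hibernate session is bound to the current thread
        User.withTransaction {
            User user = User.findByUsername(userDetails.username)
            // ... custom checks against the domain object ...
        }
        super.check(userDetails)
    }
}
```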

Password encryption algorithm differences

The application has an admin account created in the database only once (with an exists check) from Bootstrap. Login failed for the admin user created by the Grails 2.2.1 app after upgrading to Grails 3.2.1, with the following exception:

ERROR org.apache.catalina.core.ContainerBase.[Tomcat].[localhost].[/].[grailsDispatcherServlet] -
Servlet.service() for servlet [grailsDispatcherServlet] in context with path [] threw exception
[Filter execution threw an exception] with root cause
java.lang.AssertionError: Salt value must be null when used with crypto module PasswordEncoder. Expression: salt. Values: salt = admin
    at org.codehaus.groovy.runtime.InvokerHelper.assertFailed(InvokerHelper.java:404)
    at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.assertFailed(ScriptBytecodeAdapter.java:650)
    at grails.plugin.springsecurity.authentication.encoding.BCryptPasswordEncoder.checkSalt(BCryptPasswordEncoder.groovy:49)

The error was a bit puzzling and made me comment out the following Spring Security core plugin configuration property set in application.groovy:

//grails.plugin.springsecurity.dao.reflectionSaltSourceProperty = 'username'

Commenting out that property revealed the issue with the following error:
WARN org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder - Encoded password does not look like BCrypt

After quickly reading through the documents of both the Grails-2 Spring Security Core plugin and the Grails-3 Spring Security Core plugin, there was a special mention of the bcrypt algorithm in the version-3 documentation. Also, it was stated up front in the Configuration Settings section that the plugin's default security settings are maintained in the DefaultSecurityConfig.groovy file. I checked both plugin versions and found the following differences:

Grails-3 plugin
password {
    algorithm = 'bcrypt'
    encodeHashAsBase64 = false
    bcrypt {
        logrounds = 10
    }
    hash {
        iterations = 10000
    }
}

Grails-2 plugin
password.algorithm = 'SHA-256'
password.encodeHashAsBase64 = false
password.bcrypt.logrounds = 10

The default algorithm changed from 'SHA-256' to 'bcrypt', and the hash.iterations property is set to 10000 in the Grails-3 plugin but is not set explicitly in the Grails-2 plugin. I had to add algorithm and hash.iterations explicitly to match the Grails-2 plugin and retain the reflectionSaltSourceProperty in application.groovy. The following are the changes:

grails.plugin.springsecurity.password.algorithm = 'SHA-256'
grails.plugin.springsecurity.password.hash.iterations = 1
grails.plugin.springsecurity.dao.reflectionSaltSourceProperty = 'username'
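To make the salt-and-digest scheme concrete, here is an illustrative Groovy sketch (a simplification, not plugin-accurate code) of salted SHA-256 hashing in the "password{salt}" merge format that Spring Security's legacy MessageDigestPasswordEncoder uses. With reflectionSaltSourceProperty = 'username', the salt is the user's username, so the same password hashes differently for different users:

```groovy
import java.security.MessageDigest

// Illustrative only: salted SHA-256 along the lines of Spring Security's legacy
// MessageDigestPasswordEncoder, which digests the string "password{salt}".
// Iteration handling is simplified away here (iterations = 1, as configured above).
def encode(String rawPassword, String salt) {
    MessageDigest digest = MessageDigest.getInstance('SHA-256')
    digest.digest("${rawPassword}{${salt}}".getBytes('UTF-8')).encodeHex().toString()
}

// same password, different usernames (salts) -> different hashes
assert encode('secret', 'admin') != encode('secret', 'bob')
// deterministic for the same inputs
assert encode('secret', 'admin') == encode('secret', 'admin')
```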

Summary

With static rule configuration changes, password encryption properties changes and code changes to auth.gsp and some security related classes, I was able to get the application successfully migrated from Grails 2.2.1 to Grails 3.2.1 along with upgraded Spring Security core plugin.

References

Grails 3.2.1 documentation
Grails Spring Security 2 documentation
Grails Spring Security 3 documentation