Wednesday, December 08, 2021

Maven Dependencies Fix - Need to support both Log4j (ver 1) & Log4j2 (ver 2) but exclude Log4j (ver 1) . . .

Software development never gets simpler. Even logging in modern Java applications is not simple. There are several logging frameworks: java.util.logging, Log4j, Log4j 2, SLF4J, Logback, etc. A typical Java application these days brings in one, more, or even all of these as dependencies. It's often confusing which one is better, which one to use, which one is actually in action, which one to configure, and how to configure it. It gets worse when you need to exclude one.

I was recently working on a task to fix security vulnerabilities detected and reported by a commercial SaaS tool that scans a Java project's codebase and its dependent libraries. The tool scans and generates a report listing the vulnerabilities detected, categorizing each as Critical/High/Medium/Low. Obviously, when there is an unpatched dependent library marked Critical, it draws superiors' attention and brings in the worrying word "hacking" - a scary or not-so-scary word in software engineering, depending ;).

This post is NOT about the messy path that Java logging has been on right from the beginning. It's about dealing with the need to support both Log4j (version 1) and Log4j 2 (version 2) in a Java project while excluding Log4j 1 from the dependency list, in order to fix the security vulnerability reported against Log4j 1 in the National Vulnerability Database.

To deal with this issue, all you need to do is exclude log4j from the dependency that transitively pulls it in, and add explicit dependencies on Log4j 2 and the Log4j 1-to-2 bridge.

The following is a snippet of the Maven pom.xml.
...
<properties>
  <log4j.version>2.15.0</log4j.version>
</properties>
<dependencies>
  ...
  <dependency>
    <groupId>com.some.lib</groupId>
    <artifactId>somelib-need-log4j</artifactId>
    <exclusions>
      <!-- Fix security vulnerability reported with log4j-1.2.17, the last discontinued version of Log4j 1.
           Exclude Log4j 1, depend on the Log4j 1-to-2 bridge, and bring in Log4j 2 explicitly -->
      <exclusion>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <!-- Fix security vulnerability reported with log4j-1.2.17:
       Log4j 1-to-2 bridge to forward Log4j 1 calls to Log4j 2 -->
  <!-- log4j 1 to 2 bridge -->
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-1.2-api</artifactId>
    <version>${log4j.version}</version>
  </dependency>
  <!-- log4j 2 -->
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>${log4j.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>${log4j.version}</version>
  </dependency>
  ...
</dependencies>
...
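With the bridge in place, existing code written against the Log4j 1 API keeps compiling and running, and its calls are routed to Log4j 2 at runtime. A minimal sketch (the class name here is made up for illustration):

// Legacy code keeps importing the Log4j 1 API; at runtime these calls are
// served by log4j-1.2-api and forwarded to the Log4j 2 implementation.
import org.apache.log4j.Logger;

public class LegacyLoggingExample {
    private static final Logger LOGGER = Logger.getLogger(LegacyLoggingExample.class);

    public static void main(String[] args) {
        LOGGER.info("A Log4j 1 API call, handled by Log4j 2 via the bridge");
    }
}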

TIP

Leverage the Maven dependency:tree goal to see which dependency's transitive dependencies bring in the library that needs attention.

mvn dependency:tree
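If the dependency tree is large, the output can be narrowed down to the artifact in question with the plugin's includes filter; for example (the groupId:artifactId pattern shown is just an illustration):

mvn dependency:tree -Dincludes=log4j:log4j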



Thursday, October 21, 2021

Spring boot upgrade from 2.2.x to 2.5.x - Spring Cloud Sleuth Zipkin - log message format change . . .

I recently upgraded a Spring Boot micro-service application from Spring Boot 2.2.11 to 2.5.3, and its Java runtime from OpenJDK 15 to Amazon Corretto 16.

Upgrading Java was smooth. Some major and breaking changes that I came across on the Spring Boot side include:
  • Profile changes - documented well
  • Dependency changes - Obvious
  • Spring Cloud Sleuth Zipkin - Undocumented glitch
This post is about the Spring Cloud Sleuth Zipkin one - the undocumented removal of the exportable property.

Environment: Java 16, Spring Boot 2.5.3 on macOS Catalina 10.15.7

We have application logs collected, ingested, and indexed in Datadog by Logstash scraping the Mesos application's stdout. Soon after the upgrade, logs stopped showing up in Datadog. After some investigation, I figured out that the pattern parser failed to parse the log string: it was expecting a format like [service, trace, span, exportable], but the format after the upgrade was [service, trace, span]. Basically, the last exportable property, with a value of true or false, was missing.

The trace, span, and exportable properties are injected into log messages by Spring Cloud Sleuth and Zipkin for distributed tracing via the logging framework's MDC (Mapped Diagnostic Context). The Spring Cloud Sleuth documentation does not seem to be updated with these details. :(
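These values live in the MDC, which is why the Logback pattern below can reference them with %X{...}. The following is a minimal sketch of reading them from application code, assuming the Sleuth-populated key names traceId and spanId (the class itself is made up for illustration):

import org.slf4j.MDC;

public class TraceContextLogger {
    public static void logCurrentTraceContext() {
        // Sleuth stores the current trace context in the MDC under these keys;
        // they may be null when no span is active on the current thread.
        String traceId = MDC.get("traceId");
        String spanId = MDC.get("spanId");
        System.out.printf("traceId=%s, spanId=%s%n", traceId, spanId);
    }
}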

In order to work around this issue on the Datadog side, switching to a JSON log message format seemed like a better solution, as JSON holds up better than strict string pattern matchers. This required an explicit Logback definition in the resources/logback-spring.xml file (yes, XML is the only way) and a new dependency for JSON logging.

src/main/resources/logback-spring.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <include resource="org/springframework/boot/logging/logback/defaults.xml"/>
  <springProperty scope="context" name="serviceName" source="spring.zipkin.service.name"/>
  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
      <providers>
        <pattern>
          <pattern>
            {
              "timestamp": "%d{yyyy-MM-dd HH:mm:ss.SSS}",
              "severity": "%level",
              "service": "${serviceName:-}",
              "pid": "${PID:-}",
              "thread": "%thread",
              "class": "%logger{40}",
              "trace": "%X{traceId:-}",
              "span": "%X{spanId:-}",
              "parent": "%X{parentId:-}",
              "exportable": "%X{sampled:-}",
              "logmessage": "%message"
            }
          </pattern>
        </pattern>
        <!-- Additional support needed for logging stack trace in JSON message -->
        <!-- https://github.com/logfellow/logstash-logback-encoder -->
        <stackTrace>
          <throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
            <maxDepthPerThrowable>30</maxDepthPerThrowable>
            <maxLength>4096</maxLength>
            <shortenedClassNameLength>20</shortenedClassNameLength>
            <rootCauseFirst>true</rootCauseFirst>
          </throwableConverter>
        </stackTrace>
      </providers>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="console" />
  </root>
</configuration>

Maven build file, new dependency for JSON log message: pom.xml
<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <version>6.6</version>
</dependency>

With this, a sample JSON exception log message looks like:

{ "timestamp": "2021-12-22 12:26:26.552", "severity": "ERROR", "service": "my-service-api", "pid": "65623", "thread": "http-nio-8080-exec-6", "class": "c.g.s.e.MyServceImpl", "trace": "529f2d2d7a65dde4", "span": "529f2d2d7a65dde4", "parent": "", "exportable": "", "logmessage": "My Exception log mesage.", "stack_trace": "c.g.s.e.MyException: my exception message\n\tat c.g.s.s.MyServiceImpl.serviceMethodOne(MyServiceImpl.java:210)\n\tat c.g.s.s.MyServiceImpl.serviceMethoTwo(MyServiceImpl.java:280)\n" }



Wednesday, October 06, 2021

I still love Groovy . . .

I joyfully coded in Groovy for several years. I moved back to Java two years ago and have not written any production code in Groovy since, though I still write my own productive non-production utilities in Groovy whenever and wherever I can.

I am trying my best to apply the neat things I learned while working on Groovy projects and its ecosystem frameworks and tools like the Grails framework, the Gradle build tool, the Spock framework, etc. Within the limitations of the Java, Spring Boot, and Maven development world, I am trying hard to write less verbose and more readable code by leveraging new Java language features, including some of each version's preview features.

Java is evolving at a steady pace now. Better late than never ;). Still, it is far away from what Groovy was 10+ years ago, or from any of the current modern languages, in terms of developer productivity.

I was a bit happy to see some convenient factory methods making it into Java's collection classes, version after version, since Java 9. I have been happily using one such static factory method, of(), on List and Map without caring much about their internal implementations.

Environment: Java 16, Groovy 3.0.9 on macOS Catalina 10.15.7

Today, I was happily writing code using the Map.of() method and kept on adding entries; I had about a dozen static keys and values to add. IntelliJ was happy along with me. At some point IntelliJ suddenly turned angry (red) at me. The error was not clear - another classic hard-to-understand Java compilation error. I started to wonder what I did wrong and went back and forth over each entry I was adding. I quickly realized I was hitting a limitation. The Java language team chose the lucky number 10 for these convenient factory methods: the overloaded of() factory methods on Map only go up to ten key-value pairs.

I fell in love with it the very first time I started using it, as it's a little more concise and readable (not as concise and readable as Groovy, but close), but I quickly ran into limitations.
  • The Map.of() method, introduced in Java 9, allows creating an immutable map with up to 10 key-value pairs.
  • It returns an immutable map.
So, use it when you are OK with a small immutable Map of up to 10 entries.

The following snippet shows how close (though still a little more verbose) Java got to Groovy, compared to the old painful-finger-typing way of creating and initializing a Map with a fixed set of entries.

// Groovy
def groovyMap = [
    'a' : [1],
    'b' : [1,2],
    'c' : [1,2,3],
    'd' : [1,2,3,4],
    'e' : [1,2,3,4,5],
    'f' : [1,2,3,4,5,6],
    'g' : [1,2,3,4,5,6,7],
    'h' : [1,2,3,4,5,6,7,8],
    'i' : [1,2,3,4,5,6,7,8,9],
    'j' : [1,2,3,4,5,6,7,8,9,10],
    'k' : [1,2,3,4,5,6,7,8,9,10,11],
    'l' : [1,2,3,4,5,6,7,8,9,10,11,12],
]
println groovyMap

// Java
var javaMap = Map.of(
    "a" , List.of(1),
    "b" , List.of(1,2),
    "c" , List.of(1,2,3),
    "d" , List.of(1,2,3,4),
    "e" , List.of(1,2,3,4,5),
    "f" , List.of(1,2,3,4,5,6),
    "g" , List.of(1,2,3,4,5,6,7),
    "h" , List.of(1,2,3,4,5,6,7,8),
    "i" , List.of(1,2,3,4,5,6,7,8,9),
    "j" , List.of(1,2,3,4,5,6,7,8,9,10),
    "k" , List.of(1,2,3,4,5,6,7,8,9,10,11),
    "l" , List.of(1,2,3,4,5,6,7,8,9,10,11,12)
)
println javaMap

IntelliJ becomes unhappy, with the error:
Cannot resolve method 'of(java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>, java.lang.String, java.util.List<E>

The Java compiler stays unhappy too, with a compilation error:

no suitable method found for of(java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>,java.lang.String,java.util.List<java.lang.Integer>)
    method java.util.Map.<K,V>of() is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V,K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V,K,V,K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V,K,V,K,V,K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V,K,V,K,V,K,V,K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))
    method java.util.Map.<K,V>of(K,V,K,V,K,V,K,V,K,V,K,V,K,V,K,V,K,V,K,V) is not applicable (cannot infer type-variable(s) K,V (actual and formal argument lists differ in length))

When I executed the above code snippet in groovyconsole, Groovy at least gave me a slightly better message, pointing at the line (29: "k" , List.of(1,2,3,4,5,6,7,8,9,10,11),) that failed and making me think of exceeding some limitation.

groovy.lang.MissingMethodException: No signature of method: static java.util.Map.of() is applicable for argument types: (String, List12, String, List12, String, ListN, String, ListN, String...) values: [a, [1], b, [1, 2], c, [1, 2, 3], d, [1, 2, 3, 4], e, [1, 2, ...], ...] at ConsoleScript9.run(ConsoleScript9:29)

As a workaround, I had to go a bit more verbose with Map.ofEntries(); at least it's better than the old way of painful-finger-typing. ;)
import static java.util.Map.entry;

// Groovy
def groovyMap = [
    'a' : [1],
    'b' : [1,2],
    'c' : [1,2,3],
    'd' : [1,2,3,4],
    'e' : [1,2,3,4,5],
    'f' : [1,2,3,4,5,6],
    'g' : [1,2,3,4,5,6,7],
    'h' : [1,2,3,4,5,6,7,8],
    'i' : [1,2,3,4,5,6,7,8,9],
    'j' : [1,2,3,4,5,6,7,8,9,10],
    'k' : [1,2,3,4,5,6,7,8,9,10,11],
    'l' : [1,2,3,4,5,6,7,8,9,10,11,12],
]
println groovyMap

// Java
var javaMap = Map.ofEntries(
    entry("a" , List.of(1)),
    entry("b" , List.of(1,2)),
    entry("c" , List.of(1,2,3)),
    entry("d" , List.of(1,2,3,4)),
    entry("e" , List.of(1,2,3,4,5)),
    entry("f" , List.of(1,2,3,4,5,6)),
    entry("g" , List.of(1,2,3,4,5,6,7)),
    entry("h" , List.of(1,2,3,4,5,6,7,8)),
    entry("i" , List.of(1,2,3,4,5,6,7,8,9)),
    entry("j" , List.of(1,2,3,4,5,6,7,8,9,10)),
    entry("k" , List.of(1,2,3,4,5,6,7,8,9,10,11)),
    entry("l" , List.of(1,2,3,4,5,6,7,8,9,10,11,12))
)
println javaMap

NOTE: It's only Map.of() that effectively has this limitation. List.of() and Set.of() also have fixed-arity overloads up to ten elements, but they additionally provide a varargs overload, so they never hit the ceiling; Map.of() has no varargs variant, and Map.ofEntries() is its varargs equivalent.

Gotcha

  • The method Map.of() returns an immutable map, though the method signature says it simply returns a Map.
  • In other words, it is an unmodifiable map: keys and values cannot be added, removed, or updated.
  • When operations that modify the returned Map, like put(), replace(), or remove(), are performed, they result in an UnsupportedOperationException with a null exception message ;) (see the sketch below)
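A minimal sketch illustrating this gotcha:

import java.util.Map;

public class ImmutableMapGotcha {
    public static void main(String[] args) {
        Map<String, Integer> map = Map.of("a", 1, "b", 2);
        try {
            map.put("c", 3); // attempt to modify the immutable map
        } catch (UnsupportedOperationException e) {
            // prints "message: null" - the exception carries no message
            System.out.println("Caught UnsupportedOperationException, message: " + e.getMessage());
        }
    }
}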

Conclusion

I still love Groovy for its simple, less confusing, yet more expressive syntax.



Thursday, August 05, 2021

Maven - multi-module Java project code coverage . . .

In addition to developing, building and deploying modern Java applications is also a developer's concern. Build tools come with a promise to save developers' time, but oftentimes they suck developers' time instead. Maven is known for that ;)

Choosing the right build tool before you even start an application/project is standard practice these days. In the modern Java world, the two popular build system choices are Maven and Gradle. Though Gradle started as the "Next Generation Build Tool", with the Groovy programming language and a well-designed DSL for writing build scripts, and addressed many of Maven's pitfalls, Maven still rules the modern Java world with legacy XML (Extensible Markup Language), which is not a programming language and is only good for structured data. The modern "as code" principle, applied to everything these days, even infrastructure, still doesn't apply to Maven build scripts.

A multi-module project is inevitable if you build any application with modularity and reusability in mind. Maven poses a great many challenges in this use case. There are solutions available for every issue, but you end up spending too much time reading the poor documentation again and again while scratching your head, doing more of the same with the plugin documentation, and even more of the same in the form of questions and answers on Stack Overflow. Clearly, this is not the way, but unfortunately it is the way to find solutions these days.

This post is the result of a 3-day fight with Maven to get multi-module code coverage working in a Maven multi-module project. The following three Maven plugins are the main focus in this are(n)a: jacoco-maven-plugin, maven-surefire-plugin, and maven-failsafe-plugin.
Environment: Java 16, Spring Boot 2.5.3 on macOS Catalina 10.15.7

The Problem Scenario - code in one module, test-cases in another module

It is quite common in a multi-module project to have code in one module and some test-cases, if not all, in other module(s). For instance, a multi-module Maven project with a sharable domain module, a sharable services module, and an API micro-service application module (Spring Boot based) is the best example of this scenario.

In this scenario, for example, the domain module code gets its code coverage from unit test-cases, as both the source code and test-cases reside in the same module. The JaCoCo Maven code coverage plugin works quite well in this case. But it can be a bit hard to write integration test-cases for the services module, as they require the Spring application context and Spring Boot configuration. The API application module, on the other hand, will definitely have a set of integration test-cases, as it is a Spring Boot application with the Spring context and configuration available. This is the case that requires a better solution for generating code coverage reports: covering application code distributed across modules with test-cases distributed across modules.

The three key points in this scenario are:
  1. Test-cases in one module (the API application) covering code in other modules (services/domain), in addition to those modules' own code coverage.
  2. An individual, module-wise code coverage report for each module, with its own coverage from its own test-cases and its own coverage threshold check.
  3. An overall consolidated/aggregated, yet module-wise, code coverage report for the entire application code, and a code coverage threshold check for the overall code across all modules.

The Solution

The JaCoCo Maven plugin, from version 0.7.7 onwards, offers a report-aggregate goal. This is the goal that can be leveraged to get an aggregated code coverage report generated. However, getting this done is not straightforward.

The following is an example multi-module project structure: my-app is the root project, and my-app-api is a Spring Boot application that depends on the my-app-domain and my-app-services modules:
.
└── my-app
    ├── my-app-api
    │   └── pom.xml
    ├── my-app-domain
    │   └── pom.xml
    ├── my-app-services
    │   └── pom.xml
    ├── my-app-code-coverage
    │   └── pom.xml
    └── pom.xml

The root module my-app's pom.xml file looks something like what is shown below:
...
<modules>
  <module>my-app-api</module>
  <module>my-app-domain</module>
  <module>my-app-services</module>
  <module>my-app-code-coverage</module>
</modules>
...
<properties>
  <jacoco.plugin.version>0.8.7</jacoco.plugin.version>
  <surefire.plugin.version>2.22.2</surefire.plugin.version>
  <failsafe.plugin.version>2.22.2</failsafe.plugin.version>
</properties>
...
<build>
  <plugins>
    <!-- jacoco for code coverage -->
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>${jacoco.plugin.version}</version>
      <executions>
        <!-- jacoco agent for unit-tests code coverage -->
        <execution>
          <id>initialize-coverage-before-unit-test-execution</id>
          <goals>
            <goal>prepare-agent</goal>
          </goals>
        </execution>
        <!-- jacoco agent for integration-tests code coverage -->
        <execution>
          <id>initialize-coverage-before-integration-test-execution</id>
          <goals>
            <goal>prepare-agent</goal>
          </goals>
          <phase>pre-integration-test</phase>
          <configuration>
            <propertyName>integrationTestCoverageAgent</propertyName>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <!-- UNIT tests -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>${surefire.plugin.version}</version>
      <configuration>
        <excludes>
          <exclude>**/*IT.java</exclude>
        </excludes>
        <!-- NOTE: In case you need to pass special JVM arguments, for instance to enable Java language
             preview features, the following is how it MUST be done. The @{argLine} expression goes
             through late evaluation and points to the JaCoCo agent JVM argument, followed by any
             additional JVM arguments of your choice, each separated by a space. The expression
             @{argLine} retains the JaCoCo JVM agent argument. Without it, any additional JVM arguments
             you add are taken, but you lose JaCoCo's argument, causing the code coverage report files
             and hence the coverage report to be lost. -->
        <argLine>@{argLine} --enable-preview</argLine>
      </configuration>
    </plugin>
    <!-- INTEGRATION tests -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-failsafe-plugin</artifactId>
      <version>${failsafe.plugin.version}</version>
      <executions>
        <execution>
          <id>integration-tests</id>
          <goals>
            <goal>integration-test</goal>
            <goal>verify</goal>
          </goals>
          <configuration>
            <additionalClasspathElements>
              <additionalClasspathElement>${basedir}/target/classes</additionalClasspathElement>
            </additionalClasspathElements>
            <includes>
              <include>**/*IT.java</include>
            </includes>
            <excludes>
              <exclude>com.my.api.service.MyNotUsedServiceIT</exclude>
            </excludes>
            <!-- NOTE: When running as a Maven plugin, the JaCoCo agent configuration is prepared by
                 invoking the prepare-agent or prepare-agent-integration goals before the actual tests
                 are run. This sets a property (named argLine by default, overridden here as
                 integrationTestCoverageAgent) which points to the JaCoCo agent and is later passed as
                 a JVM argument to the test runner. -->
            <argLine>${integrationTestCoverageAgent}</argLine>
            <!-- NOTE: In case you need to pass special JVM arguments, for instance to enable Java
                 language preview features, the following is how it MUST be done. The @{argLine}
                 expression goes through late evaluation and retains the JaCoCo JVM agent argument,
                 followed by any additional JVM arguments, each separated by a space. Without it, you
                 lose JaCoCo's argument and hence the coverage report. -->
            <argLine>@{argLine} --enable-preview</argLine>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
...

Notable points from the above build file around code coverage are:
  • The JaCoCo plugin configuration for code coverage, with two execution configurations for unit and integration test coverage. Make a note of the configuration <propertyName>integrationTestCoverageAgent</propertyName>; it can be any string, but the same name must be passed as the argument line (argLine) in the failsafe configuration. Also, make sure you retain the JaCoCo JVM agent argument, pointed to by the JaCoCo-populated argLine property, via the late-evaluation expression @{argLine} whenever you pass additional JVM arguments to the test executions.
  • The surefire plugin configuration for unit tests.
  • The failsafe plugin configuration for integration tests.
The my-app-domain module's pom.xml file looks something like what is shown below:
...
<build>
  <plugins>
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>${jacoco.plugin.version}</version>
      <executions>
        <execution>
          <id>generate-code-coverage-report</id>
          <phase>test</phase>
          <goals>
            <goal>report</goal>
          </goals>
        </execution>
        <execution>
          <id>perform-code-coverage-threshold-check</id>
          <goals>
            <goal>check</goal>
          </goals>
          <configuration>
            <!-- Set rule to fail the build if code coverage is below a certain threshold -->
            <rules>
              <rule implementation="org.jacoco.maven.RuleConfiguration">
                <element>BUNDLE</element>
                <limits>
                  <limit implementation="org.jacoco.report.check.Limit">
                    <counter>INSTRUCTION</counter>
                    <value>COVEREDRATIO</value>
                    <minimum>0.60</minimum>
                  </limit>
                </limits>
              </rule>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ...
  </plugins>
  ...
</build>
...

The my-app-api module's pom.xml file, shown below, is very similar to the my-app-domain module's:
...
<build>
  <plugins>
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>${jacoco.plugin.version}</version>
      <executions>
        <execution>
          <id>generate-code-coverage-report</id>
          <phase>test</phase>
          <goals>
            <goal>report</goal>
          </goals>
        </execution>
        <execution>
          <id>perform-code-coverage-threshold-check</id>
          <goals>
            <goal>check</goal>
          </goals>
          <configuration>
            <!-- Set rule to fail the build if code coverage is below a certain threshold -->
            <rules>
              <rule implementation="org.jacoco.maven.RuleConfiguration">
                <element>BUNDLE</element>
                <limits>
                  <limit implementation="org.jacoco.report.check.Limit">
                    <counter>INSTRUCTION</counter>
                    <value>COVEREDRATIO</value>
                    <minimum>0.80</minimum>
                  </limit>
                </limits>
              </rule>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ...
  </plugins>
  ...
</build>
...

Notable points from the above two modules' build files around code coverage are:
  • The JaCoCo plugin's additional configuration for code coverage, with execution configurations for the code coverage report (goal: report) and the code coverage threshold check (goal: check) with a threshold ratio (0.60 for domain and 0.80 for api).
  • No surefire or failsafe configurations are needed, as they are inherited by the sub-modules from the root/main module's build configuration.
The new my-app-code-coverage module's pom.xml file for consolidated code coverage is shown below:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <parent>
    <groupId>com.giri</groupId>
    <artifactId>my-app</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </parent>
  <modelVersion>4.0.0</modelVersion>
  <artifactId>my-app-code-coverage</artifactId>
  <packaging>pom</packaging>
  <name>My Api App Service multi-module code coverage</name>
  <description>Module for My Api App multi-module code coverage across all modules</description>

  <properties>
    <code.coverage.project.dir>${basedir}/../</code.coverage.project.dir>
    <code.coverage.overall.data.dir>${basedir}/target/</code.coverage.overall.data.dir>
    <maven-resources-plugin.version>3.2.0</maven-resources-plugin.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>com.giri</groupId>
      <artifactId>my-app-domain</artifactId>
      <version>${project.version}</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>com.giri</groupId>
      <artifactId>my-app-services</artifactId>
      <version>${project.version}</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>com.giri</groupId>
      <artifactId>my-app-api</artifactId>
      <version>${project.version}</version>
      <scope>compile</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- required by jacoco for the goal: check to work -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-resources-plugin</artifactId>
        <version>${maven-resources-plugin.version}</version>
        <executions>
          <execution>
            <id>copy-class-files</id>
            <phase>generate-resources</phase>
            <goals>
              <goal>copy-resources</goal>
            </goals>
            <configuration>
              <overwrite>false</overwrite>
              <resources>
                <resource>
                  <directory>../my-app-domain/target/classes</directory>
                </resource>
                <resource>
                  <directory>../my-app-services/target/classes</directory>
                </resource>
                <resource>
                  <directory>../my-app-api/target/classes</directory>
                </resource>
              </resources>
              <outputDirectory>${project.build.directory}/classes</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.jacoco</groupId>
        <artifactId>jacoco-maven-plugin</artifactId>
        <version>${jacoco.plugin.version}</version>
        <executions>
          <execution>
            <id>report-aggregate</id>
            <phase>verify</phase>
            <goals>
              <goal>report-aggregate</goal>
            </goals>
          </execution>
          <execution>
            <id>merge-results-data</id>
            <phase>verify</phase>
            <goals>
              <goal>merge</goal>
            </goals>
            <configuration>
              <fileSets>
                <fileSet>
                  <directory>${code.coverage.project.dir}</directory>
                  <includes>
                    <include>**/target/jacoco.exec</include>
                  </includes>
                </fileSet>
              </fileSets>
              <destFile>${code.coverage.overall.data.dir}/aggregate.exec</destFile>
            </configuration>
          </execution>
          <execution>
            <id>perform-code-coverage-threshold-check</id>
            <phase>verify</phase>
            <goals>
              <goal>check</goal>
            </goals>
            <configuration>
              <dataFile>${code.coverage.overall.data.dir}/aggregate.exec</dataFile>
              <rules>
                <rule>
                  <element>BUNDLE</element>
                  <limits>
                    <limit>
                      <counter>INSTRUCTION</counter>
                      <value>COVEREDRATIO</value>
                      <minimum>0.90</minimum>
                    </limit>
                  </limits>
                </rule>
              </rules>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

Notable points from the above build file around code coverage are:
  • List all modules that the code lives in as dependencies for this module to get the consolidated report generated.
  • Collect all compiled class files from all modules under this module's build directory. This requires the maven-resources-plugin and is needed for the JaCoCo goal: check to perform the coverage threshold check.
  • JaCoCo plugin configuration with three executions with goals: report-aggregate, merge, and check for code coverage threshold check.
  • The JaCoCo execution goal: report-aggregate is the one that gets aggregate reports generated.
  • The JaCoCo goal: merge is needed to merge all modules' jacoco.exec files into one file.
  • And, of course, the JaCoCo goal: check is needed for the overall code coverage threshold check, with a threshold ratio (0.90) that is different from any of the individual modules' threshold ratios.
With the above Maven build files, from the root project just run mvn clean install or mvn clean verify. It cleans, compiles the code, runs all test-cases, and generates code coverage reports in each module's build directory: target/site/jacoco. Each module's coverage report shows the code coverage attained from the test-cases existing within that module. This number could be different (less than or equal) for the same module in the overall code coverage report. It also generates an overall aggregated code coverage report in the newly added module's build directory: my-app-code-coverage/target/site/jacoco-aggregate. The overall report shows module-wise code coverage with the overall coverage threshold checked.

TIPS

  • Have plugin configurations in the main/root project pom.xml file so that they are available to sub-modules. Only override or add the things that are necessary for a sub-module - for example, the report goal configuration for JaCoCo in each sub-module and the report-aggregate goal configuration for the overall code-coverage module.
  • The main/root project can define common code coverage configurations & executions for the JaCoCo, surefire, and failsafe plugins for all modules, with the coverage threshold check value set to 0.0 in the root and each sub-module overriding that property with its specific value (see the sketch right after these tips). That way the build scripts can follow the DRY principle.
  • It is good to have each module's report generated in its own build target to see that module's code coverage by its own test-cases, even though the special overall code coverage module generates the report for the overall coverage across all modules.
  • If there is any module that contains code but no test-cases, due to limitations like not having the needed Spring application context and Boot configuration, then that module's build file (pom.xml) doesn't need the additional JaCoCo configuration for goal: report.
  • Another helpful Maven goal is help:effective-pom. By running mvn help:effective-pom for any sub-module, you can see and verify the effective POM (XML) with all parent-inherited plugins and properties resolved, so that you don't need to do any guesswork. This is very useful in multi-module builds for investigating issues.
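A minimal sketch of the DRY threshold setup mentioned in the tips above; the property name jacoco.coverage.minimum is made up for illustration:

<!-- root pom.xml: define a default threshold that effectively disables the check -->
<properties>
  <jacoco.coverage.minimum>0.0</jacoco.coverage.minimum>
</properties>
...
<!-- root pom.xml: reference the property in the common JaCoCo check rule -->
<limit implementation="org.jacoco.report.check.Limit">
  <counter>INSTRUCTION</counter>
  <value>COVEREDRATIO</value>
  <minimum>${jacoco.coverage.minimum}</minimum>
</limit>
...
<!-- my-app-api/pom.xml: override with the module-specific threshold -->
<properties>
  <jacoco.coverage.minimum>0.80</jacoco.coverage.minimum>
</properties>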

GOTCHAS

  • If the surefire or failsafe plugins do not run unit & integration test-cases and do not leave a clue even when run in debug mode with the mvn -X option, try adding the surefire-junit47 provider dependency, as described in the plugin documentation, to specify the test-framework provider (a hedged sketch follows).
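A minimal sketch of pinning the surefire provider; the version shown reuses the plugin version property from the root pom, which is an assumption - align it with whatever version your build actually uses:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>${surefire.plugin.version}</version>
  <dependencies>
    <!-- force the JUnit 4.7+ provider so tests are picked up -->
    <dependency>
      <groupId>org.apache.maven.surefire</groupId>
      <artifactId>surefire-junit47</artifactId>
      <version>${surefire.plugin.version}</version>
    </dependency>
  </dependencies>
</plugin>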
  • For passing additional JVM arguments like --enable-preview or anything else, make sure to use the late-evaluation expression @{argLine} so that the JaCoCo JVM agent argument is retained along with your additional arguments. Missing that expression silently loses code coverage, and it's hard to figure out why.
  • With Java 16, you might run into IllegalClassFormatException if your integration test-cases hit any code that uses reflection. The test-cases still pass and they get the correct code coverage. This exception in the build output can be treated as misleading noise and filtered out by excluding the classes involved in the reflection. For instance, an exception like java.lang.instrument.IllegalClassFormatException: Error while instrumenting com/giri/app/util/MyClassOneMethodAccess can be filtered by adding <excludes> to the JaCoCo plugin configuration as shown below:
<build>
  <plugins>
    <!-- jacoco for code coverage -->
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>${jacoco-maven-plugin.version}</version>
      <configuration>
        <!-- Filter the misleading exception noise caused by IllegalClassFormatException
             from JaCoCo instrumentation -->
        <excludes>
          <exclude>*MyClassOneMethodAccess*</exclude>
          <exclude>*MyClassTwoMethodAccess*</exclude>
          ...
        </excludes>
      </configuration>
      ...
  • If the coverage report's total coverage percentage comes out lower than the configured threshold when it shouldn't, this could be due to a silent failure in appending the integration test results to the JaCoCo binary coverage file jacoco.exec, which is used for generating the HTML code coverage reports for unit and integration tests combined. This file typically gets generated with results after running the unit tests and is then appended with the results of the integration tests. To fix this issue, the combined binary file can be split into separate files and then merged, as shown below. This also gives greater control over code coverage reporting.
<build>
  <plugins>
    <!-- jacoco for code coverage -->
    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <version>${jacoco-maven-plugin.version}</version>
      <configuration>
        <excludes>
          <exclude>*MyClassOneMethodAccess*</exclude>
          <exclude>*MyClassTwoMethodAccess*</exclude>
        </excludes>
      </configuration>
      <executions>
        <!-- jacoco unit test agent for code coverage -->
        <execution>
          <id>initialize-coverage-before-unit-test-execution</id>
          <goals>
            <goal>prepare-agent</goal>
          </goals>
          <configuration>
            <destFile>${project.build.directory}/jacoco-unit.exec</destFile>
          </configuration>
        </execution>
        <!-- jacoco integration test agent for code coverage -->
        <execution>
          <id>initialize-coverage-before-integration-test-execution</id>
          <goals>
            <goal>prepare-agent</goal>
          </goals>
          <phase>pre-integration-test</phase>
          <configuration>
            <propertyName>integrationTestCoverageAgent</propertyName>
            <destFile>${project.build.directory}/jacoco-integration.exec</destFile>
          </configuration>
        </execution>
        <execution>
          <id>generate-merged-code-coverage-report</id>
          <phase>post-integration-test</phase>
          <goals>
            <goal>merge</goal>
            <goal>report</goal>
          </goals>
          <configuration>
            <!-- merge config -->
            <destFile>${project.build.directory}/jacoco-merged.exec</destFile>
            <fileSets>
              <fileSet>
                <directory>${project.build.directory}</directory>
                <includes>
                  <include>*.exec</include>
                </includes>
              </fileSet>
            </fileSets>
            <!-- report config -->
            <dataFile>${project.build.directory}/jacoco-merged.exec</dataFile>
          </configuration>
        </execution>
        <!-- Threshold check -->
        <execution>
          <id>coverage-check</id>
          <goals>
            <goal>check</goal>
          </goals>
          <configuration>
            <dataFile>${project.build.directory}/jacoco-merged.exec</dataFile>
            <!-- Set rule to fail the build if code coverage is below a certain threshold -->
            <rules>
              <rule implementation="org.jacoco.maven.RuleConfiguration">
                <element>BUNDLE</element>
                <limits>
                  <limit implementation="org.jacoco.report.check.Limit">
                    <counter>INSTRUCTION</counter>
                    <value>COVEREDRATIO</value>
                    <minimum>${jacoco.percentage.instruction}</minimum>
                  </limit>
                </limits>
              </rule>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ...
  • Skipping unit/integration tests - both the surefire and failsafe plugins offer a pre-defined property, skipTests, which is false by default and, when set to true, skips both unit and integration tests. Unless otherwise needed, no special configuration is necessary to get a good hold on running and skipping tests. However, this is a bit tricky. From the project root/main module, run the following to control running the tests of the my-app-api application module.
Skip all tests:
  ./mvnw -pl my-app-api clean install -DskipTests
Run only unit tests (Skip integration tests): 
  ./mvnw -pl my-app-api surefire:test
Run specific unit test: 
  ./mvnw -pl my-app-api surefire:test -Dtest=MyUtilTest
Run specific set of unit tests, matching pattern:
  ./mvnw -pl my-app-api surefire:test -Dtest=MyU*
Run only integration tests (Skip unit tests):
  ./mvnw -pl my-app-api failsafe:integration-test
Run specific integration test:
  ./mvnw -pl my-app-api failsafe:integration-test -Dit.test=MyAppIT
Run specific set of integration tests, matching pattern:
  ./mvnw -pl my-app-api failsafe:integration-test -Dit.test=MyApp*


Summary

Maven eats up your time. You often get puzzled by the many things mixed up in XML files. It's always confusingly challenging to deal with XML as the specification for driving application builds.

"Making simple things super-complex" is what Software Engineering is all about. Of course, new concepts, languages, frameworks keep coming in attempts to make complex simple, but in reality only making complex more-complex. Anyways, have FUN with solving build issues/problems, and finding/inventing/re-inventing solutions in Maven & it's plugins.


Thursday, May 13, 2021

PostgreSQL - JSON - Gotchas . . .

These days, JSON is widely used for data interchange, mainly because it's lightweight, simple, and readable. Many programming languages have very good support for JSON, either built in or through libraries.

Databases have also added support for JSON in recent years, and PostgreSQL is no exception in this space. However, each database adds its own set of JSON functions and specific syntax for working with the JSON data it stores. There is no standard, and one has to become familiar with the database-specific syntax and functions to get JSON data in and out of the database. This post mainly focuses on gotchas when dealing with JSON in a PostgreSQL database.

PostgreSQL database Version:12.x

Gotcha-1: Updating JSON data

The PostgreSQL documentation lists all the JSON functions. However, updating JSON data has a gotcha: jsonb_set() only works if the JSON column already has JSON data. If the JSON column is NULL, jsonb_set() doesn't update anything. It's a bit frustrating, as it doesn't fail with an error either; it misleads you by returning the number of records updated.

Click on the DEMO to check it out.

Given below is the SQL from the above DEMO to try out:
-- check version
SELECT version();

-- create test table
CREATE TABLE test_json (
    id varchar(36) NOT NULL,
    profile json NULL,
    -- age: generated column, value derived from profile json
    age smallint GENERATED ALWAYS AS ((profile ->> 'age')::smallint) STORED,
    CONSTRAINT pk_giri_test PRIMARY KEY (id)
);
SELECT * FROM test_json;

-- insert a record
INSERT INTO test_json (id, profile)
VALUES('2eab2d99-167d-41b7-8227-d89ae45d3801', '{"fname":"Giri", "lname":"Pottepalem", "age":30}');
-- check data
SELECT * FROM test_json;

-- insert another record with no profile
INSERT INTO test_json (id) VALUES('2eab2d99-167d-41b7-8227-d89ae45d3802');
-- check data
SELECT * FROM test_json;

-- try to update profile using jsonb_set
UPDATE test_json
SET profile = jsonb_set(profile::jsonb, '{fname}', '"boo"'::jsonb)
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';
-- check data, notice that the previous update did not work though it returned 1 row affected
SELECT * FROM test_json;

-- let's update using a regular update
UPDATE test_json
SET profile = '{"fname":"boo", "lname":"pottepalem"}'::jsonb
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';
-- check data, notice that the previous update actually updated profile
SELECT * FROM test_json;

-- update one property using jsonb_set
UPDATE test_json
SET profile = jsonb_set(profile::jsonb, '{age}', '15'::jsonb)
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';
-- check data, notice that jsonb_set now updated profile
SELECT * FROM test_json;

-- update multiple properties using jsonb_set
UPDATE test_json
SET profile = jsonb_set(
                jsonb_set(
                  jsonb_set(
                    profile::jsonb, '{age}', '18'::jsonb
                  )::jsonb, '{fname}', '"Bhuvan"'::jsonb
                )::jsonb, '{lname}', '"Pottepalem"'::jsonb
              )
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';
-- check data, notice that jsonb_set now updated profile with multiple properties
SELECT * FROM test_json;

-- update json by concatenating with ||, properties that exist get updated, properties that don't exist get added
UPDATE test_json
SET profile = profile::jsonb || '{"fname" : "boo", "lname" : "potte", "address" : "dummy address"}'
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';
-- check data, notice that existing properties got updated and new properties got added
SELECT * FROM test_json;

-- remove address property from json
UPDATE test_json
SET profile = profile::jsonb - 'address'
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';
-- check data, notice that the address property is removed from the json
SELECT * FROM test_json;

-- update json by concatenating with ||, properties that exist get updated, properties that do not exist get added, integer age
UPDATE test_json
SET profile = profile::jsonb
              || '{"fname" : "Bhuvan"}'
              || '{"lname" : "Pottepalem"}'
              || '{"age" : 19}'
              || '{"address" : "dummy address"}'
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';
-- check data, notice that existing properties got updated and new properties got added, including generated column age
SELECT * FROM test_json;

-- append another column data to a JSON column
SELECT profile :: jsonb || jsonb_build_object('id', id) as profile_json_with_id_added
FROM test_json;
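One way around the NULL-column gotcha, sketched below against the same test_json table, is to coalesce the NULL column to an empty JSON object before applying jsonb_set():

-- jsonb_set() returns NULL when its target argument is NULL, so the UPDATE just writes NULL back;
-- coalescing to an empty jsonb object first makes the update take effect
UPDATE test_json
SET profile = jsonb_set(COALESCE(profile::jsonb, '{}'::jsonb), '{fname}', '"boo"'::jsonb)
WHERE id = '2eab2d99-167d-41b7-8227-d89ae45d3802';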



Wednesday, April 07, 2021

Maven Multi-module Gotchas . . .

Typically, in a Maven multi-module project you have a parent module/main project and multiple sub-modules, with each sub-module producing its own artifact: jar, war, etc. The main module must be of packaging type pom (<packaging>pom</packaging>) with the sub-modules listed.

For instance, the following is the main module/project's pom.xml in the project's root directory with its sub-modules listed:

...
<groupId>com.my</groupId>
<artifactId>my-service</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>pom</packaging>
...
<modules>
  <module>my-domain</module>
  <module>my-core</module>
  <module>my-service-api</module>
</modules>

Gotcha-1

Always use ${project.version} for module dependencies.

It is also typical that one sub-module depends on another. For instance, the core sub-module could depend on the domain sub-module. In this case, the core sub-module needs to add a dependency on the domain sub-module. When such module dependencies are specified in the respective module's pom.xml, never ever hard-code the dependency version; instead, reference ${project.version}.

...
<parent>
  <artifactId>my-service</artifactId>
  <groupId>com.my</groupId>
  <version>1.0.0-SNAPSHOT</version>
</parent>
<artifactId>my-core</artifactId>
<packaging>jar</packaging>
<name>My Core Name</name>
<description>My Core Description</description>
<dependencies>
  <dependency>
    <groupId>${project.groupId}</groupId>
    <artifactId>my-domain</artifactId>
    <version>${project.version}</version>
  </dependency>
  ...
</dependencies>

If, instead of ${project.version}, you hard-code the parent module/project version (in this case, 1.0.0-SNAPSHOT), then you might run into issues when compiling the core module. For instance, when a new domain class is added to the domain module and used in the core module, you might run into core module compilation errors saying the newly added class cannot be found.

This is because, if you have an artifact repository in which previous my-domain-1.0.0-SNAPSHOT-*.jar versions are available, Maven downloads the most recent one into its cache, which results in the newly added domain class not being found. This causes nasty compilation issues with no clues as to why.

When you build the domain module with mvn clean install, it builds the domain module, produces its new artifact (my-domain-1.0.0-SNAPSHOT.jar), and installs it in the local cache. But when you clean up the cache and build the core module, or even the main project/module (mvn clean install), the domain module doesn't get built and installed from its sources; instead it gets downloaded from your artifact repo, which results in an older version that doesn't have the newly added domain class.

If you use ${project.version} and build the core module or the main project module with mvn clean install, it always builds and installs the domain module, producing a new my-domain-1.0.0-SNAPSHOT.jar in the local cache.

Gotcha-2

Multiple application modules sharing the same test-case source code.

I recently ran into this situation. The task was to migrate a Spring Boot Java micro-service API application from MySQL to PostgreSQL, but to let the MySQL API app continue in place for some time, in parallel with a new api-pg app module that interfaces with the PostgreSQL database. With the addition of the new api-pg module, two API artifacts would come out of the Maven build: the existing api app and the new api-pg app.

First, I modularized the existing Spring Boot application - backed by a typical old-fashioned JDBC DAO layer with inline SQL statements for the MySQL database - into multiple modules like domain, core, api, etc., in order to facilitate code reuse between the two API applications with minimal application-specific code in each. Then I added a new api-pg module, which is a Spring Boot application by itself with minimal Java source code: the main Spring Boot application class, additional Java configuration, a data access layer (DAO impl), bootstrap configuration files, etc.

This posed a challenge: make the test-cases runnable during the Maven build for both applications while keeping the test source code in one module. Maven by default runs all test-cases during its test phase unless told to skip them with an additional flag like -DskipTests. As the test-case source code was chosen to be left in the api app module (MySQL based), Maven finds it in the test phase and runs it. Whereas for the new api-pg module, since the test-case source code is not there in that module, Maven wouldn't run anything.

So, it requires a bit of a hack: keep the test-case source code in one API module but make the tests available in both API modules during each module's Maven test phase. The module that has the source code obviously compiles and runs the tests during its test phase. It doesn't make sense to somehow copy the test-case source code into the other module; the new api-pg module at least needs the compiled test classes to be available in Maven's target directory to get them running during this app's test phase.

That idea - having the test-case source code in one module, getting it compiled and run for that module, but making the compiled test-case classes available to the other module - requires a bit of a hack. This is where the following two plugins, maven-jar-plugin and maven-dependency-plugin, come in handy to deal with this kind of situation:


The maven-jar-plugin can be leveraged in api module to create a jar file of all the compiled test-case classes from the api module.

The maven-dependency-plugin can be leveraged in the api-pg module to unpack that jar file into its target/test-classes directory so that the tests get executed as part of its build. This technique works pretty neatly.

To leverage the maven-jar-plugin, the Maven build file of the api application module (pom.xml) needs the following addition:

...
<parent>
  <artifactId>my-service-api</artifactId>
  <groupId>com.giri</groupId>
  <version>1.0.0-SNAPSHOT</version>
</parent>
...
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>3.2.0</version>
    <executions>
      <execution>
        <goals>
          <goal>test-jar</goal>
        </goals>
      </execution>
    </executions>
  </plugin>
  ...
</plugins>

With the above change, when the api app is built, it produces an additional jar file like: <artifactId>-<version>-tests.jar. For instance, if the artifactId of api module is my-api-service and version is 1.0.0-SNAPSHOT, then the build produces my-api-service-1.0.0-SNAPSHOT-tests.jar file.

Now, to leverage the maven-dependency-plugin, the Maven build file of the api-pg application module (pom.xml) needs the following addition:
 
...
<parent>
  <artifactId>my-service-api-pg</artifactId>
  <groupId>com.giri</groupId>
  <version>1.0.0-SNAPSHOT</version>
</parent>
...
<dependencies>
  <dependency>
    <groupId>com.giri</groupId>
    <artifactId>my-service-api</artifactId>
    <version>${project.version}</version>
    <classifier>tests</classifier>
    <type>test-jar</type>
    <scope>test</scope>
  </dependency>
</dependencies>
...
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
      <execution>
        <id>unpack</id>
        <phase>process-test-classes</phase>
        <goals>
          <goal>unpack</goal>
        </goals>
        <configuration>
          <artifactItems>
            <artifactItem>
              <groupId>com.giri</groupId>
              <artifactId>my-service-api</artifactId>
              <version>${project.version}</version>
              <type>test-jar</type>
              <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
            </artifactItem>
          </artifactItems>
        </configuration>
      </execution>
    </executions>
  </plugin>
  ...
</plugins>
...

Make sure that you also add a test-scoped dependency on the artifact (the tests jar) that gets produced, as shown above.

That's it. When you run the project build, Maven builds the api module, compiling and running all the tests. Then, when it goes on to build the api-pg module, it unpacks the compiled test classes into that module's target directory and runs all the tests for this module as well. This way, the test-case source resides in one module but gets run in both modules.

NOTE
You may need to make sure that in the main project pom.xml file you list modules in the order that you want them to be built. For instance:

...
<modules>
  <module>my-service-domain</module>
  <module>my-service-core</module>
  <module>my-service-api</module>
  <module>my-service-api-pg</module>
</modules>
...

TIP

  • In a multi-module project, whenever there are issues with dependencies, go to the local Maven cache for your groupId (~/.m2/repository/com/my), blow out all its sub-directories and files, and then run the Maven build from the root project. If a specific sub-module runs into issues, blow out all that sub-module's module dependencies in the Maven cache and build just that sub-module to see.

Friday, March 05, 2021

Spring Cloud Config - Gotchas . . .

Spring Cloud Config takes externalization of application property-resources or configuration-files (name-value pair property files: *.properties, and YML files: *.yml) one step further away from the application into an external and central location like a git repo or a file-system.

It comes with a server-side application that accesses property resources from the central location and makes them available to client applications through HTTP resource-based API end-points. This server can easily be embedded in a simple Spring Boot application. Once embedded, that becomes your Cloud Config server application to be integrated with all client applications. Client applications integrate with the server by setting spring.cloud.config properties like uri, username, password, etc.
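A minimal sketch of the client-side settings, assuming a config server running at localhost:8888 secured with basic auth (all values here are made up for illustration):

# client application's bootstrap.yml
spring:
  application:
    name: my-app-1              # the name the config server uses to look up this app's configuration
  cloud:
    config:
      uri: http://localhost:8888
      username: config-user     # only needed if the server is secured
      password: config-pass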

Environment: Java 15, Spring Boot 2.3.8.RELEASE on macOS Catalina 10.15.7

Gotchas

  • Client applications are identified by Cloud config server by their application names (spring.application.name property value of the respective client application)
  • With git repository hosting property resource files (configuration files like *.properties or *.yml), if you use application.yml or application.properties names for configuration files, these files become common configuration to all client applications that depend on Cloud config server for externalized config files.
  • One config server can serve multiple clients' (multiple applications) externalized configurations. To best leverage this feature, organize externalized config files by the application's name. For instance if there are two Spring Boot client applications (named: my-app-1, and my-app-2) and both are configured to use one Spring Cloud config server for their externalized configurations hosted in a git repo, better to have two separate config files like my-app-1.yml and my-app-2.yml for the respective application configurations in the git repo. One yml file can have configurations for all environments like local, dev, int, cert, prod etc with each environment mapping to respective spring profile and each profile configuration separated by yml directive ---
  • The HTTP service of Cloud Config Server exposes end-points in different forms with client-application-name, client-application-profile, and label as part of the end-point forms. Label is nothing but the git branch and is optional with master as implicit value.
  • With git repository set as the host for clients' externalized configuration files, if any specific changes to those files are made and committed in a branch other than master, the cloud config server does not serve that branch unless it is explicitly asked for by specifying the optional label part in the end-point URL. This is the case even if that specific branch is currently checked out.
  • When label (branch name) is not specified, it always uses the implicit label and checks out master branch and responds with the configuration taken from property resources available in the master branch.
  • If explicitly asked for a different branch by specifying the branch name as the label in the end-point URL, it checks out that specific branch and serves the property resources available in that branch for the given application and its profile.

    Local Testing

  • For local testing, if your git repo (.git) for clients' external configuration files is in a directory under your home, e.g. ~/dev/my-cloud-config, then set spring.cloud.config.server.git.uri to file:///Users/<yourMacUserId>/dev/my-cloud-config in the Cloud Config server Spring Boot application's bootstrap.yml/application.properties for the profile in question, let's say int.
  • Assuming your Cloud Config server is running locally on port 8888, requesting any of the URLs 1) http://localhost:8888/my-app-1/int, 2) http://localhost:8888/my-app-1-int.yml, or 3) http://localhost:8888/my-app-1-int.properties to check the my-app-1 application's int environment/profile configuration results in the master branch being checked out to serve my-app-1's configuration. If you are on a branch other than master, you will notice that you get switched to master once you make a request to the above URLs to verify the configuration. This is due to the implicit master branch checkout that the Cloud Config server does under the hood.

  • In your local git repo where the client applications' config files are hosted, if you are on a branch other than master and have local changes pending (not committed), the implicit master checkout fails, resulting in an exception like:
    org.springframework.cloud.config.server.environment.NoSuchRepositoryException: Cannot clone or checkout repository: file:///Users/<myMacUserId>/dev/my-cloud-config
  • If your configuration files for client applications are on a specific git branch other than master, and you want to test the Cloud Config server against that specific branch, then the /{application}/{profile}[/{label}] end-point URL (e.g. http://localhost:8888/my-app-1/int/my-branch) may work.

    Encrypted Content

  • If clients' configuration files have a password property like spring.datasource.password and the values are plain text, then the response shows the property with the original value. If the password uses an encrypted value starting with {cipher} followed by the encrypted password (e.g. `{cipher}my3ncryp!ed^assword`), then the response doesn't contain the spring.datasource.password property or its value. Instead, you will see a special property with a special value: "invalid.spring.datasource.password": "<n/a>". That means you have to use a special way to see those encrypted password properties. There is a separate section in the doc explaining this, with two special end-points (/encrypt, /decrypt) provided.

TIP

  • If your client application's property resource file changes are on a specific branch and you are unable to get that working, you will be better off changing the Cloud Config Server Spring Boot application's default implicit master branch to your specific branch by setting the property spring.cloud.config.server.git.default-label to your branch name (a sketch follows below). This works even if your branch name has a / in it (e.g. feature/my-branch).
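A minimal sketch of that server-side setting, assuming the branch name feature/my-branch (made up for illustration):

# Cloud Config server's bootstrap.yml / application.yml
spring:
  cloud:
    config:
      server:
        git:
          uri: file:///Users/<yourMacUserId>/dev/my-cloud-config
          default-label: feature/my-branch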
