Thursday, August 31, 2023

Spring Boot 2.6 to 2.7 - Profile related changes go through increased complexity levels . . .

Indirection is directly proportional to complexity. The more levels of indirection added in one direction, the more turns you have to take in the reverse direction to understand what is going on. The higher the indirection, the deeper the dive needed to find the details.

A simple feature like Spring Profiles, interwoven across several Spring frameworks, is not only difficult to understand but also painful to follow through multiple, disconnected sets of details spread over multiple documents. Pulling the necessary details together for one specific feature, across different versions of each framework, is even more painful.

Environment: Java 20, Spring Boot 2.6.3, Spring Boot 2.7.14, maven 3.9.3 on macOS Catalina 10.15.7

Spring Profiles is a fairly simple indirection to understand: configuration abstracted out and externalized, with several benefits. Spring, as a framework, evolved to support all possible ways this feature can be used. Cloud-related frameworks extended it further, increasing the levels of indirection and thus its complexity.

In a recent attempt to upgrade a Spring Boot application from 2.6.x to 2.7.x, the application quietly failed to get deployed in the Kubernetes (K8S) environment, with no ERRORS or WARNINGS. I noticed a log message about multiple active profiles and started digging into why there was more than one active profile. There was an interwoven behavioral change between these two Spring Boot minor versions around active profiles, with different frameworks working together on profile-related configurations coming in mainly from the bootstrap and application yaml files.

The scenario

I migrated a Spring Boot 2.6.3 application successfully to 2.7.14 and tested it locally with the local profile. The application also depends on Spring framework support for Vault and Consul. The default active profile (spring.profiles.active) is set to test in the application.yml file. When the application is run locally, an explicit active profile is passed (-Dspring-boot.run.profiles=local) to the maven goal spring-boot:run. So the local profile passed through the maven build option gets passed on as a Java option, takes precedence, and overrides the default active profile test. The default of test is exactly what is wanted during the maven test phase, which runs unit and integration tests with no explicit profile specified. Environment-specific configuration files like application-int.yml have spring.config.activate.on-profile set to the respective environment, so an environment-specific configuration (e.g. application-int.yml) is only considered when the active profile is set to that environment (e.g. int). All looked good before and after the upgrade.
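To make the moving pieces concrete, here is a minimal, hedged sketch of the setup described above (contents are trimmed to just the profile-related properties; values are illustrative, not the actual application files):

# src/main/resources/application.yml - default active profile
spring:
  profiles:
    active: test

# src/main/resources/application-int.yml - applied only when the int profile is active
spring:
  config:
    activate:
      on-profile: int

# run locally with the local profile
./mvnw spring-boot:run -Dspring-boot.run.profiles=local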

When the application gets built through the Concourse pipeline, after a successful build (including unit and integration test runs), it automatically gets deployed to both int and cert. The deployment script sets the Java option -Dspring.profiles.active=<env>, overriding the default active profile test with the specific environment, since a command-line option takes precedence over configuration files. So everything worked as expected for the local profile after the upgrade, and the deployment went out with the same expectations.
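Conceptually, the startup command on each environment ends up being the equivalent of the following (the jar name is illustrative, not the actual deployment script):

java -Dspring.profiles.active=int -jar my-application.jar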

Also, initially the application had just one set of yaml files (application) when Apache Mesos was the deployment platform. Later it got migrated to a K8S deployment with added Vault and Consul support, which forced a new set of yaml configuration files (bootstrap); Vault and Consul required certain properties to be configured through the bootstrap yaml set. So after the migration from Apache Mesos to K8S, the application ended up with two different sets of properties files, the bootstrap.yml set and the application.yml set, along with their environment-specific counterparts for local, test, dev, int, cert and prod. That made 2 sets of 7 yaml files. The first set: bootstrap, bootstrap-local, bootstrap-test, bootstrap-dev, bootstrap-int, bootstrap-cert and bootstrap-prod. The second set: application, application-local, application-test, application-dev, application-int, application-cert and application-prod.

The issue

The upgraded application (now on Spring Boot 2.7.14) failed to come up after it got deployed to the K8S cluster through the Concourse CI/CD pipeline. Startup logs on both int and cert showed two active profiles. For instance, on int, both test and int were logged as active profiles.

It was a bit hard to dig into where the handling of the active profile was broken: instead of the Java option overriding the default active profile, it was being appended to the list of active profiles. Spring allows more than one profile to be active, specified as a comma-separated list. When multiple profiles are active, each profile-specific configuration is read in the order the profiles appear in the list and the properties get consolidated, with the later profile's properties replacing the earlier ones whenever the same property is configured in both. In effect, it merges multiple environment-specific configurations.
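As an illustration with a made-up property (my.timeout is not from the actual application), suppose the two profile-specific files define the same property:

# application-int.yml
spring:
  config:
    activate:
      on-profile: int
my:
  timeout: 30

# application-test.yml
spring:
  config:
    activate:
      on-profile: test
my:
  timeout: 5

With spring.profiles.active=int,test the value from application-test.yml (5) wins because test appears last in the list; with just int active, the value 30 applies.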

In my case, test was the default active profile. The active profile passed through the Java option, instead of replacing the default list (which has just one element, test), was getting prepended to it. So on int the active profiles were int, test and similarly on cert: cert, test. The test profile, being the last element in the list, took precedence, and the application failed to start up in both int and cert because the test-specific configuration was not valid for int or cert.

The fix (the better one) - separating test configurations out into their own configuration area under src/test/resources

All application and bootstrap yaml files resided under the src/main/resources directory.

The fix I did for this issue involved the following changes:
1. Removed the default active profile property (spring.profiles.active) set in the application.yml file.
2. Moved the test profile specific configuration files (application-test.yml and bootstrap-test.yml), which contain the spring.config.activate.on-profile property set to test along with other test environment related properties, under the src/test/resources directory.
3. Added new application.yml and bootstrap.yml files under the src/test/resources directory. Both just contain one property, spring.profiles.active, set to test (see the sketch below).
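A minimal sketch of the two new files under src/test/resources, based on the description above (bootstrap.yml gets the same content as application.yml):

# src/test/resources/application.yml
spring:
  profiles:
    active: test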

With this separation, all non-test configuration files, the base ones and the environment-specific ones, reside under the src/main/resources directory. No active profile is set in any of the configuration files under src/main/resources. In other words, the active profile is always passed in as an option for all environments: local, dev, int, cert, and prod.

Test environment configuration files (application.yml, bootstrap.yml, application-test.yml, and bootstrap-test.yml) reside under the src/test/resources directory.
Both application.yml and bootstrap.yml have the active profile set to test, so the Spring testing framework picks these configurations up from src/test/resources during unit and integration tests. That way the test active profile and its associated test-specific configurations are completely isolated from all other environment configurations in their own configuration area: src/test/resources.

This is a cleaner approach. It also isolates the test configurations under the src/test/resources area, so they are not even bundled into the application jar.

Summary

Enterprise Java's unnecessary overcomplexity gave birth to the Spring Framework, which quickly became popular and has been the de-facto Java framework since. Its core is based on the simple Dependency Injection (or Inversion of Control, IoC) design pattern. It has since grown into every corner of the technology landscape and, in my opinion, it is not simple anymore. There are too many layers of details one needs to know. With the tons of frameworks in the Spring ecosystem, it has become even more complex. A framework that started out to simplify Enterprise Java development has grown too big and become complex itself.

Even a simple Java framework like JUnit is getting complex, so it is no wonder Spring is already there. Software developers absolutely love indirection and complexity. In software development, simplicity is a rare quality; even where some simplicity exists, it slowly turns complex over time. Simple becomes complex, and complex only becomes super complex ;)

Happy software development, and enjoy the complexity!

References

Monday, August 28, 2023

IntelliJ - the Community Edition rescued the broken Ultimate Edition . . . ;)

It sounds funny to say that the IntelliJ Community Edition (free) rescued the Ultimate Edition (paid), but that's what really put me back on the Ultimate Edition after many months of switching to the Community Edition for my day-to-day development.

Environment: IntelliJ Community Edition 2023.2.1, IntelliJ Ultimate Edition 2023.2.1, maven 3.9.3 on macOS Catalina 10.15.7

The Issue

I stopped using the IntelliJ Ultimate Edition because, for some weird reason, it got stuck with a broken maven dependency paths issue. Specifically, the SLF4J and javax libraries were shown in red as broken maven dependency paths. It happened a few months ago: one day, one of my maven projects that uses Lombok and has classes annotated with @Slf4j suddenly couldn't recognize the log (logger object) statements. When I checked the project module dependencies, there were broken maven dependency paths.

I tried all the recommended and possible ways to recover from this issue: Invalidate Caches and restart the IDE(A), blowing away the specific libraries in the maven cache ~/.m2/libraries/.../*.* and letting it build again, blowing away the entire maven local cache ~/.m2/*.* and letting it build again, downloading the dependency jars and explicitly setting dependency paths in the project/module settings, reinstalling the Ultimate Edition, even reinstalling a different version of the Ultimate Edition, etc. Nothing worked. I spent a lot of time on it a few times since then and couldn't get it back to a working state. I finally gave up and moved to the Community Edition. The same projects that had broken dependency issues in the Ultimate Edition had no issues with those dependencies when opened in the Community Edition, even though both use the same maven local cache paths. The Ultimate Edition complains, the Community Edition doesn't. Alas!

The Fix

Here is what I did to fix it.

Started the IntelliJ Ultimate Edition. On the startup screen there is a Customize link; click that and then click the Import Settings... link.

Selected the Community Edition settings directory (~/Library/Application Support/JetBrains/IdeaIC2023.2), the edition in which my projects were fine with dependencies. It prompted me to take a backup of the current settings. I did that and imported the Community Edition settings. It reopened the projects and the broken dependency paths issue was gone.

It seems there was some broken dependency path setting saved in the Ultimate Edition settings that was stuck and not getting fixed by any other means.

Some Internals

IntelliJ saves all its settings under a version-specific area. On Mac, by default, this area is under the ~/Library/Application Support/JetBrains directory. The Community Edition directories start with IdeaIC<version> (e.g. IdeaIC2023.2) and the Ultimate Edition directories start with IntelliJIdea<version> (e.g. IntelliJIdea2023.2). The options sub-directory is where plugin settings get saved.

Plugin Settings

Plugin settings are part of the IntelliJ settings and get stored in xml files under the options sub-directory of the specific IntelliJ version's settings directory. For instance, awesome editor is a simple and pretty neat plugin which lets you add image backgrounds. I set up the awesome editor plugin to display different kinds of images for different types of files. To carry the settings over from one IntelliJ version to another, just copy the plugin's settings file (in this case: awesome-editor-3.xml).

Here is an example of copying plugin settings set in the Ultimate Edition over to the Community Edition.
NOTE: Once copied, restart the IntelliJ Community Edition.

$ cd ~/Library/"Application Support"/JetBrains

$ ls -al | grep IdeaIC
drwxr-xr-x  17 pottepalemg  163264107  544 Jul 26  2022 IdeaIC2022.1
drwxr-xr-x  20 pottepalemg  163264107  640 Nov 30  2022 IdeaIC2022.2
drwxr-xr-x  20 pottepalemg  163264107  640 Mar 30 15:23 IdeaIC2022.3
drwxr-xr-x  20 pottepalemg  163264107  640 Jul  7 11:31 IdeaIC2023.1
drwxr-xr-x  20 pottepalemg  163264107  640 Aug 28 10:13 IdeaIC2023.2

$ ls -al | grep IntelliJIdea
drwxr-xr-x  22 pottepalemg  163264107  704 Aug 24 15:43 IntelliJIdea2020.1
drwxr-xr-x  19 pottepalemg  163264107  608 Nov 19  2020 IntelliJIdea2020.2
drwxr-xr-x  21 pottepalemg  163264107  672 Mar  2  2022 IntelliJIdea2020.3
drwxr-xr-x  20 pottepalemg  163264107  640 Jul  9  2021 IntelliJIdea2021.1
drwxr-xr-x  24 pottepalemg  163264107  768 Feb  9  2022 IntelliJIdea2021.2
drwxr-xr-x  25 pottepalemg  163264107  800 Apr 12  2022 IntelliJIdea2021.3
drwxr-xr-x  26 pottepalemg  163264107  832 Sep 23  2022 IntelliJIdea2022.1
drwxr-xr-x  25 pottepalemg  163264107  800 Nov 30  2022 IntelliJIdea2022.2
drwxr-xr-x  25 pottepalemg  163264107  800 Jun 28 13:26 IntelliJIdea2022.3
drwxr-xr-x  24 pottepalemg  163264107  768 Aug 16 15:40 IntelliJIdea2023.1
drwxr-xr-x  22 pottepalemg  163264107  704 Aug 28 10:36 IntelliJIdea2023.2
drwxr-xr-x  11 pottepalemg  163264107  352 Aug 25 15:35 IntelliJIdea2023.2-backup

$ find . -name awesome*
./IntelliJIdea2023.2/options/awesome-editor-3.xml

$ ls -ltr ./IdeaIC2023.2/options | grep awesome*

$ cp ./IntelliJIdea2023.2/options/awesome-editor-3.xml ./IdeaIC2023.2/options

TIPS

Getting back lost database connection settings into Ultimate Edition

The IntelliJ Ultimate Edition comes bundled with the Database plugin, which supports all the features available in DataGrip (JetBrains' SQL IDE). I had database connections set up to connect to various PostgreSQL databases (local, int, cert, prod, etc.), which I lost by importing the Community Edition settings. But I had the settings backup that the import had prompted me to take of my broken Ultimate Edition settings; that backup directory is also listed in the above list of Ultimate Edition settings directories. To get those settings back into my new settings imported from the Community Edition, I had to repeat the Customize > Import Settings... step two more times: the first time pointing it to the backup directory and selecting and copying all database connection settings to the clipboard, the second time pointing it to the Community Edition directory and pasting those connection settings from the clipboard. This was the only way I could get those database connections copied. I got all connection settings back except the passwords; I had to set the password for every single connection individually. This was not possible by simply copying xml files like I did for the awesome editor plugin.

Summary

No software application or tool is bug-free. Applications do crash and tools do get corrupted. There is no single solution that fixes a similar issue for everybody. Some stupid, nasty, unknown, not-very-well-documented internals of tools take up a lot of time before you discover a fix that works for your situation, and that fix may help others who get into a similar situation.

Hopefully this blog post on my discovery saves someone some time when they bump into a similar issue.

Wednesday, August 23, 2023

Spring Batch - upgrade from 4.3.x to 5.0.x (breaking changes) . . .

It makes sense to expect some breaking changes between two major versions. I recently ran into this when upgrading a Spring Boot based Spring Batch application from Spring Boot 2.7.14 to Spring Boot 3.1.2.

The application is a Spring Boot command-line runnable app which takes the batch job name(s) and input data file(s) (a job-related data file argument and filename) and processes those files by kicking off those jobs. This application required a few major changes to get it functioning again. I had to read the docs to identify things that were already deprecated in 4.x and are now removed in 5.x, and to find out about the new things put in place (see the example invocation below).
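For context, a typical invocation looks something like the following; the jar and data file names are illustrative, while the spring.batch.job.name and data_filename properties appear in the code further below:

java -jar batch-app.jar \
  --spring.batch.job.name=myJob \
  --data_filename="my-job-data.csv"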

Environment: Java 20, Spring Boot 2.7.14, Spring Boot 3.1.2, Spring Batch 4.3.8, Spring Batch 5.0.2, maven 3.9.3 on macOS Catalina 10.15.7

The following is a summary of the high level changes. I am not going into details in this post, as these changes are easy enough to understand at a high level.

1. The in-memory map-based job repository, deprecated in earlier versions, is now removed. So an explicit configuration is required; the following is an example using an embedded H2 datasource.
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
import org.springframework.transaction.PlatformTransactionManager;

import javax.sql.DataSource;

/**
 * In memory data source configuration for batch jobs.
 * Defines {@link DataSource}, {@link PlatformTransactionManager} and {@link JobRepository} beans.
 *
 * @author Giri
 * created Aug 16, 2023
 */
@Configuration
public class InMemoryBatchRepositoryConfig {

    @Bean
    public DataSource inMemoryDataSource() {
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        return builder.setType(EmbeddedDatabaseType.H2)
                .addScript("classpath:org/springframework/batch/core/schema-drop-h2.sql")
                .addScript("classpath:org/springframework/batch/core/schema-h2.sql")
                .generateUniqueName(true)
                .build();
    }

    @Bean
    public PlatformTransactionManager resourceLessTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(inMemoryDataSource());
        factory.setTransactionManager(resourceLessTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }
}

2. The @EnableBatchProcessing annotation takes new attributes: dataSourceRef and transactionManagerRef.
@Slf4j
@Configuration
@EnableBatchProcessing(dataSourceRef = "inMemoryDataSource", transactionManagerRef = "resourceLessTransactionManager")
public class MyJobConfig {

    @Value("${data_filename:NONE}")
    String dataFilename;
    ...
}

3. JobBuilderFactory and StepBuilderFactory are removed; use JobBuilder and StepBuilder instead.
@Slf4j
@Configuration
@EnableBatchProcessing(dataSourceRef = "inMemoryDataSource", transactionManagerRef = "resourceLessTransactionManager")
public class MyJobConfig {
    ...
    @Bean
    public Job myJob(
            JobRepository jobRepository,
            Step myJobStep,
            JobCompletionNotificationListener jobCompletionNotificationListener) {
        return new JobBuilder("myJob", jobRepository)
                .listener(jobCompletionNotificationListener)
                .flow(myJobStep)
                .end()
                .build();
    }

    @Bean
    public Step myJobStep(
            JobRepository jobRepository,
            PlatformTransactionManager platformTransactionManager,
            ItemFailureLoggerListener itemFailureLoggerListener,
            StepListener stepListener) {
        var step = new StepBuilder("myJobStep", jobRepository)
                .<MyDomainObject, String>chunk(10, platformTransactionManager)
                .reader(myDomainObjectReader())               // input
                .processor(myDomainObjectProcessor(isGm5))    // transformer/processor
                .writer(myDomainObjectWriter())               // output
                .listener((ItemProcessListener) itemFailureLoggerListener)
                .build();
        step.registerStepExecutionListener(stepListener);
        return step;
    }
    ...
}

4. Unlike previous versions, jobs won't start just by setting the property spring.batch.job.names to a comma-separated list of job names. You need to explicitly launch the job through the job launcher. The following is a code snippet of the main application, which takes a job name passed from the command line via the property spring.batch.job.name and launches that job.

@Slf4j
@SpringBootApplication
public class BatchApplication implements CommandLineRunner, InitializingBean {

    @Value("${spring.batch.job.name:NONE}")
    private String jobName;

    @Autowired
    ApplicationContext applicationContext;

    @Autowired
    JobLauncher jobLauncher;

    @Override
    public void afterPropertiesSet() {
        try {
            validateJobName();
        } catch (Exception ex) {
            log.error(ex.getMessage());
            System.exit(SpringApplication.exit(applicationContext, () -> 1));
        }
    }

    public static void main(String[] args) {
        SpringApplication app = new SpringApplicationBuilder(BatchApplication.class)
                .web(WebApplicationType.NONE)
                .logStartupInfo(false)
                .build(args);
        app.run(args);
    }

    @Override
    public void run(String... args) {
        log.debug("Beans Count:{} Beans:{}", applicationContext.getBeanDefinitionCount(),
                applicationContext.getBeanDefinitionNames());

        Job job = (Job) applicationContext.getBean(jobName);
        JobParameters jobParameters = new JobParametersBuilder()
                .addString("jobID", String.valueOf(System.currentTimeMillis()))
                .toJobParameters();
        try {
            jobLauncher.run(job, jobParameters);
            log.info("Finished running job(s): {} with args: {}", jobName,
                    Arrays.stream(args).collect(Collectors.joining(", ")));
        } catch (JobExecutionAlreadyRunningException | JobRestartException
                 | JobInstanceAlreadyCompleteException | JobParametersInvalidException e) {
            throw new RuntimeException(e);
        }
    }

    /**
     * Validates jobName and displays usage and appropriate error message upon validation failure.
     */
    private void validateJobName() throws URISyntaxException {
        String errorMessage = null;
        if (jobName.isEmpty() || jobName.equals("NONE")) {
            errorMessage = "No job(s) specified. Please, specify valid job_name(s)";
        }
        if (errorMessage != null) {
            String runningJar = this.getClass().getClassLoader().getClass()
                    .getProtectionDomain()
                    .getCodeSource()
                    .getLocation()
                    .toURI()
                    .getPath();
            // display Usage
            log.info("""
                    USAGE:
                    Run a specific job:
                    java -jar <runnableJar> \\
                        --spring.batch.job.name=<job_name> \\
                        --<job_arg_data_filename>="job_data_file.csv"
                    """.replace("<runnableJar>", runningJar));
            throw new IllegalArgumentException(errorMessage);
        } else {
            List<String> allJobNames = Arrays.stream(applicationContext.getBeanNamesForType(Job.class)).toList();
            log.debug("Available Jobs: {}", allJobNames);
            log.info("Running Job: {}", jobName);
        }
    }
}

Other classes referenced in Job configurations:
public class JobCompletionNotificationListener extends JobExecutionListenerSupport {

    @Value("${spring.batch.job.name:default-job}")
    private String jobName;

    @Override
    public void beforeJob(JobExecution jobExecution) {
        log.info("JOB: {} - About to start.", jobName);
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            var from = jobExecution.getStartTime();
            var to = jobExecution.getEndTime();
            log.info("JOB: {} - Finished running. Took {} milliseconds. Verify results.",
                    jobName, ChronoUnit.MILLIS.between(from, to));
        }
    }
}

@Slf4j
@Component
public class ItemFailureLoggerListener extends ItemListenerSupport

TIPS

When running the command-line application, if you see warnings like: Using deprecated '-debug' fallback for parameter name resolution. Compile the affected code with '-parameters' instead or avoid its introspection: MyJobConfig, then suppress them by adding the following maven-compiler-plugin configuration:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>${maven-compiler-plugin.version}</version>
    <configuration>
        <source>${javac.source.version}</source>
        <target>${javac.target.version}</target>
        <release>${javac.release.version}</release>
        <compilerArgs>
            <arg>-parameters</arg>
        </compilerArgs>
    </configuration>
</plugin>

References

Friday, August 11, 2023

Spring boot - your own banner, actuator, version details etc . . .

Art is good for the eyes. Spring Boot out of the box comes with a nice text banner of its name, Spring Boot, and displays the version right below it. By default, the banner mode is set to on and the banner shows up when the application starts. There are several articles available on customizing this banner, and the Spring Boot documentation has a brief section on this as well (link in the resources).

Environment: Java 20, Spring Boot 2.7.14, maven 3.9.3 on macOS Catalina 10.15.7

It's always nice to customize the banner and see your application name whenever the application comes up. Of course, you can also display good-to-have version details along with the banner: the Spring Boot version, Java version, your application version, and other available properties.

The following is a quick list of steps to add a custom banner to your application.
  • Generate a text banner for your application name. There are several sites for doing this; one such site is https://springhow.com/spring-boot-banner-generator/. Generate the text banner and download the text file.
  • Edit the file and add the following properties for versions at the bottom:
____ _ ____ ___ __ | __ ) ___ ___ | |_ / ___|_ __ __ _ __ _| \ \ / / __ ___ | _ \ / _ \ / _ \| __| | | _| '__/ _` |/ _` | |\ \ / / '_ ` _ \ | |_) | (_) | (_) | |_ | |_| | | | (_| | (_| | | \ V /| | | | | | |____/ \___/ \___/ \__| \____|_| \__,_|\__,_|_| \_/ |_| |_| |_|
:: Spring Boot :: ${spring-boot.version}
:: Running on Java :: ${java.version}
:: Application :: ${project.version}
  • Place the text file, named banner.txt, under the src/main/resources folder.
  • In your maven build file (pom.xml), make sure you turn on maven resource filtering for the resource directory so the extra version properties get resolved during the build.
  • Also, make sure you have maven-resources-plugin configured to filter resources.
<build>
    <resources>
        <resource>
            <filtering>true</filtering>
            <directory>${project.basedir}/src/main/resources/</directory>
        </resource>
    </resources>
    <testResources>
        <testResource>
            <filtering>true</filtering>
            <directory>${project.basedir}/src/test/resources/</directory>
        </testResource>
    </testResources>
    ...
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-resources-plugin</artifactId>
        <version>3.3.1</version>
        <executions>
            <execution>
                <id>filter-resources</id>
                <phase>process-resources</phase>
                <goals>
                    <goal>copy-resources</goal>
                </goals>
                <configuration>
                    <resources>
                        <!-- # Filtered Resources -->
                        <resource>
                            <directory>${project.basedir}/src/main/resources/</directory>
                            <filtering>true</filtering>
                        </resource>
                    </resources>
                    <outputDirectory>${project.build.directory}</outputDirectory>
                </configuration>
            </execution>
        </executions>
    </plugin>
    ...
</build>

Gotcha-1

When you run maven build goal(s) that also run test cases, e.g. ./mvnw clean install, the banner shows up with all properties resolved for every integration test case. However, for the custom properties used in the banner to be filtered and shown, you need to add the same custom properties to the application.yml under src/test/resources if you happen to separate out test configurations there (sketched below). Also, make sure that you have enabled <testResources> filtering as shown above.
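For example, assuming the custom app.version property from Gotcha-2 below is the one referenced in banner.txt, the test-side application.yml would carry the same filtered property - a hedged sketch:

# src/test/resources/application.yml
app:
  version: @project.version@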

Gotcha-2 (Spring Boot 3.x)

The above ${project.version} doesn't work in Spring Boot 3.x. In this case, a custom application version property (e.g. app.version) can be defined in application.yml or application.properties and used in the banner.txt file instead.
E.g. application.yml
spring:
  application:
    name: @project.name@

# custom property for banner
app:
  version: @project.version@

____ _ ____ ___ __ | __ ) ___ ___ | |_ / ___|_ __ __ _ __ _| \ \ / / __ ___ | _ \ / _ \ / _ \| __| | | _| '__/ _` |/ _` | |\ \ / / '_ ` _ \ | |_) | (_) | (_) | |_ | |_| | | | (_| | (_| | | \ V /| | | | | | |____/ \___/ \___/ \__| \____|_| \__,_|\__,_|_| \_/ |_| |_| |_|
:: Spring Boot :: ${spring-boot.version}
:: Running on Java :: ${java.version}
:: Application :: ${app.version}

Actuator

Actuator provides production-ready endpoints for monitoring the application. Just adding the following dependency in pom.xml will do. Once the application is up, check http://localhost:8080/actuator

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>

However, by default not all endpoints are exposed over the web; only /actuator and /actuator/health are. Setting the property management.endpoints.web.exposure.include=* in application.properties (or the corresponding application.yml) exposes all endpoints.

Also, multiple end-points can be listed separated by comma.
e.g. management.endpoints.web.exposure.include=health,info,beans,env

Certain endpoints can be selectively excluded; for instance, to exclude the refresh endpoint, set management.endpoints.web.exposure.exclude=refresh
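The application.yml equivalent of the include/exclude settings above would look like this:

management:
  endpoints:
    web:
      exposure:
        include: health,info,beans,env
        exclude: refresh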

NOTE: The refresh end-point requires an empty POST request. e.g. curl -X POST http://localhost:8081/actuator/refresh 

Application information

The /actuator/info endpoint displays application information. By default there is no information. So, http://localhost:8080/actuator/info displays empty JSON:
{}

Maven plugin - spring-boot-maven-plugin

This plugin provides the build-info goal which, when configured as an execution, creates the build information file build-info.properties under the target/classes/META-INF directory. Properties listed in this file are made available through the endpoint: http://localhost:8080/actuator/info.
The following is an example of build-info goal execution configuration:

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <executions>
        <execution>
            <!-- Useful info on /actuator/info -->
            <id>build-info</id>
            <goals>
                <goal>build-info</goal>
            </goals>
        </execution>
    </executions>
</plugin>

The above configuration generates the build-info.properties file with pre-defined build info properties like:
build.artifact=boot-graalvm
build.group=com.example
build.name=boot-graalvm
build.time=2023-09-22T16\:04\:27.970Z
build.version=0.0.1-SNAPSHOT

With the above file generated, the /actuator/info endpoint response would look like:
{ "build": { "artifact": "boot-graalvm", "name": "boot-graalvm", "time": "2023-09-22T16:04:27.970Z", "version": "0.0.1-SNAPSHOT", "group": "com.example" } }
 
Additional custom properties can be added by configuring the build-info goal. For instance, to add the Java version and the Spring Boot version the application is built with, the following additional configuration can be added:
...
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>3.1.2</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.example</groupId>
<artifactId>boot-graalvm</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>boot-graalvm</name>
<description>GraalVm project for Spring Boot</description>
<properties>
    <java.version>20</java.version>
    <spring.boot.version>${parent.version}</spring.boot.version>
</properties>

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <executions>
        <execution>
            <!-- Useful info on /actuator/info -->
            <id>build-info</id>
            <goals>
                <goal>build-info</goal>
            </goals>
            <configuration>
                <additionalProperties>
                    <java.version>${java.version}</java.version>
                    <spring.boot.version>${spring.boot.version}</spring.boot.version>
                </additionalProperties>
            </configuration>
        </execution>
    </executions>
...

This results in additional properties in the build-info.properties file like:
build.java.version=20
build.spring.boot.version=3.1.2

The /actuator/info endpoint response looks like:
{ "build": { "java": { "version": "20" }, "spring": { "boot": { "version": "3.1.2" } }, "version": "0.0.1-SNAPSHOT", "artifact": "boot-graalvm", "name": "boot-graalvm", "time": "2023-09-22T18:52:39.193Z", "group": "com.example" } }

Java information

The info endpoint has several info contributors, like build, env, java, git, etc. Some of them are disabled by default; enabling them shows the related information collected by Spring.

To enable the java info contributor, add management.info.java.enabled=true to application.properties, or the equivalent management.info.java.enabled: true to application.yml.

This shows java info details like the following:
{ "build": { "java": { "version": "20" }, "spring": { "boot": { "version": "3.1.2" } }, "version": "0.0.1-SNAPSHOT", "artifact": "boot-graalvm", "name": "boot-graalvm", "time": "2023-09-22T18:52:39.193Z", "group": "com.example" }, "java": { "version": "20", "vendor": { "name": "Amazon.com Inc.", "version": "Corretto-20.0.0.36.1" }, "runtime": { "name": "OpenJDK Runtime Environment", "version": "20+36-FR" }, "jvm": { "name": "OpenJDK 64-Bit Server VM", "vendor": "Amazon.com Inc.", "version": "20+36-FR" } } }

TIP

The build properties generated and exposed through /actuator/info are also available through the BuildProperties object, which can be auto-wired into any Spring managed bean and accessed there. For instance, these properties can be displayed on the swagger-ui page (see the sketch below).
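A minimal sketch of reading BuildProperties from a Spring managed bean; the component name and log output are illustrative, not from the original application:

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.info.BuildProperties;
import org.springframework.stereotype.Component;

/**
 * Illustrative component that prints build details at startup using the
 * auto-configured BuildProperties bean (present when build-info.properties
 * is generated by the spring-boot-maven-plugin build-info goal).
 */
@Component
public class BuildInfoLogger implements CommandLineRunner {

    private final BuildProperties buildProperties;

    public BuildInfoLogger(BuildProperties buildProperties) {
        this.buildProperties = buildProperties;
    }

    @Override
    public void run(String... args) {
        System.out.printf("%s %s (group %s) built at %s%n",
                buildProperties.getName(),
                buildProperties.getVersion(),
                buildProperties.getGroup(),
                buildProperties.getTime());
    }
}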


References

Saturday, August 05, 2023

Java bytecode - compiler version options and compatibilities . . .

One of the many strengths of the Java platform is its backward compatibility with the language. As the language keeps evolving and moving forward, the good old syntax is still supported for backward compatibility. However, the compiler provides certain options for specifying version details: --source and --target are two such compiler (javac) options, and from Java 9 onwards a third option, --release, got added to the mix. Getting a good understanding of these options is not trivial without actually experiencing all three. When compiling the source code of a single class you may not need to specify these options, but in a Java project built with a build system like maven, one needs to understand these options and their implications.

Environment: Java 20, Spring Boot 2.7.15, maven 3.9.3 on macOS Catalina 10.15.7

The maven-compiler-plugin

The Maven build system uses the maven-compiler-plugin for compiling source code. The plugin documentation talks upfront about the source and target options and highly recommends changing them in the plugin configuration. In order to change them per application/module needs, one needs to look under the hood to understand them.

Various extra Java compiler options can be specified in the maven-compiler-plugin configuration. The version-related Java compiler options are --source, --target and --release, and they can be passed to the compiler during compilation through the maven-compiler-plugin configuration. This can be done in two different ways in pom.xml:

1. Through maven properties: maven.compiler.source, maven.compiler.target and maven.compiler.release, as shown below:

<properties>
    <maven.compiler.source>20</maven.compiler.source>
    <maven.compiler.target>20</maven.compiler.target>
    <maven.compiler.release>20</maven.compiler.release>
</properties>

If these properties are not explicitly defined, the maven-compiler-plugin defaults to 1.8 for both source and target.

2. Through the plugin configuration settings as shown below. Note: for convenience, extra properties are defined and used for source, target and release, but plain version numbers can be used instead.

...
<properties>
    <java.version>20</java.version>
    <javac.source.version>${java.version}</javac.source.version>
    <javac.target.version>${java.version}</javac.target.version>
    <javac.release.version>${java.version}</javac.release.version>
    <!-- Maven plugins -->
    <maven-compiler-plugin.version>3.11.0</maven-compiler-plugin.version>
</properties>

<build>
    <pluginManagement>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>${maven-compiler-plugin.version}</version>
                <configuration>
                    <source>${javac.source.version}</source>
                    <target>${javac.target.version}</target>
                    <release>${javac.release.version}</release>
                    <compilerArgs>
                        <arg>-Xlint:all</arg>
                    </compilerArgs>
                </configuration>
            </plugin>
        </plugins>
    </pluginManagement>
...

If no special configuration is required, you don't even need to specify the maven-compiler-plugin. If it is specified, the above are the two ways to control/change the default of 1.8 set by the plugin for these options, which eventually get passed to the Java compiler (javac) during compilation of sources (both main and test).

Note that from Java 9 onwards, the values for these options are not written like 1.7 or 1.8, but as 7 and 8.

Java 20

The Java 20 compiler doesn't support version 7 for source and target anymore; the supported releases are 8 through 20. So, if for any reason the maven-compiler-plugin is explicitly set to 1.7, the build fails with ERRORS saying: Source option 7 is no longer supported. Use 8 or later. and Target option 7 is no longer supported. Use 8 or later.

Now it's time to understand what these options actually tell the compiler, javac. The compiler's help option (javac -help) lists all available options with a brief description of each. The --source, --target and --release option descriptions are helpful to some extent.

--source <release>, -source <release>
      Provide source compatibility with the specified Java SE release.
      Supported releases: 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20

--target <release>, -target <release>
      Generate class files suitable for the specified Java SE release.
      Supported releases: 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20

--release <release>
      Compile for the specified Java SE release.
      Supported releases: 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20

To understand these compiler options better, we can compile a simple Java application class with just a main method.

HelloJava.java
import java.util.Properties;

public class HelloJava {

    public static void main(String[] args) {
        Properties systemProperties = System.getProperties();
        System.out.println(String.format("Hello Java %s!",
                systemProperties.getProperty("java.vm.specification.version")));
        systemProperties.entrySet().stream()
                .filter(entry -> entry.getKey().toString().startsWith("java"))
                .toList().stream()
                .forEach(entry -> System.out.println(entry.getKey() + "="
                        + systemProperties.getProperty(entry.getKey().toString()))
                );
    }
}
Note: the above class prints the system properties that start with "java" along with their values. It uses the toList() method that Java 16 added to the Stream interface. With this, the expectation is that the code should not be compiled for a Java/JVM version lower than 16.

Java compiler version options

Let's compile the class with different Java versions and compiler options.
 
// compile with Java 20: no options specified
$ sdk use java 20.0.2-amzn
$ javac HelloJava.java

// check: major version
$ javap -verbose HelloJava.class | grep major
  major version: 64

// run on Java 20: works
$ java HelloJava
Hello Java 20!

// switch to Java 17 and run: fails with LinkageError
$ sdk use java 17.0.1.12.1-amzn
$ java HelloJava
Error: LinkageError occurred while loading main class HelloJava
        java.lang.UnsupportedClassVersionError: HelloJava has been compiled by a more recent version of the Java Runtime (class file version 64.0), this version of the Java Runtime only recognizes class file versions up to 61.0

// switch to Java 15 and run: fails with LinkageError
$ sdk use java 15.0.2.7.1-amzn
$ java HelloJava
Error: LinkageError occurred while loading main class HelloJava
        java.lang.UnsupportedClassVersionError: HelloJava has been compiled by a more recent version of the Java Runtime (class file version 64.0), this version of the Java Runtime only recognizes class file versions up to 59.0

// switch to Java 8 and run: fails with UnsupportedClassVersionError
$ sdk use java 8.0.352-amzn
$ java HelloJava
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.UnsupportedClassVersionError: HelloJava has been compiled by a more recent version of the Java Runtime (class file version 64.0), this version of the Java Runtime only recognizes class file versions up to 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:601)

So, when compiled with a specific version of the Java compiler (in this case Java 20, with no version options specified), the code gets compiled with the compiler's default target, which is 20. The generated class cannot be run on prior JVM versions (prior to 20). To find which JVM target the byte-code was generated for, use javap -verbose HelloJava.class | grep major.

Now, let's try compiling for target 17 and try to run on different JVM versions.

// compile with Java 20 for target 17: --source and --target options specified
$ sdk use java 20.0.2-amzn
$ javac --source=17 --target=17 HelloJava.java
warning: [options] system modules path not set in conjunction with -source 17
1 warning

// check: major version
$ javap -verbose HelloJava.class | grep major
  major version: 61

// run on Java 20: works
$ java HelloJava
Hello Java 20!

// switch to Java 17 and run: works
$ sdk use java 17.0.1.12.1-amzn
$ java HelloJava
Hello Java 17!

// switch to Java 15 and run: fails with LinkageError
$ sdk use java 15.0.2.7.1-amzn
$ java HelloJava
Error: LinkageError occurred while loading main class HelloJava
        java.lang.UnsupportedClassVersionError: HelloJava has been compiled by a more recent version of the Java Runtime (class file version 61.0), this version of the Java Runtime only recognizes class file versions up to 59.0

// compile with Java 20 for target 15: --source and --target options specified
$ sdk use java 20.0.2-amzn
$ javac --source=15 --target=15 HelloJava.java
warning: [options] system modules path not set in conjunction with -source 15
1 warning

// check: major version
$ javap -verbose HelloJava.class | grep major
  major version: 59

// switch to Java 17 and run: works
$ sdk use java 17.0.1.12.1-amzn
$ java HelloJava
Hello Java 17!

// switch to Java 15 and run: fails with NoSuchMethodError
$ sdk use java 15.0.2.7.1-amzn
$ java HelloJava
Hello Java 15!
Exception in thread "main" java.lang.NoSuchMethodError: 'java.util.List java.util.stream.Stream.toList()'
        at HelloJava.main(HelloJava.java:15)

// switch to Java 8 and run: fails with UnsupportedClassVersionError
$ sdk use java 8.0.352-amzn
$ java HelloJava
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.UnsupportedClassVersionError: HelloJava has been compiled by a more recent version of the Java Runtime (class file version 59.0), this version of the Java Runtime only recognizes class file versions up to 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:601)

// compile with Java 20 for target 8: --source and --target options specified
$ sdk use java 20.0.2-amzn
$ javac --source=8 --target=8 HelloJava.java
warning: [options] bootstrap class path not set in conjunction with -source 8
warning: [options] source value 8 is obsolete and will be removed in a future release
warning: [options] target value 8 is obsolete and will be removed in a future release
warning: [options] To suppress warnings about obsolete options, use -Xlint:-options.
4 warnings

// check: major version
$ javap -verbose HelloJava.class | grep major
  major version: 52

// switch to Java 20 and run: works
$ sdk use java 20.0.2-amzn
$ java HelloJava
Hello Java 20!

// switch to Java 17 and run: works
$ sdk use java 17.0.1.12.1-amzn
$ java HelloJava
Hello Java 17!

// switch to Java 15 and run: fails with NoSuchMethodError
$ sdk use java 15.0.2.7.1-amzn
$ java HelloJava
Hello Java 15!
Exception in thread "main" java.lang.NoSuchMethodError: 'java.util.List java.util.stream.Stream.toList()'
        at HelloJava.main(HelloJava.java:15)

So, when compiled for a target version, the class cannot be loaded on JVM versions prior to that target. Note also that compiling for target 15 did not complain about the Java 16 toList() API usage, and the class then failed at runtime on Java 15 with NoSuchMethodError.

Let's try target 7.
$ sdk use java 20.0.2-amzn
$ javac --source=7 --target=7 HelloJava.java
warning: [options] bootstrap class path not set in conjunction with -source 7
error: Source option 7 is no longer supported. Use 8 or later.
error: Target option 7 is no longer supported. Use 8 or later.
Java version 7 is not supported anymore.

Let's try out the --release option.
// compile with Java 20: --source, --target and --release options specified together
$ sdk use java 20.0.2-amzn
$ javac --source=17 --target=17 --release=17 HelloJava.java
error: option --source cannot be used together with --release
error: option --target cannot be used together with --release
Usage: javac <options> <source files>
use --help for a list of possible options

// compile with Java 20 for target 17: --release option specified
$ sdk use java 20.0.2-amzn
$ javac --release=17 HelloJava.java

// check: major version
$ javap -verbose HelloJava.class | grep major
  major version: 61

// switch to Java 20 and run: works
$ sdk use java 20.0.2-amzn
$ java HelloJava
Hello Java 20!

// switch to Java 17 and run: works
$ sdk use java 17.0.1.12.1-amzn
$ java HelloJava
Hello Java 17!

// switch to Java 15 and run: fails with NoSuchMethodError
$ sdk use java 15.0.2.7.1-amzn
$ java HelloJava
Hello Java 15!
Exception in thread "main" java.lang.NoSuchMethodError: 'java.util.List java.util.stream.Stream.toList()'
        at HelloJava.main(HelloJava.java:14)

// compile with Java 20 for target 15: --release option specified
$ sdk use java 20.0.2-amzn
$ javac --release=15 HelloJava.java
HelloJava.java:14: error: cannot find symbol
                .toList().stream()
                 ^
  symbol:   method toList()
  location: interface Stream<Entry<Object,Object>>
1 error

// compile with Java 20 for target 15: --source and --target options specified
$ sdk use java 20.0.2-amzn
$ javac --source=15 --target=15 HelloJava.java
warning: [options] system modules path not set in conjunction with -source 15
1 warning

// switch to Java 15 and run: fails with NoSuchMethodError
$ sdk use java 15.0.2.7.1-amzn
$ java HelloJava
Hello Java 15!
Exception in thread "main" java.lang.NoSuchMethodError: 'java.util.List java.util.stream.Stream.toList()'
        at HelloJava.main(HelloJava.java:14)

So, the --release option does strict compile-time checks to see if the code is compliant with the release target and fails the compilation if an API not available in that release is used in the code. This makes sure the compiled class works on the target release (the JVM version the code is released to run on). The --target option, on the other hand, doesn't do API compliance checks at compile time; the code simply compiles but fails at runtime. So --release seems like the better option to specify.

Implications of version options

  • No compiler options specified: the code gets compiled with the default target, which is the version of the Java compiler.
  • All 3 options --source, --target and --release specified: not allowed.
  • Options --source and --target specified: 1) Both can be the same (e.g. 17, 17). 2) The --source option can be a lower version (e.g. 15) and --target a higher version (e.g. 17), but not the other way around. If --source is a higher version (e.g. 17) and --target a lower version (e.g. 15), the compiler fails with warning: source release 17 requires target release 17 and the code doesn't get compiled.
  • Only --source specified: it can be any supported version, but the default target would be the version of the Java compiler being used.
  • Only --target specified: it cannot be lower than the version of the Java compiler being used, because the default source would be the compiler's version. A lower target runs into the source-higher/target-lower case above and fails compilation. E.g. javac --target=19 HelloJava.java with the Java 20 compiler fails with warning: target release 19 conflicts with default source release 20 and the code doesn't get compiled.
  • Only --release specified: strict API checks during compilation make sure the code compiles against and works on the specified release. The generated byte-code runs on the specified release and higher, but not on any lower version.
    • Compiling using Java 20 with no --release option, or with --release=20, results in major version 64 (Java 20).
    • Compiling using Java 20 with --release=19 results in major version 63 (Java 19).
    • Compiling using Java 20 with --release=17 results in major version 61 (Java 17); it results in a LinkageError when run on a version lower than 17, and works on 17 and higher.

The maven-compiler-plugin variations with these options

Maven, out of the box with no maven-compiler-plugin specified in pom.xml, has the following variations with its compiler version properties (maven.compiler.source, maven.compiler.target and maven.compiler.release).

<properties>
    <maven.compiler.source>20</maven.compiler.source>
    <maven.compiler.target>20</maven.compiler.target>
    <maven.compiler.release>20</maven.compiler.release>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
  • No version properties specified (no maven.compiler.source, maven.compiler.target or maven.compiler.release property): the build fails with compilation ERRORS: Source option 5 is no longer supported. Use 8 or later. and Target option 5 is no longer supported. Use 8 or later.
  • All 3 version properties specified: the maven.compiler.release property is ignored here, it can even hold a junk value; only the source and target properties matter. The value for target cannot be less than source; for instance, source 20 with target 19 fails the build with warning: source release 20 requires target release 20.
  • Only the source version property specified: when only source is specified, target must also be specified. Otherwise, the maven build fails with: Fatal error compiling: warning: source release 20 requires target release 20
  • Only the target version property specified: when only target is specified, source must also be specified. Otherwise, the maven build fails with: Source option 5 is no longer supported. Use 8 or later.
  • The release property specified: when release is specified and is 8 or later, it takes precedence and the source and target properties are ignored (make a special NOTE of this). In that case any nonsense value in source/target still lets the build work, and the code gets compiled for the release version specified. But release must be 8 or later.
With the maven-compiler-plugin specified in pom.xml, the following variations apply with these options specified in the plugin configuration (<configuration>):

<properties>
    <maven.compiler.source>20</maven.compiler.source>
    <maven.compiler.target>20</maven.compiler.target>
    <maven.compiler.release>20</maven.compiler.release>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
...
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.11.0</version>
            <configuration>
                <source>20</source>
                <target>20</target>
                <release>20</release>
            </configuration>
        </plugin>
    </plugins>
</build>

With both the set of properties and the maven-compiler-plugin configuration defined as above, the configuration values override the property values. If no configuration is specified for the maven-compiler-plugin, it uses the defined properties. With the configuration <source>, <target>, and <release> taking precedence, the variations are as follows:
  • No <configuration> specified for the plugin, but the properties are defined: it uses the defined properties, with release taking precedence over target and source.
  • No properties defined, and no <configuration> specified for the plugin: it defaults to target 1.8.
  • All 3 configuration options specified: the release configuration takes precedence and the code is built for the release version.
  • No properties set, and only the source configuration option specified: when only source is specified, target must also be specified. Otherwise, the maven build fails with: Fatal error compiling: warning: source release 20 requires target release 20
  • No properties set, and only the target configuration option specified: the code gets compiled for the target specified.
  • No properties set, and both the target and release configuration options specified: the release option takes precedence and the code gets compiled for the release specified.

Summary

For Java 9 and later, use the --release option.
For older versions prior to Java 9, use the --source and --target options.
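As a hedged sketch of that recommendation, a maven-compiler-plugin configuration that relies on --release alone (the version value is illustrative):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <release>20</release>
    </configuration>
</plugin>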

TIPS

  • Use SDKMAN to install multiple Java versions and easily switch between different versions.
  • If you see the noisy warning: Using deprecated '-debug' fallback for parameter name resolution. Compile the affected code with '-parameters' instead or avoid its introspection:, then add the compiler argument <arg>-parameters</arg> to the maven-compiler-plugin configuration as shown below:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>${maven-compiler-plugin.version}</version>
    <configuration>
        <source>${javac.source.version}</source>
        <target>${javac.target.version}</target>
        <release>${javac.release.version}</release>
        <compilerArgs>
            <arg>-Xlint:all</arg>
            <arg>-parameters</arg>
        </compilerArgs>
    </configuration>
</plugin>

References