Thursday, September 22, 2022

Enums - all the way to persistence . . .

Almost every application needs a pre-defined, ordered set of constants, and an enum is the data type made for such cases. Java has had enums since 1.5. Databases support enum types as well; PostgreSQL has had first-class enum support since release 8.3. JPA and Spring Data are a good match for persistence in modern Java applications, especially Spring Boot applications.

Environment: Java 17, Spring Boot 2.6.7 on macOS Catalina 10.15.7

Example Scenario - A persistable entity object in a Spring Boot micro-service application with JPA and PostgreSQL DB.

DDL Script
-- create enum type genders
CREATE TYPE genders AS ENUM(
    'MALE',
    'FEMALE'
);

-- create people table
CREATE TABLE people(
    id VARCHAR(36) PRIMARY KEY,
    first_name VARCHAR(50) NOT NULL,
    last_name VARCHAR(50) NOT NULL,
    gender genders NOT NULL
);

-- Unique Constraints
ALTER TABLE people
    ADD CONSTRAINT people_fname_lname_uk UNIQUE (first_name, last_name);
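As a quick sanity check (a hypothetical row; the UUID value is made up), PostgreSQL casts a matching string literal to the enum type both on insert and in comparisons:

-- insert a row; the literal 'MALE' is cast to the genders enum
INSERT INTO people(id, first_name, last_name, gender)
VALUES ('123e4567-e89b-12d3-a456-426614174000', 'Giri', 'Potte', 'MALE');

-- same implicit cast applies in the WHERE clause
SELECT first_name, last_name, gender FROM people WHERE gender = 'MALE';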

Maven dependencies: pom.xml
...
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-parent</artifactId>
            <version>2.6.3</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-validation</artifactId>
    </dependency>
    <dependency>
        <groupId>com.vladmihalcea</groupId>
        <artifactId>hibernate-types-55</artifactId>
        <version>2.16.2</version>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <version>1.18.20</version>
        <scope>provided</scope>
    </dependency>
    ...
</dependencies>
...

Enum: Gender.java
import lombok.AllArgsConstructor;

@AllArgsConstructor
public enum Gender {
    MALE("Male"),
    FEMALE("Female");

    String genderName;
}

Domain Object: Person.java
import com.vladmihalcea.hibernate.type.basic.PostgreSQLEnumType;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.hibernate.annotations.Type;
import org.hibernate.annotations.TypeDef;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.EnumType;
import javax.persistence.Enumerated;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;
import javax.persistence.UniqueConstraint;
import javax.validation.constraints.NotNull;
import java.util.UUID;

@Entity
@Table(
    name = "people",
    uniqueConstraints = {
        @UniqueConstraint(
            columnNames = {"firstName", "lastName"},
            name = "people_fname_lname_uk"
        )
    }
)
@TypeDef(
    name = "pgsql_enum",
    typeClass = PostgreSQLEnumType.class
)
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class Person {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    @Column(length = 36, nullable = false, updatable = false)
    @Type(type = "org.hibernate.type.UUIDCharType")
    private UUID id;

    @NotNull
    String firstName;

    @NotNull
    String lastName;

    @NotNull
    @Enumerated(EnumType.STRING)
    @Type(type = "pgsql_enum")
    Gender gender;
}

JPA Repository: PersonRepository.java
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;

import java.util.UUID;

public interface PersonRepository extends JpaRepository<Person, UUID> {

    Person findByFirstNameAndLastNameAndGender(String firstName, String lastName, Gender gender);

    @Query(value = "SELECT * FROM people WHERE first_name = :firstName AND last_name = :lastName AND gender = CAST(:#{#gender.name()} as genders)", nativeQuery = true)
    Person findByFirstNameAndLastNameAndGenderNQ(String firstName, String lastName, Gender gender);
}
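A note on the native query: the bound :gender parameter reaches PostgreSQL as text, and PostgreSQL will not compare a genders enum column with text directly (it fails with an "operator does not exist" error), hence the explicit CAST to genders on the parameter value.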

JPA Repository Test: PersonRepositoryIT.java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.jdbc.AutoConfigureTestDatabase;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.junit4.SpringRunner;

import java.util.List;

import static org.assertj.core.api.Assertions.assertThat;

@ActiveProfiles("test")
@RunWith(SpringRunner.class)
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
public class PersonRepositoryIT {

    @Autowired
    TestEntityManager testEntityManager;

    @Autowired
    PersonRepository personRepository;

    @Test
    public void findBy_returns_expected() {
        // given: data in DB
        List.of(
            Person.builder().firstName("Giri").lastName("Potte").gender(Gender.MALE).build(),
            Person.builder().firstName("Boo").lastName("Potte").gender(Gender.MALE).build()
        ).forEach(testEntityManager::persist);
        testEntityManager.flush();

        // expect: derived query finds the person
        assertThat(
            personRepository.findByFirstNameAndLastNameAndGender("Giri", "Potte", Gender.MALE)
        ).isNotNull()
         .extracting("firstName", "lastName", "gender")
         .isEqualTo(List.of("Giri", "Potte", Gender.MALE));

        // expect: native query finds the person
        assertThat(
            personRepository.findByFirstNameAndLastNameAndGenderNQ("Boo", "Potte", Gender.MALE)
        ).isNotNull()
         .extracting("firstName", "lastName", "gender")
         .isEqualTo(List.of("Boo", "Potte", Gender.MALE));
    }
}

TIP

On the database side, it's sometimes handy to have a few SQL statements for dealing with the enum type: create it, drop it, list its values, add a new value, rename an existing value, etc.
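For example, assuming the genders type from the DDL above ('OTHER' and 'UNDISCLOSED' are hypothetical values for illustration), a minimal set of such statements:

-- list the values of the enum type
SELECT enum_range(NULL::genders);

-- or via the system catalogs
SELECT e.enumlabel
FROM pg_enum e
JOIN pg_type t ON t.oid = e.enumtypid
WHERE t.typname = 'genders'
ORDER BY e.enumsortorder;

-- add a new value
ALTER TYPE genders ADD VALUE 'OTHER';

-- rename an existing value (PostgreSQL 10+)
ALTER TYPE genders RENAME VALUE 'OTHER' TO 'UNDISCLOSED';

-- drop the type (only possible when no column uses it)
DROP TYPE genders;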



Saturday, September 17, 2022

Do more with less - the way to write code, what Groovy taught me . . .

I was not happy at all going back to verbose native Java after my years of happy Groovy and Grails development on the JVM. The shrinking market for Groovy pushed me out, but I am not away from it. I still write Groovy code on any given day, at least on the side, while exploring and experiencing new Java language features along the way. One question always comes to my mind: how did we do this in Groovy, and why didn't we have to worry about these obvious expectations there? The comparison always pleasantly proves that Groovy lives up to its literal word meaning - very pleasant to work with - and that it has been a well-thought-out, super-consistent language on the JVM from day one.

Environment: Java 17, Groovy 4.0.2 on macOS Catalina 10.15.7

In my recent Java code, I was dealing with a list of request objects that arrive in a specific order, and I had to return the response objects in the same order after some internal processing that included querying the database for the list by ids via Spring Data's findAllBy... derived-query convention. In between, I built a map from the request list for faster lookup of a specific object's request details. I wanted the map to retain the request order, and Java's functional API made me dig into some internals before I could take that for granted.
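A minimal sketch of that reordering step (the Result record, the ids, and the method are placeholders for illustration, not the actual service code): index the fetched rows by id in an order-preserving map, then walk the original request ids.

import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import java.util.function.Function;
import java.util.stream.Collectors;

public class RequestOrder {

    // hypothetical row type standing in for the real response object
    record Result(UUID id, String payload) {}

    // returns the fetched rows re-arranged into the order of requestIds
    static List<Result> inRequestOrder(List<UUID> requestIds, List<Result> fetched) {
        Map<UUID, Result> byId = fetched.stream()
            .collect(Collectors.toMap(
                Result::id,
                Function.identity(),
                (first, duplicate) -> first, // keep the first row on duplicate ids
                LinkedHashMap::new));        // order-preserving map implementation
        return requestIds.stream().map(byId::get).toList();
    }
}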

The following simple code snippet compares the Java and Groovy approaches in the same Groovy script. That's another beauty of Groovy: you can write both Java and Groovy code in one class file.

import java.util.stream.Collectors

// a data item
record Item(Integer id, String name) {}

// groovy
List items = [
    new Item(1, 'One'), new Item(2, 'Two'), new Item(3, 'Three'), new Item(4, 'Four'),
    new Item(5, 'Five'), new Item(6, 'Six'), new Item(7, 'Seven'), new Item(8, 'Eight'),
    new Item(9, 'Nine'), new Item(10, 'Ten'),
]
println "Groovy\n======"
println "Items[id:Integer, name:String]: ${items}"

def itemsMap = items.collectEntries { [it.name(), it.id()] }
println "Items map by name (order preserved): ${itemsMap}"
println "Items map keys (order preserved): ${itemsMap.keySet()}"

// java
List itemsJava = List.of(
    new Item(1, "One"), new Item(2, "Two"), new Item(3, "Three"), new Item(4, "Four"),
    new Item(5, "Five"), new Item(6, "Six"), new Item(7, "Seven"), new Item(8, "Eight"),
    new Item(9, "Nine"), new Item(10, "Ten")
);
System.out.println("Java\n====");
System.out.println("Items[id:Integer, name:String]:" + itemsJava);

var itemsMapJava = itemsJava.stream()
    .collect(
        Collectors.toMap(
            item -> item.name,
            item -> item.id
        )
    );
System.out.println("Items map by name (order NOT preserved): " + itemsMapJava);
System.out.println("Items map keys (order NOT preserved): " + itemsMapJava.keySet().stream().toList());

var itemsMapOrderPreserved = itemsJava.stream()
    .collect(
        Collectors.toMap(
            item -> item.name,
            item -> item.id,
            (key1, key2) -> key1, // key conflict resolver
            LinkedHashMap::new    // pass the underlying map implementation you want, to preserve the order; defaults to HashMap
        )
    );
System.out.println("Items map by name (order preserved): " + itemsMapOrderPreserved);
System.out.println("Items map keys (order preserved): " + itemsMapOrderPreserved.keySet().stream().toList());

The output:

Groovy
======
Items[id:Integer, name:String]: [Item[id=1, name=One], Item[id=2, name=Two], Item[id=3, name=Three], Item[id=4, name=Four], Item[id=5, name=Five], Item[id=6, name=Six], Item[id=7, name=Seven], Item[id=8, name=Eight], Item[id=9, name=Nine], Item[id=10, name=Ten]]
Items map by name (order preserved): [One:1, Two:2, Three:3, Four:4, Five:5, Six:6, Seven:7, Eight:8, Nine:9, Ten:10]
Items map keys (order preserved): [One, Two, Three, Four, Five, Six, Seven, Eight, Nine, Ten]

Java
====
Items[id:Integer, name:String]:[Item[id=1, name=One], Item[id=2, name=Two], Item[id=3, name=Three], Item[id=4, name=Four], Item[id=5, name=Five], Item[id=6, name=Six], Item[id=7, name=Seven], Item[id=8, name=Eight], Item[id=9, name=Nine], Item[id=10, name=Ten]]
Items map by name (order NOT preserved): [Eight:8, Five:5, Six:6, One:1, Four:4, Nine:9, Seven:7, Ten:10, Two:2, Three:3]
Items map keys (order NOT preserved): [Eight, Five, Six, One, Four, Nine, Seven, Ten, Two, Three]
Items map by name (order preserved): [One:1, Two:2, Three:3, Four:4, Five:5, Six:6, Seven:7, Eight:8, Nine:9, Ten:10]
Items map keys (order preserved): [One, Two, Three, Four, Five, Six, Seven, Eight, Nine, Ten]

Conclusion

Java has started to evolve, changing faster than it used to and faster than the community can catch up - all for the good. It is slowly getting less verbose. Still, certain obvious expectations are not so obvious in Java.

Groovy shined next to legacy Java, and it still shines next to modern Java, on the JVM.

Wednesday, May 25, 2022

Maven - finer control of running tests . . .

There should always be something new to learn every day. Otherwise, life is boring. In software development, it is even more boring if we don't explore and learn anything new.

Maven is a popular open-source build tool, but it comes with a price tag of "Time" - it consumes much of your time, the most precious resource of all. I've just learned a better way to take finer control of running specific unit/integration tests. Two years ago, when I came back to the Maven-Java world from the Gradle-Groovy/Grails world, I had to do quite a bit of exploration. I still end up doing it now and then, even after a couple of years. Welcome to the Maven world ;)

Environment: Java 18, maven 3.8.5 on macOS Catalina 10.15.7

Surefire and Failsafe are the Maven plugins for running unit and integration tests, respectively. Out of the box, with no additional configuration, you can skip all tests - both unit and integration - by defining the skipTests property on the command line: -DskipTests, which is the same as setting it explicitly with -DskipTests=true. The property defaults to false, so not specifying it at all is the same as passing -DskipTests=false.

With the Surefire and Failsafe plugins and no additional configuration, tests can be run as shown below (assuming a multi-module project with my-service-api as a module, and the Maven commands for the api module run from the root project):

Run all unit tests:
 ./mvnw -pl my-service-api clean test
Run specific unit test:
 ./mvnw -pl my-service-api clean test -Dtest=MyService1
Run specific unit test method:
 ./mvnw -pl my-service-api clean test -Dtest=MyService1#myServiceTest1

Run all integration tests:
 ./mvnw -pl my-service-api clean integration-test 
Run specific integration test:
 ./mvnw -pl my-service-api clean integration-test -Dit.test=MyServiceIT1
Run specific integration test method:
 ./mvnw -pl my-service-api clean integration-test -Dit.test=MyServiceIT1#method1

The problem with this: running integration tests also runs all unit tests. If you do NOT want unit tests to run when integration tests are run, you need finer control to turn unit tests off.

The following explicit Surefire and Failsafe plugin configuration, with three additional properties, gives that finer control while keeping the -DskipTests flag intact.

<properties>
    ...
    <!-- For finer control of running tests -->
    <skipTests>false</skipTests>
    <skipUTs>${skipTests}</skipUTs>
    <skipITs>${skipTests}</skipITs>
</properties>

<build>
    <plugins>
        ...
        <!-- surefire for unit tests -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>${maven-surefire-plugin.version}</version>
            <dependencies>
                <dependency>
                    <groupId>org.apache.maven.surefire</groupId>
                    <artifactId>surefire-junit47</artifactId>
                    <version>${maven-surefire-plugin.version}</version>
                </dependency>
            </dependencies>
            <configuration>
                <skipTests>${skipUTs}</skipTests>
            </configuration>
        </plugin>
        <!-- failsafe for integration tests -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-failsafe-plugin</artifactId>
            <version>${maven-failsafe-plugin.version}</version>
            <dependencies>
                <dependency>
                    <groupId>org.apache.maven.surefire</groupId>
                    <artifactId>surefire-junit47</artifactId>
                    <version>${maven-surefire-plugin.version}</version>
                </dependency>
            </dependencies>
            <executions>
                <execution>
                    <id>integration-tests</id>
                    <goals>
                        <goal>integration-test</goal>
                        <goal>verify</goal>
                    </goals>
                    <configuration>
                        <argLine>${integrationTestCoverageAgent}</argLine>
                        <classesDirectory>${project.build.outputDirectory}</classesDirectory>
                        <includes>
                            <include>**/*IT.java</include>
                        </includes>
                    </configuration>
                </execution>
            </executions>
            <configuration>
                <skipTests>${skipITs}</skipTests>
            </configuration>
        </plugin>
        ...
    </plugins>
    ...
</build>

With the three additional properties defined above, defaulting to false for both plugins, all tests still run out of the box. Now, with separate properties for unit tests and integration tests, unit tests can be turned off when we want to run a specific integration test (or all of them), like:

Run specific integration test, don't run unit tests:
 ./mvnw -pl my-service-api clean integration-test -DskipUTs=true -Dit.test=MyServiceIT1

Run specific integration test, don't run unit tests (same as above, no explicit true):
 ./mvnw -pl my-service-api clean integration-test -DskipUTs -Dit.test=MyServiceIT1

But using the verify goal instead of the integration-test goal is better. The reason is in the TIP below.

Run specific integration test (BETTER):
 ./mvnw -pl my-service-api clean verify -DskipUTs -Dit.test=MyServiceIT1

Run specific integration test (CLEANER):
 ./mvnw -pl my-service-api clean post-integration-test -DskipUTs -Dit.test=MyServiceIT1

And using the post-integration-test goal instead of the integration-test goal is cleaner still. Again, the reason is in the TIP below.

TIP

Use the verify goal instead of integration-test. If you have a Docker container started for the DB in the integration-test phase, it will not be stopped after the tests finish. This makes your subsequent runs fail to start the container, and you end up running a docker command to remove it (docker rm -f <postgres-container-name>).

The verify goal is a cleaner way of executing integration test cases in this situation.

The verify goal not only runs the integration test case(s) but also verifies code coverage against the coverage threshold. The build might end in FAILURE because the threshold isn't met. That is OK; we know we are excluding unit tests and running only some of the integration tests, so we don't expect the threshold to be met. But it will execute cleanly, stopping any containers that were started.

The post-integration-test goal is another clean way to execute integration test cases.

Unlike verify, this runs all phases only up to post-integration-test (pre-integration-test, integration-test, and post-integration-test), leaving out the final verify phase that checks code coverage. This also executes cleanly, stopping any containers that were started.

A typical Docker container plugin configuration, starting the container before the integration test(s) run and stopping it afterwards, looks like:

<plugin>
    <groupId>io.fabric8</groupId>
    <artifactId>docker-maven-plugin</artifactId>
    <version>0.33.0</version>
    <executions>
        <execution>
            <id>start</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>start</goal>
            </goals>
        </execution>
        <execution>
            <id>stop</id>
            <phase>post-integration-test</phase>
            <goals>
                <goal>stop</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <images>
            <image>
                <external>
                    <type>properties</type>
                    <prefix>postgres.docker</prefix>
                </external>
            </image>
        </images>
    </configuration>
</plugin>
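With the external/properties image configuration above, the actual image details come from Maven properties carrying the configured prefix. A minimal sketch of such properties (the values here are hypothetical; see the plugin's external configuration docs for the full key list):

<properties>
    <postgres.docker.name>postgres:13-alpine</postgres.docker.name>
    <postgres.docker.alias>it-postgres</postgres.docker.alias>
    <postgres.docker.env.POSTGRES_DB>mydb</postgres.docker.env.POSTGRES_DB>
    <postgres.docker.env.POSTGRES_USER>myuser</postgres.docker.env.POSTGRES_USER>
    <postgres.docker.env.POSTGRES_PASSWORD>secret</postgres.docker.env.POSTGRES_PASSWORD>
    <postgres.docker.ports.1>5432:5432</postgres.docker.ports.1>
</properties>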



Saturday, May 07, 2022

Keep your Maven builds DRY - leverage placeholder feature in multi-module project for version . . .

Another Maven blog post in a row - it makes me feel like digging into Maven never ends ;). It's an XML world anyway, and any small feature change requires considerable effort to make it work.

I created a Maven multi-module Spring Boot micro-service application a couple of years ago, and it is a key service for the business. Every time there is a feature change or a new feature addition, I look for opportunities to upgrade the tech stack. Of course, Maven cannot be left behind. I keep upgrading the Maven wrapper and the tech stack, and I keep making the build scripts better by following the DRY programming principle.

Problem Context

The Spring Boot micro-service is a Maven multi-module build project with four sub-modules (lib, domain, etl, and api). The root module and all sub-modules have the semantic <version> tag specified. Due to some limitations I ran into earlier with an older Maven version, the semantic <version> tag value was repeated in every module, so every version change meant updating the value in all modules. Our CI environment tags every pull request merged into the master/main Git branch by appending a timestamp and git commit id to the semantic version specified in the build scripts, like <semantic-version>-<timestamp in YYYYMMddHHmm format>.<short-commitId> (e.g. 2.0.1-202205070824.174cd82), thus making every commit a release candidate. The semantic version we specify in the Maven builds, however, is what we decide to change based on the nature of the feature.

Environment: Java 17, Spring Boot 2.5.6, maven 3.8.5 on macOS Catalina 10.15.7

Starting with 3.5.0, Maven allows placeholders for versions. A property (e.g. my-version) can be defined in the root module, and the property can be used as the placeholder for the <version> tags in all modules, including the root module, like <version>${my-version}</version>. This feature alone might work for a single-module project, but it doesn't work for a multi-module Maven build; an extra plugin is needed to make it work. Otherwise the placeholders in sub-module dependencies are not resolved and replaced with their values, which causes errors, for instance when you run any Maven goal for a specific sub-module that has the root module specified in its <parent> block. Several blog posts and Stackoverflow question-answer references only talk about this feature with example XML snippets.

Maven Flatten Plugin

The missing important piece is the Maven Flatten Plugin; it makes the feature complete. The Maven documentation for this feature does mention it. But given today's fast-paced development and Maven's not-so-great documentation, developers rely more on Stackoverflow and other direct Google hits. I went through this too, but finally ended up reading the Maven documentation and running test trials to make it work.

Following are example pom.xml snippets of this feature.

Root module's pom.xml
<project ...>
    <groupId>com.giri.services</groupId>
    <artifactId>my-service</artifactId>
    <version>${my-service.version}</version>

    <modules>
        <module>my-lib</module>
        <module>my-service-domain</module>
        <module>my-service-etl</module>
        <module>my-service-api</module>
    </modules>

    <properties>
        <my-service.version>2.1.0-SNAPSHOT</my-service.version>
        ...
    </properties>

    <build>
        ...
        <plugins>
            ...
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>flatten-maven-plugin</artifactId>
                <version>1.2.7</version>
                <configuration>
                    <updatePomFile>true</updatePomFile>
                    <flattenMode>resolveCiFriendliesOnly</flattenMode>
                </configuration>
                <executions>
                    <execution>
                        <id>flatten</id>
                        <phase>process-resources</phase>
                        <goals>
                            <goal>flatten</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>flatten.clean</id>
                        <phase>clean</phase>
                        <goals>
                            <goal>clean</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

Sub-module's pom.xml
<project ...>
    ...
    <parent>
        <groupId>com.giri.services</groupId>
        <artifactId>my-service</artifactId>
        <version>${my-service.version}</version>
    </parent>

    <artifactId>my-service-domain</artifactId>
    <packaging>jar</packaging>
    ...
</project>

With this, the next time I want to bump up the version number, I only need to change the value in one pom.xml file at the root, unlike 5 files earlier. That makes it DRY.

TIPS

IntelliJ inline Error in sub-module's pom.xml file
IntelliJ IDEA still complains with an inline error saying "Properties in parent definition are prohibited" for the placeholder in the sub-module's <parent> block, even though you configure IDEA's Maven preferences to use your app's Maven wrapper, or a Maven installed on your system, that is newer than 3.5.0.

Simply ignore this. Your build works both inside IDEA or outside from command line.

Extra .flattened-pom.xml files generated
Also, notice that extra .flattened-pom.xml files are generated in the root and every sub-module folder. Just let them hang around there.

Avoid conflicting placeholder names
I wouldn't use ${revision} as the placeholder name for this feature, as the Maven documentation suggests, when I also have the release-candidate plugin in place; that plugin has the ${revision} placeholder name reserved for the git commit id.


Monday, May 02, 2022

Maven - running Java app . . .

When Maven is the build tool and you have no choice, you need to factor in an additional amount of development time for dealing with its build file, the pom.xml. Any new feature addition eats up an unexpected amount of time.

Environment: Java 17, maven 3.8.5 on macOS Catalina 10.15.7

Recently, I had to work on a task that involved writing a new Plain Old Java Application (POJA) with a main method. This application is a new addition to an existing family of half a dozen Java applications residing in a Maven module of a multi-module project. Each application is a kind of ETL app (Extract, Transform, and Load) with XML and Excel spreadsheets as data sources; the transformed output is a Flyway SQL script. Sounds simple! But not really!!

The existing application family has a strong contract with its parent class through Inheritance (the OO model) and Copy-and-paste (the ever-popular dev model) for sharing code, inheriting about 50 final and non-final static constants. Also, every application-specific run requires changing constant values and checking in the latest code changes - an old, messy way of maintaining apps. Joining the legacy family and following the inheritance model (to keep the family relationship), I at least wanted to 1) eliminate the copy-and-paste sharing model, and 2) add CLI support to pass in run-specific values for the constants.

In the Groovy world, no extra library is needed to give a Java app CLI support; Groovy comes with CliBuilder. For Java, googling found me Picocli. But Maven got in the way, giving me a bit of a hard time executing the application. After going through some Stackoverflow explanations and my own experiments, I learnt that there are two ways of running the app, both leveraging the Exec Maven Plugin.
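For context, a Picocli-based main class looks roughly like this - a minimal sketch where the class name matches the plugin configurations below, and the -y / -run-extra-checks options mirror the example arguments in TIP-1 (the real app's option set is different):

import picocli.CommandLine;
import picocli.CommandLine.Command;
import picocli.CommandLine.Option;

import java.util.concurrent.Callable;

@Command(name = "my-etl-app", mixinStandardHelpOptions = true,
         description = "Sample ETL runner (illustrative only).")
public class MyEtlAppPicoCli implements Callable<Integer> {

    @Option(names = "-y", description = "Target year for the run.")
    private int year;

    @Option(names = "-run-extra-checks", description = "Run extra validation checks.")
    private boolean runExtraChecks;

    @Override
    public Integer call() {
        // the real ETL work would go here
        System.out.printf("Running ETL for year %d (extra checks: %b)%n", year, runExtraChecks);
        return 0;
    }

    public static void main(String[] args) {
        // picocli parses args, populates the fields, and invokes call()
        System.exit(new CommandLine(new MyEtlAppPicoCli()).execute(args));
    }
}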

1. Configure the plugin - if there are multiple applications, multiple executions can be set up.

...
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>1.4.0</version>
            <executions>
                <!-- mvn compile exec:java@my-etl-app -->
                <execution>
                    <id>my-etl-app</id>
                    <goals>
                        <goal>java</goal>
                    </goals>
                    <configuration>
                        <mainClass>com.giri.etl.MyEtlAppPicoCli</mainClass>
                    </configuration>
                </execution>
                <!-- mvn compile exec:java@my-etl-app-new -->
                <execution>
                    <id>my-etl-app-new</id>
                    <goals>
                        <goal>java</goal>
                    </goals>
                    <configuration>
                        <mainClass>com.giri.etl.MyEtlAppNewPicoCli</mainClass>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

With this the application can be run by specifying the goal exec:java like:
mvn compile exec:java@my-etl-app
 
assuming that the code is compiled and the other dependent modules are built.

2. Configure the plugin in Profiles - if there are multiple applications, multiple profiles can be set up.

...
    <profiles>
        <!-- mvn exec:java -Pmy-etl-app -->
        <profile>
            <id>my-etl-app</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>exec-maven-plugin</artifactId>
                        <version>1.4.0</version>
                        <configuration>
                            <mainClass>com.giri.etl.MyEtlAppPicoCli</mainClass>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
        </profile>
        <!-- mvn exec:java -Pmy-etl-app-new -->
        <profile>
            <id>my-etl-app-new</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>exec-maven-plugin</artifactId>
                        <version>1.4.0</version>
                        <configuration>
                            <mainClass>com.giri.etl.MyEtlAppNewPicoCli</mainClass>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>
</project>

With this the application can be run by specifying the goal exec:java like:
mvn compile exec:java -Pmy-etl-app
assuming that the code is compiled and the other dependent modules are built.

TIP-1

There is also a standard way of specifying the main application class directly like:
mvn compile exec:java -Dexec.mainClass="com.giri.etl.MyEtlAppPicoCli" -Dexec.args="-y=2022"

To pass application arguments to the Java app when running the maven goal: exec:java, use -Dexec.args like: 
mvn compile exec:java -Pmy-etl-app -Dexec.args="-<arg1>=<value1> -<arg2>=<value2>"

E.g. 
 mvn exec:java -Pmy-etl-app -Dexec.args="-y=2022 -run-extra-checks=false"
assuming that -y and -run-extra-checks are CLI arguments specific to my app, added by leveraging the Picocli framework.

TIP-2

Java Preview Feature maven support (both compile-time, and run-time)

Java preview features are disabled by default. If any preview features are used in the code, they must be enabled explicitly by passing the --enable-preview command-line option to both the compiler and the runtime.
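Outside of Maven, that translates to passing the flag to both javac and java, roughly like this (PreviewDemo is a hypothetical class; javac additionally requires --release or --source alongside --enable-preview):

 javac --enable-preview --release 17 PreviewDemo.java
 java --enable-preview PreviewDemo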

Compile-time
For compile-time support in maven builds, the maven-compiler-plugin must be configured to pass the compiler flag as shown below:

<build>
    <plugins>
        ...
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.7.0</version>
            <configuration>
                <release>${jdk.version}</release>
                <source>${jdk.version}</source>
                <target>${jdk.version}</target>
                <compilerArgs>
                    <arg>-Xlint:all</arg>
                    <arg>--enable-preview</arg>
                </compilerArgs>
            </configuration>
        </plugin>
    </plugins>
</build>

Run-time
With the Exec Maven Plugin used to run the Java application, preview features must be enabled at run time as well; the maven-compiler-plugin configuration above is only good for compilation. Since exec:java runs the application inside the same JVM as Maven itself, a jvm.config file needs to exist in the .mvn folder of the project you are running the application from.
The content of this file should be just:
--enable-preview
