The JsonPointer. Simplifying the way to work with JSON

Since version 1.1 of the Java API for JSON Processing (JSR 374), it is possible to use JsonPointer.

JsonPointer implements RFC 6901 and, as we can read there, "JSON Pointer defines a string syntax for identifying a specific value
within a JavaScript Object Notation (JSON) document."

In other words, it is now possible to evaluate and change values of our JsonObjects using a pointer string instead of going through the whole chain of calls and recreating an object builder at the end.

So instead of that:

    String nameWithObject = jsonObject.getJsonArray("user_mentions").getJsonObject(0).getString("name");

we can do that:

    String nameWithPointer = ((JsonString)Json.createPointer("/user_mentions/0/name").getValue(jsonObject)).getString();

We can easily see that the use of pointers makes it easier to know which element we are fetching, and more intuitive to write.
However, since the pointer returns a JsonValue, we need a cast to be able to fetch the final value.

Why JsonPointer does not provide methods to directly return Java types, as JsonObject does, is something I do not really know.

So, what can we do with the JSON pointer?
We can not only get values from a JsonStructure using the pointer notation, but also modify the object without the need to convert it back into its builder equivalent. Specifically, we can:

  • add a value to a JsonStructure
  • check if a value is contained in a JsonStructure
  • remove a value from a JsonStructure
  • replace a value in a JsonStructure

Let’s see some examples. For them, I will use the JSON object shown at the end of this article as a basis:

  • Get a simple value from an object.
    JsonNumber id = ((JsonNumber) Json.createPointer("/id").getValue(example));
  • Get an object from an object.
        JsonObject user = Json.createPointer("/user").getValue(example).asJsonObject();
  • Get an array from an object.
        JsonArray userMentions = Json.createPointer("/user_mentions").getValue(example).asJsonArray();
  • Get an element from an array.
        JsonObject mention = Json.createPointer("/user_mentions/0").getValue(example).asJsonObject();
        String mentionName = ((JsonString) Json.createPointer("/user_mentions/0/name").getValue(example)).getString();
        int mentionIndex1 = ((JsonNumber) Json.createPointer("/user_mentions/0/indices/1").getValue(example)).intValue();
  • Check if an object contains an element.
        boolean hasId = Json.createPointer("/id").containsValue(example);
  • Add a simple value
           JsonObject extendedExample = Json.createPointer("/timestamp").add(example, Json.createValue(System.currentTimeMillis()));
  • Add an element to a JsonArray. The pointer must point to the index last_element + 1; skipping indices would produce an error.
        extendedExample = Json.createPointer("/user_mentions/0/indices/2").add(extendedExample, Json.createValue(30));
        Assertions.assertEquals(30, ((JsonNumber) Json.createPointer("/user_mentions/0/indices/2").getValue(extendedExample)).intValue());
  • Replace elements
        example = Json.createPointer("/id").replace(example, Json.createValue(2));
        Assertions.assertEquals(2, example.getInt("id"));
        example = Json.createPointer("/user_mentions/0/indices/1").replace(example, Json.createValue(9999));
        Assertions.assertEquals(9999, ((JsonNumber) Json.createPointer("/user_mentions/0/indices/1").getValue(example)).intValue());
  • Remove elements
        example = Json.createPointer("/id").remove(example);
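To see these operations working end to end, here is a minimal, self-contained sketch. It assumes a JSON-P 1.1 implementation (e.g. the Glassfish reference implementation) on the classpath, and the mention indices used here are illustrative:

```java
import javax.json.Json;
import javax.json.JsonNumber;
import javax.json.JsonObject;
import javax.json.JsonString;

public class JsonPointerDemo {

  // A reduced version of the example object, built with the standard builders.
  static JsonObject example() {
    return Json.createObjectBuilder()
        .add("id", 1)
        .add("name", "some-name")
        .add("user_mentions", Json.createArrayBuilder()
            .add(Json.createObjectBuilder()
                .add("name", "Twitter API")
                .add("indices", Json.createArrayBuilder().add(4).add(15))))
        .build();
  }

  public static void main(String[] args) {
    JsonObject example = example();

    // Get: navigate with a single pointer string.
    String mentionName = ((JsonString) Json.createPointer("/user_mentions/0/name").getValue(example)).getString();
    System.out.println(mentionName); // Twitter API

    // Contains: check whether a value exists at the pointed location.
    System.out.println(Json.createPointer("/id").containsValue(example)); // true

    // Add, replace and remove all return a NEW JsonObject: the original is immutable.
    JsonObject replaced = Json.createPointer("/id").replace(example, Json.createValue(2));
    System.out.println(((JsonNumber) Json.createPointer("/id").getValue(replaced)).intValue()); // 2

    JsonObject removed = Json.createPointer("/name").remove(example);
    System.out.println(removed.containsKey("name")); // false
    System.out.println(example.containsKey("name")); // true, original untouched
  }
}
```

Note that every modifying operation returns a new JsonStructure, which is why the results must be reassigned.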

    JSON used for the examples:

"id": 1,
"name": "some-name",
"lastname": "some-lastname"
"user_mentions": [
"name": "Twitter API",
"indices": [
"screen_name": "twitterapi",
"id": 6253282,
"id_str": "6253282"

In another article I will speak about JsonPatch, which its RFC (RFC 6902) defines as:

JSON Patch defines a JSON document structure for expressing a sequence of operations to apply to a JavaScript Object Notation (JSON) document; it is suitable for use with the HTTP PATCH method. The "application/json-patch+json" media type is used to identify such patch documents.

Source code on GitHub

Create your own Mockito ArgumentMatcher

When testing, you usually need to mock. I mock mostly using Mockito and, usually, I stub using when and verify calls using verify.

Normally, you will want to verify that a given method has been called with specific parameters, or that a mocked method returns the desired value only when called with specific parameters.

In the following class, two articles are the same if their ids are the same. We can, however, be interested in matching an article by its contents (title and text) instead of its database id.

public class Article {

  private int id;
  private String title;
  private String text;

  public Article(int id, String title, String text) { = id;
    this.title = title;
    this.text = text;
  }

  public int getId() {
    return id;
  }

  public String getTitle() {
    return title;
  }

  public String getText() {
    return text;
  }

  @Override
  public boolean equals(Object o) {
    if (this == o) return true;
    if (o == null || getClass() != o.getClass()) return false;
    Article article = (Article) o;
    return id ==;
  }

  @Override
  public int hashCode() {
    return Objects.hash(id);
  }
}

Usually, it is enough to use the Mockito.eq() ArgumentMatcher, which relies on the Object.equals() method.
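A minimal, self-contained sketch of that default behavior (the Article here is a condensed copy of the class above; Publication is a hypothetical collaborator of which only addArticle is assumed):

```java
import java.util.Objects;

import org.mockito.Mockito;

public class EqMatcherDemo {

  // Condensed copy of the Article class: equality is based on the id only.
  static class Article {
    final int id;
    final String title;
    final String text;

    Article(int id, String title, String text) { = id;
      this.title = title;
      this.text = text;
    }

    @Override
    public boolean equals(Object o) {
      if (this == o) return true;
      if (o == null || getClass() != o.getClass()) return false;
      return id == ((Article) o).id;
    }

    @Override
    public int hashCode() {
      return Objects.hash(id);
    }
  }

  // Hypothetical collaborator; only addArticle is assumed here.
  static class Publication {
    void addArticle(Article article) {
    }
  }

  public static void main(String[] args) {
    Publication publication = Mockito.mock(Publication.class);
    publication.addArticle(new Article(1, "a title", "a text"));

    // Passes although title and text differ, because eq() uses equals(),
    // which only compares the ids.
    Mockito.verify(publication).addArticle(Mockito.eq(new Article(1, "other title", "other text")));
    System.out.println("matched by id");
  }
}
```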


However, if we want to match against its contents, we need a new matcher, ArticleMatcher, that compares title and text. Therefore, I will just create a class ArticleMatcher implementing ArgumentMatcher&lt;Article&gt;:

class ArticleMatcher implements ArgumentMatcher<Article> {

  public final Article article;

  public ArticleMatcher(Article article) {
    this.article = article;
  }

  /**
   * Registers our matcher (mockingProgress() is statically imported from
   * org.mockito.internal.progress.ThreadSafeMockingProgress).
   */
  public static Article eq(Article article) {
    mockingProgress().getArgumentMatcherStorage().reportMatcher(new ArticleMatcher(article));
    return null;
  }

  /**
   * Implements the matches method with our matching logic.
   */
  @Override
  public boolean matches(Article article) {
    return this.article.getText().equalsIgnoreCase(article.getText());
  }

  @Override
  public String toString() {
    return "<ArticleMatcher>";
  }

Now we can use our ArgumentMatcher to create stubs and verify calls:

class ArticleTest {

  @Test
  void checkVerify() {

    Article article1 = new Article(1, "someText", "title");
    Article article2 = new Article(2, "someText", "title");
    Article article3 = new Article(3, "someText", "title");

    Publication publication = Mockito.spy(Publication.class);

    Mockito.verify(publication, Mockito.times(0)).addArticle(Mockito.eq(article3));
  }

  @Test
  void checkStub() {

    Article article1 = new Article(1, "someText", "title");
    Article article2 = new Article(2, "someText", "title");
    Article article3 = new Article(3, "someText", "title");

    Publication publication = Mockito.spy(Publication.class);
    Mockito.when(publication.getArticlesLike(ArticleMatcher.eq(article2))).thenReturn(Arrays.asList(article1, article2));

    List<Article> articles = publication.getArticlesLike(article3);
    Assertions.assertEquals(2, articles.size());
  }
}

So easy 🙂

Code on GitHub

Java CLI!


When writing DevOps scripts, I lose a lot of time writing the logic in bash. Bash has a lot of powerful commands and functions. However, the orchestration of all these commands (return checks, functions, debugging, …) is extremely hard using pure shell script.


Solution: JDK 11!

Since JDK 11, it is possible to run Java code without any compilation. That opens the possibility to use Java files as CLI scripts. It is a solution similar to using JavaScript with Nashorn in order to use the full power of Java.

So my proposal is to orchestrate the script using java while executing bash commands.

This approach has two issues to solve:

  1. Be able to easily pass run parameters: CREST from Tomitribe!
  2. Execute shell commands and get the output.

Crest, as defined by Tomitribe, is a "Command-line API styled after JAX-RS". For example, the declaration of an "ls" command would look like:

  public void ls(@Option("a") boolean all, @Option("h") boolean human, @Option("l") boolean list, URI path) {
  }

Crest has a lot of options, so I suggest you take a look at its page.

Now we are able to create a Java command easily. The implementation could be Java-based, or we might want to call some other bash commands.
That can be done using plain Java as follows:

public CompletableFuture<Process> executeCommand(String command, Consumer<String> stdOutConsumer, Consumer<String> errConsumer) {

    Process p;

    try {
      p = Runtime.getRuntime().exec(command);
      run(p::getInputStream, stdOutConsumer);
      run(p::getErrorStream, errConsumer);
      return p.onExit();
    } catch (IOException e) {
      throw new RuntimeException(e);
    }
}

private void run(Supplier<InputStream> streamSupplier, Consumer<String> streamConsumer) {
    try (InputStream stream = streamSupplier.get()) {
      StringBuilder currentLine = new StringBuilder();
      int nextChar;
      while ((nextChar = != -1) {
        if (nextChar == '\n') {
          // A full line is ready: pass it to the consumer and start a new one.
          streamConsumer.accept(currentLine.toString());
          currentLine.setLength(0);
        } else {
          currentLine.append((char) nextChar);
        }
      }
    } catch (IOException e) {
      throw new RuntimeException(e);
    }
}
Here, I am passing the standard output text to the consumers line by line.
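The same line-splitting idea can be sketched more compactly with ProcessBuilder and BufferedReader, which already splits the stream into lines. This is a minimal, self-contained alternative, assuming a unix-like system for the ls command:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.function.Consumer;

public class ShellDemo {

  // Starts the command, forwards each stdout line to the consumer
  // (stderr is merged into stdout) and returns the exit code.
  static int run(String[] command, Consumer<String> stdOut) throws IOException, InterruptedException {
    Process p = new ProcessBuilder(command).redirectErrorStream(true).start();
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
      reader.lines().forEach(stdOut);
    }
    return p.waitFor();
  }

  public static void main(String[] args) throws IOException, InterruptedException {
    int exit = run(new String[]{"ls", "-la"}, line -> System.out.println("[out] " + line));
    System.out.println("exit code: " + exit);
  }
}
```

Passing the command as a String array avoids the whitespace-splitting surprises of Runtime.exec(String).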

To run everything together without the need to compile, you must add the Crest dependencies to the classpath:

java -cp /path/to/tomitribe-crest-api-0.10.jar:/path/to/tomitribe-crest-0.10.jar:/path/to/tomitribe-util-1.0.0.jar \
  /path/to/blogs-posts-code/cli/src/main/java/tech/lacambra/blog/cli/ ls -la .

One useful use case is small DB backup scripts. I use Java to check, create and delete folders and to create proper logs, while using the native mysqldump command to execute the DB backup.

You can get the full code on GitHub

Automatic update of client static files without redeploying the WAR file

When creating web applications in Node.js, a bunch of nice features is available. One of my favorites is that, when a change has been performed, the change is immediately available in the browser.

In the case of Java EE, we need to redeploy each time, an operation that with my MacBook takes about 2 seconds and with my corporate laptop about 10 seconds. Even if not a complete disaster, I think we can agree that it is not really optimal.

So what I would like is that, once I have performed a change on a static file (js, html, css), this change is also immediately visible in the browser using my standard application server (I have tested it with WildFly).

Here is how we can achieve it:

1. Create an exploded WAR.

2. Compile the project to create the target folder with the deployed sources.

3. Using the onchange tool, synchronize the static files folder under webapp with the location where your IDE is copying the exploded files. This command looks for changes under a given directory and then executes any passed command. In our case, we just copy all the static files to the exploded target directory.

onchange 'path/to/watched/src/' -- cp -Rf path/to/watched/src/ /path/to/exploded/war/files

4. Begin coding.

With these simple steps, we will get our browser in sync with our code. However, we still need to press F5 to update the browser.

State machines and Bean Validation. A good fit for business object flows.

Lately, I have been involved in several projects following the same pattern.

One or more business objects with a state will change their state after receiving some external event.

When the objects are in a given state, different validation rules apply.

This simple description too often ends up in a mess of "if", "else" and "switch" blocks, spaghetti code and so on, which makes readability, maintainability and testability extremely hard.

Moreover, if the number of states and validations is high enough, the software becomes a side-effects nightmare.

So, in this article, I will try to explain a simple approach that helps to organize state-specific code in a more efficient way, increasing readability, maintainability, extensibility and testability and reducing side-effects, without the need of any big implementation logic or external tools/libs/platforms.

Bean Validation API. What is it?

The Bean Validation API is a Java EE specification (JSR 380) that makes it easy to validate objects and their fields.

It uses annotations to specify what must be validated and how. Once the validation happens, we have a list of errors available that gives us all the needed information about what has failed.

Interestingly, we can use groups. So we do not need to validate all fields at the same time; instead, we have a way to specify which fields must be validated.
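A minimal sketch of how groups work. The Post class and the Draft/Published group interfaces are hypothetical, and it assumes a Bean Validation implementation (such as Hibernate Validator) on the classpath:

```java
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.constraints.NotEmpty;
import javax.validation.constraints.NotNull;

public class GroupsDemo {

  // Hypothetical marker interfaces used as validation groups.
  interface Draft {
  }

  interface Published {
  }

  static class Post {

    @NotNull(groups = Draft.class)
    String title;

    @NotEmpty(groups = Published.class)
    String text;

    Post(String title, String text) {
      this.title = title;
      this.text = text;
    }
  }

  public static void main(String[] args) {
    Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
    Post post = new Post("a title", "");

    // Only the Draft rules run here, so the empty text is not reported...
    Set<ConstraintViolation<Post>> draftViolations = validator.validate(post, Draft.class);
    System.out.println(draftViolations.size()); // 0

    // ...while validating the Published group reports the @NotEmpty violation.
    Set<ConstraintViolation<Post>> publishedViolations = validator.validate(post, Published.class);
    System.out.println(publishedViolations.size()); // 1
  }
}
```

The groups passed to validate() select which constraints run; constraints without an explicit group belong to the Default group.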

For example, given the class Item:

public class Item {

  @NotNull
  private Integer id;

  @NotEmpty
  private String name;

  @DecimalMin("1")
  private BigDecimal price;

  public Item(Integer id, String name, BigDecimal price) { = id; = name;
    this.price = price;
  }
}

Here we are validating that the id is not null, that the attribute name cannot be an empty String and that the price must be at least one. Then, to test that it works, we just need to run the following code:

class ValidationTest {

  private ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
  private Validator validator;

  @BeforeEach
  void setUp() {
    validator = factory.getValidator();
  }

  @Test
  void validateSingleItem() {

    Item item = new Item(1, "MacbookPro", BigDecimal.ONE);
    Set<ConstraintViolation<Item>> violations = validator.validate(item);
    Assertions.assertTrue(violations.isEmpty());

    item = new Item(1, "", BigDecimal.ZERO);
    violations = validator.validate(item);
    Assertions.assertEquals(2, violations.size());
  }
}

In this case, I am using the Hibernate implementation of Bean Validation.

State machine. What is it and why should you use it?

In our context, I define a state machine as the representation of the states an object or flow can be in, plus the transitions that allow going from one state to another. The referenced object or flow can be in one and only one state at a time.

The match with the Validation API:

It is in this context that the Validation API comes into play. Using a state machine, it is trivial to integrate business validation into it. The only thing you need to do is to associate each state with one or more validation groups. Then, when the object enters a state, the state machine can apply the validation with the groups of the entered state.

Using this design, we achieve two goals:

  • We define, and make transparent, which validation rules are applied at each moment in a semantic way, without any need to understand the validation logic itself.
  • Describing the states and transitions makes it really transparent what the object flow looks like.

Let’s see the most simple example, where we have some business object or POJO with a state. The state flow of our business object can be represented using a StateMachine. In this example, the current state of the state machine is given by the object itself, but in more complex scenarios a state can represent the state and relations of several objects.
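The snippets below rely on a StateObject abstraction for "a business object with a state". A minimal sketch of what that contract might look like (names are assumed from the snippets; the real interface may differ):

```java
public class StateObjectDemo {

  // Minimal sketch (assumed names) of the contract the StateMachine relies on:
  // it only needs to read and write the current state of the business object.
  interface StateObject {

    Object getState();

    void setState(Object state);
  }

  // An Order only has to carry its current state name.
  static class Order implements StateObject {

    private Object state = "INIT";

    @Override
    public Object getState() {
      return state;
    }

    @Override
    public void setState(Object state) {
      this.state = state;
    }
  }

  public static void main(String[] args) {
    StateObject order = new Order();
    order.setState("BOOKED"); // done by the state machine on a transition
    System.out.println(order.getState()); // BOOKED
  }
}
```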

To model the object flow, we are going to code a TransitionBuilder class. Using the builder, we will describe transitions from a source state to a target state, triggered by an event.

 public class TransitionBuilder {
    private State source;
    private State target;
    private Object event;

    private TransitionBuilder() {
    }

    public TransitionBuilder fromState(State source) {
      this.source = source;
      return this;
    }

    public TransitionBuilder goToState(State target) { = target;
      return this;
    }

    public TransitionBuilder onEvent(Object event) {
      this.event = event;
      return this;
    }

    public TransitionBuilder addAndBeginTransition() {
      Transition transition = new Transition(source, target, event);
      // Register the transition in the enclosing StateMachineBuilder.
      addTransition(transition);
      return new TransitionBuilder();
    }

    public StateMachine done() {
      Transition transition = new Transition(source, target, event);
      addTransition(transition);
      return StateMachineBuilder.this.done();
    }
  }

Now we can model our simple object flow. As an example, we are modeling a really simple order object. An order has the states INIT, (item) BOOKED, (item) DISPATCHED, ON_TRACK and DELIVERED.
Now, using the builder above, we just need to create our state model:

public StateMachine create() {

    InitState initState = new InitState();
    BookedState bookedState = new BookedState();
    DispatchedState dispatchedState = new DispatchedState();
    OnTrackState onTrackState = new OnTrackState();
    DeliveredState deliveredState = new DeliveredState();

    return new StateMachineBuilder()
        .beginTransition()
        .fromState(initState).goToState(bookedState).onEvent(Event.START_ORDER)
        .addAndBeginTransition()
        .fromState(bookedState).goToState(dispatchedState).onEvent(Event.DISPATCH)
        .addAndBeginTransition()
        .fromState(dispatchedState).goToState(onTrackState).onEvent(Event.SEND)
        .addAndBeginTransition()
        .fromState(onTrackState).goToState(deliveredState).onEvent(Event.DELIVER)
        .done();
}
Using the Bean Validation API, it is now possible to assign one or more validation groups to each state and, with a little bit of simple logic, trigger on each transition the validation with the groups of the target state.

The next code illustrates the workflow:

@Test
void testStateMachine() {

  StateMachine stateMachine = new OrderStateMachineFactory().create();
  Order order = new Order();

  Optional<ConstraintViolationException> r = stateMachine.trigger(Event.START_ORDER, order);
  System.out.println("1:" + r.get().getMessage());

  r = stateMachine.trigger(Event.START_ORDER, order);
  assertEquals(new BookedState().getName(), order.getState());

  r = stateMachine.trigger(Event.DISPATCH, order);
  System.out.println("2:" + r.get().getMessage());

  order.setAddress("Major Str. PLZ 122 Berlin");

  r = stateMachine.trigger(Event.DISPATCH, order);
  assertEquals(new DispatchedState().getName(), order.getState());

  r = stateMachine.trigger(Event.SEND, order);
  assertEquals(new OnTrackState().getName(), order.getState());

  r = stateMachine.trigger(Event.DELIVER, order);
  assertEquals(new DeliveredState().getName(), order.getState());
}

The state machine itself is a simple class implementing a trigger method that, given an event and an object with a state (the current state of the StateMachine), just looks for the matching transition and fires it.
The state object then triggers the validation and updates the state of the object:

public class StateMachine {

  public List<Transition> transitions;

  public StateMachine(List<Transition> transitions) {
    this.transitions = new ArrayList<>(transitions);
  }

  public Optional<ConstraintViolationException> trigger(Object event, StateObject stateObject) {

    Object state = stateObject.getState();

    Optional<ConstraintViolationException> r = transitions
        .stream()
        .filter(t -> t.getEvent().equals(event))
        .filter(t -> t.getSource().getName().equals(stateObject.getState()))
        .findFirst()
        .orElseThrow(() -> new InvalidTransitionException(event, stateObject.getState()))
        .getTarget()
        .onState(stateObject);

    //Simulates a roll-back in case of error
    r.ifPresent(ex -> stateObject.setState(state));

    return r;
  }
}

Each State implements the onState method, which triggers the validation with its groups:

  public Optional<ConstraintViolationException> onState(StateObject stateObject) {
    enterState((Order) stateObject);
    Set<ConstraintViolation<StateObject>> violations = validator.validate(stateObject, getValidationGroups());
    if (!violations.isEmpty()) {
      return Optional.of(new ConstraintViolationException("Violations on state " + getName() + ". " + toString(violations), violations));
    }
    return Optional.empty();
  }
You can find the code of this article on GitHub

Extrapolate the "run" command of a docker container

If you want to get a docker run command that emulates a running container, you can use assaflavie/runlike:
docker run --rm -v /var/run/docker.sock:/var/run/docker.sock assaflavie/runlike YOUR-CONTAINER

As an output, you get a docker run command containing all required parameters to get the same result as "YOUR-CONTAINER".

E.g. to get the run command of a MariaDB container:
docker run --rm -v /var/run/docker.sock:/var/run/docker.sock assaflavie/runlike mariadb
and you get:
docker run \
-p 33333:3306 \
--detach=true mariadb:latest mysqld

Avoid “permission denied” errors when using docker in docker

When you get the following message:

Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get http://%2Fvar%2Frun%2Fdocker.sock/v1.38/images/json: dial unix /var/run/docker.sock: connect: permission denied

you need to ensure that the user running in the container has the same uid and gid as a user on the host with access to docker. That means that, inside the container, the user must belong to the docker group of the host.

You can use the docker parameter --group-add to enforce that the user running in docker is also added to the given GID:

docker run -d -v /var/run/docker.sock:/var/run/docker.sock -p --group-add 999 ...

Automatic update of js and html files while developing WAR applications

To automatically update the client contents of a WAR app without redeploying, you just need to copy the changed files into the correct exploded WAR location.

To automatically perform the copy, just use:

onchange 'path/to/watched/src/' -- cp -Rf path/to/watched/src/ /path/to/exploded/war/files

You can install onchange with npm:

npm -g install onchange