Object Mapping advanced features & QoL with Java

This is the Java version of the Kotlin article.

When working with multi-layered applications, external libraries, a legacy code base or external APIs, we are often required to map between different objects or data structures.

In this tutorial, we will check out some advanced features of object mapping libraries that simplify this task while saving development and maintenance time.

In our examples we will use the library ShapeShift. It's a lightweight object mapping library for Java/Kotlin with lots of cool features.

Auto Mapping

We will start with a bang. Auto mapping can and will save you lots of time and boilerplate code. Some applications require manual mapping between objects, but most can skip tons of boring boilerplate by using this one feature alone. And it gets even better: with ShapeShift's default transformers we can even map automatically between different data types.

Simple Mapping

Let's start with a simple example of auto mapping. We have our two objects; imagine they could also have tens, hundreds, or even thousands (for the crazy people here) of fields.

public class User {
    private String id;
    private String name;
    private String email;
    private String phone;
    // Getters, Setters, Constructors, Equals...
}

public class UserDTO {
    private String id;
    private String name;
    private String email;
    private String phone;
    // Getters, Setters, Constructors, Equals...
}

We want to map all the fields from User to UserDTO. Using auto mapping we don't need to write any boilerplate code. The mapping definition looks as follows:

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(User.class, UserDTO.class)
        .autoMap(AutoMappingStrategy.BY_NAME_AND_TYPE)
        .build();

Voila! All the fields will be mapped automatically without any manual boilerplate code.
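
For completeness, here is roughly how the definition would be wired into a ShapeShift instance and used. This is a sketch: the builder and mapping calls below (withMapping, map) follow ShapeShift's documented API, and the User constructor is assumed to exist, so check the documentation for your version.

ShapeShift shapeShift = new ShapeShiftBuilder()
        .withMapping(mappingDefinition) // register the definition we built above
        .build();

User user = new User("1", "Jane Doe", "jane@example.com", "555-0100"); // assumes a full constructor
UserDTO userDTO = shapeShift.map(user, UserDTO.class); // every matching field is copied automatically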

Advanced Mapping

In this example we will use the power of default transformers to take auto mapping even further.

public class User {
    private String id;
    private String name;
    private Date birthDate;
    // Getters, Setters, Constructors, Equals...
}

public class UserDTO {
    private String id;
    private String fullName;
    private Long birthDate;
    // Getters, Setters, Constructors, Equals...
}

Note that the types of the birthDate field are different in the source and destination classes. But using the power of default transformers we can still use auto mapping here.

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(User.class, UserDTO.class)
        .autoMap(AutoMappingStrategy.BY_NAME)
        .build();

We changed the auto mapping strategy to BY_NAME so that fields are matched by name even when their types differ. Now we need to register a default transformer on the ShapeShift instance so it knows how to transform Date to Long.

ShapeShift shapeShift = new ShapeShiftBuilder()
        .withTransformer(
                Date.class, 
                Long.class, 
                new DateToLongMappingTransformer(), 
                true // Default Transformer
        )
        .build();

We can also add manual mapping on top of the auto mapping in order to add or change behavior. The source and destination classes use different names for the name field, so we will add a manual mapping for it.

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(User.class, UserDTO.class)
        .autoMap(AutoMappingStrategy.BY_NAME)
        .mapField("name", "fullName")
        .build();

Auto mapping is great for use cases that do not require specific mapping behavior. It reduces the amount of manual boilerplate code needed to configure mappings and also helps you keep your sanity.

Transformers

Transformers are a very useful feature that allows you to transform the type and/or value of a field into a different type or value when mapping it.

Some use cases we have been using widely:

  • Transform a date to a long and vice versa between server and client objects.
  • Transform a JSON string to its actual type and vice versa between server and client objects.
  • Transform a comma-separated string to a list of enums (see the sketch after this list).
  • Transform another object's ID to the object itself, or to one of its fields, by querying the DB using Spring transformers.
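
As an illustration of the comma-separated string use case referenced above, here is a minimal sketch. The Role enum is hypothetical and only exists for this example; the MappingTransformer contract is the same one used throughout this article (it relies on java.util.Arrays, java.util.List and java.util.stream.Collectors).

public enum Role { ADMIN, EDITOR, VIEWER } // hypothetical enum, just for the example

public class StringToRoleListMappingTransformer implements MappingTransformer<String, List<Role>> {
    @Nullable
    @Override
    public List<Role> transform(@NonNull MappingTransformerContext<? extends String> context) {
        if (context.getOriginalValue() == null) {
            return null;
        }
        // "ADMIN,EDITOR" -> [Role.ADMIN, Role.EDITOR]
        return Arrays.stream(context.getOriginalValue().split(","))
                .map(String::trim)
                .map(Role::valueOf)
                .collect(Collectors.toList());
    }
}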

Basic Transformers

We will start with a simple transformer example. Date-to-Long and Long-to-Date transformers:

public class DateToLongMappingTransformer implements MappingTransformer<Date, Long> {
    @Nullable
    @Override
    public Long transform(@NonNull MappingTransformerContext<? extends Date> context) {
        return context.getOriginalValue() != null ? context.getOriginalValue().getTime() : null;
    }
}

public class LongToDateMappingTransformer implements MappingTransformer<Long, Date> {
    @Nullable
    @Override
    public Date transform(@NonNull MappingTransformerContext<? extends Long> context) {
        return context.getOriginalValue() != null ? new Date(context.getOriginalValue()) : null;
    }
}

All we need to do now is register them.

ShapeShift shapeShift = new ShapeShiftBuilder()
        .withTransformer(Date.class, Long.class, new DateToLongMappingTransformer(), true) // the final "true" registers this as a default transformer, more on that later
        .withTransformer(Long.class, Date.class, new LongToDateMappingTransformer(), true)
        .build();

That's it! We can now use the transformers when mapping objects.

public class User {
    private String id;
    private String name;
    private Date birthDate;
    // Getters, Setters, Constructors, Equals...
}

public class UserDTO {
    private String id;
    private String name;
    private Long birthDate;
    // Getters, Setters, Constructors, Equals...
}

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(User.class, UserDTO.class)
        .mapField("id", "id")
        .mapField("name", "name")
        .mapField("birthDate", "birthDate").withTransformer(DateToLongMappingTransformer.class) // We don't have to state the transformer here because it is a default transformer
        .build();

Inline Transformers

In some use cases we want to transform a value but don't need a reusable transformer, and we don't want to create a class just for one-time use.

Inline transformers to the rescue! They allow you to transform the value without creating and registering a transformer class.

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(Source.class, Target.class)
        .mapField("birthDate", "birthYear").withTransformer(context -> context.getOriginalValue() != null
                ? ((Date) context.getOriginalValue()).toInstant().atZone(ZoneId.systemDefault()).getYear() // avoids the deprecated Date.getYear(), which returns the year minus 1900
                : null)
        .build();

Advanced Transformers

Transformers also allow us to perform transformations that involve the DB or other data sources.

In this example we will use the power of the Spring Boot integration to create transformers with DB access.

We have three models:

  • Job - DB entity.
  • User - DB entity.
  • UserDTO - Client model.

public class Job {
    private String id;
    private String name;
    // Getters, Setters, Constructors, Equals...
}

public class User {
    private String id;
    private String jobId;
    // Getters, Setters, Constructors, Equals...
}

public class UserDTO {
    private String id;
    private String jobName;
    // Getters, Setters, Constructors, Equals...
}

We want to convert the jobId on User to jobName on UserDTO by querying the job from the DB and setting it on the DTO.

In Spring, you generally avoid interacting with the application context from static functions or from functions on domain objects, which makes it awkward to reach beans such as a DAO from plain mapping code.

We will use ShapeShift's Spring integration to register the transformer as a component so it can access our DAO bean.

@Component
public class JobIdToNameTransformer implements MappingTransformer<String, String> {

    @Autowired
    private JobDao jobDao;

    @Nullable
    @Override
    public String transform(@NonNull MappingTransformerContext<? extends String> context) {
        if (context.getOriginalValue() == null) {
            return null;
        }
        Job job = jobDao.findJobById(context.getOriginalValue());
        return job.getName();
    }
}

All that's left to do is to use this transformer in our mapping.

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(User.class, UserDTO.class)
        .mapField("id", "id")
        .mapField("jobId", "jobName").withTransformer(JobIdToNameTransformer.class)
        .build();

Another bonus of using transformers is their reusability. In some use cases we can create more generic transformers that have application-wide usage, as sketched below.
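
As a minimal sketch of such an application-wide transformer, assume we want every mapped String trimmed. The class below is hypothetical, and registering it as a default transformer means (per the default transformer behavior described in the next section) it applies to any String-to-String field mapping that does not specify another transformer.

public class TrimStringMappingTransformer implements MappingTransformer<String, String> {
    @Nullable
    @Override
    public String transform(@NonNull MappingTransformerContext<? extends String> context) {
        return context.getOriginalValue() != null ? context.getOriginalValue().trim() : null;
    }
}

ShapeShift shapeShift = new ShapeShiftBuilder()
        .withTransformer(String.class, String.class, new TrimStringMappingTransformer(), true) // default, so it applies application-wide
        .build();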

Default Transformers

When registering transformers you can indicate whether a transformer is a default transformer. A default transformer registered for a given source and destination type pair (for example, Date to Long) is used whenever you map a field of that source type to a field of that destination type without specifying a transformer.

As we have already seen, default transformers are useful for recurring transformations, and especially for auto mapping.

Deep Mapping

What if we want to map from/to fields that live inside a field which is itself an object? We can do that too, easily.

In order to access nested objects we can use the full path of a field. Let's look at the following example.

public class From {

    private Child child = new Child();
    // Getters, Setters, Constructors, Equals...

    class Child {
        private String value;
        // Getters, Setters, Constructors, Equals...
    }
}

public class To {
    private String childValue;
    // Getters, Setters, Constructors, Equals...
}

We want to map the value field of the Child class inside From to the childValue field in To. We will use the full path of value, which is child.value.

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(From.class, To.class)
        .mapField("child.value", "childValue")
        .build();

Full paths are supported in both source and destination fields, and they also support multiple levels of depth (e.g. x.y.z).
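
For example, a deeper path works the same way. The classes and field names below are hypothetical:

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(Employee.class, EmployeeDTO.class)
        .mapField("company.address.city", "companyCity") // three levels deep on the source side
        .build();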

Conditional Mapping

Conditions allow us to add a predicate to a specific field mapping to determine whether this field should be mapped.

Using this feature is as easy as creating a condition.

public class NotBlankStringCondition implements MappingCondition<String> {
    @Override
    public boolean isValid(@NonNull MappingConditionContext<String> context) {
        return context.getOriginalValue() != null && !context.getOriginalValue().trim().isEmpty();
    }
}

And adding the condition to the desired field mapping.

public class SimpleEntity {
    private String name;
    // Getters, Setters, Constructors, Equals...
}

public class SimpleEntityDisplay {
    private String name = "";
    // Getters, Setters, Constructors, Equals...
}

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(SimpleEntity.class, SimpleEntityDisplay.class)
        .mapField("name", "name")
        .withCondition(NotBlankStringCondition.class)
        .build();
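
A quick hypothetical usage note, assuming the definition above is registered on a ShapeShift instance as shown earlier: when the source name is blank the condition fails, the field is simply not mapped, and the target keeps its initialized default.

SimpleEntity entity = new SimpleEntity("   "); // assumes such a constructor exists
SimpleEntityDisplay display = shapeShift.map(entity, SimpleEntityDisplay.class);
// display.getName() is still "" because the condition rejected the blank value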

Inline Conditions

Like transformers, conditions can also be added inline using a function.

MappingDefinition mappingDefinition = new MappingDefinitionBuilder(SimpleEntity.class, SimpleEntityDisplay.class)
        .mapField("name", "name")
        .withCondition(context -> context.getOriginalValue() != null && !((String) context.getOriginalValue()).trim().isEmpty())
        .build();

Annotations Mapping

This specific feature receives a lot of hate because it breaks the separation of concerns principle. Agreed, this could be an issue in some applications, but in use cases where all objects are part of the same application, configuring the mapping logic on top of the object itself can also be very useful. Check out the documentation and decide for yourself.
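
As a rough sketch of what this looks like, based on the @DefaultMappingTarget and @MappedField annotations from ShapeShift's documentation (double-check the docs for the exact, current attribute names), the mapping can be declared directly on the source class:

@DefaultMappingTarget(UserDTO.class)
public class User {
    @MappedField
    private String id;

    @MappedField(mapTo = "fullName") // attribute name assumed from the docs
    private String name;
    // Getters, Setters, Constructors, Equals...
}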

Conclusion

Object mapping libraries are not the solution for every application. For small, simple applications, plain mapping functions are more than enough. But when developing larger, more complex applications, object mapping libraries can take your code to the next level, saving you development and maintenance time, all while reducing the amount of boilerplate code and improving the overall development experience.

On a personal note, I used to work with manual mapping functions and was OK with it. It was "just" a few simple lines of code. After upgrading our applications to use object mapping as part of our "boilerplate-free" framework (we will discuss that framework at a later time), I can't go back. Now we spend more time on what's important and interesting and almost no time on boring boilerplate code.