
Extending Stitch

Implementing custom input/output formats

Stitch can also be extended to support input/output formats other than JSON and YAML. To do that, you need to implement the following Java interfaces:

  • com.xebialabs.deployit.plugin.stitch.service.engine.format.input.adapter.InputAdapter
  • com.xebialabs.deployit.plugin.stitch.service.engine.format.output.consumer.OutputConsumer

For InputAdapter, the following methods must be implemented:

  • Set<String> getFileFormats() - the set of file formats (extensions) that the adapter can handle
  • JsonNode convert(InputStream input) - converts the content of the input file to a JsonNode that the Stitch engine can then process

For OutputConsumer, the following methods must be implemented (a sketch of both interfaces follows this list):

  • Set<String> getFileFormats() - the set of file formats (extensions) that the consumer converts to
  • void write(JsonNode node, OutputStream destination) - converts the JsonNode to the output format
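
Based on the method lists above and the examples that follow, the two interfaces look roughly like this. This is only a sketch; the exact declarations (checked exceptions, default methods, and so on) may differ in your Deploy version:

public interface InputAdapter {
    // Extensions this adapter can read, for example "xml".
    Set<String> getFileFormats();

    // Parse the raw input stream into a JsonNode that the Stitch engine can process.
    JsonNode convert(InputStream input) throws IOException;
}

public interface OutputConsumer {
    // Extensions this consumer can write.
    Set<String> getFileFormats();

    // Render the JsonNode produced by the engine to the destination stream.
    void write(JsonNode node, OutputStream destination) throws IOException;
}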

Here are examples of an InputAdapter and an OutputConsumer that would enable the Stitch engine to process XML input/output:

package com.xebialabs.deployit.plugin.stitch.service.engine.format.input.adapter;

import com.fasterxml.jackson.databind.JsonNode;
import com.xebialabs.deployit.core.serialization.XmlSerialization;
import org.springframework.stereotype.Component;

import java.io.IOException;
import java.io.InputStream;
import java.util.HashSet;
import java.util.Set;

@Component
public class XmlInputAdapter implements InputAdapter {

    @Override
    public Set<String> getFileFormats() {
        HashSet<String> formats = new HashSet<>();
        formats.add("xml");
        return formats;
    }

    @Override
    public JsonNode convert(InputStream inputStream) throws IOException {
        return XmlSerialization.mapper().readTree(inputStream);
    }
}

package com.xebialabs.deployit.plugin.stitch.service.engine.format.output.consumer;

import com.fasterxml.jackson.databind.JsonNode;
import com.xebialabs.deployit.core.serialization.XmlSerialization;
import org.springframework.stereotype.Component;

import java.io.IOException;
import java.io.OutputStream;
import java.util.HashSet;
import java.util.Set;

@Component
public class XmlOutputConsumer implements OutputConsumer {

    @Override
    public Set<String> getFileFormats() {
        // Report the extensions this consumer can write, analogous to the input adapter.
        HashSet<String> formats = new HashSet<>();
        formats.add("xml");
        return formats;
    }

    @Override
    public void write(JsonNode node, OutputStream destination) throws IOException {
        XmlSerialization.mapper().writerWithDefaultPrettyPrinter().withRootName("root").writeValue(destination, node);
    }
}
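
Note that both example classes are annotated with Spring's @Component; a custom adapter or consumer will most likely need the same annotation so that it is discovered and registered when the server starts.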

Implementing custom processors

You can create new processor types by implementing the following Java interface: com.xebialabs.deployit.plugin.stitch.service.engine.processor.handler.ProcessorHandler. The following methods in this interface must be implemented (a sketch of the interface follows this list):

  • String type() - tells the engine which processor type this handler is responsible for
  • String fileParameterName() - the name of the one parameter per processor whose value is stored in an external file, so that it can be stored in the database during syncing of the Stitch repository
  • JsonNode handle(JsonNode input, StitchProcessor processor, DeploymentContext deploymentContext, Map<String, Object> params) - performs the actual transformation
  • void validate(StitchProcessorDto processor) - validates the processor during synchronization
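
Based on the method list above and the example implementation below, the ProcessorHandler interface looks roughly like this. This is only a sketch; the actual interface in the Stitch plugin may declare additional methods or different exceptions:

public interface ProcessorHandler {
    // Processor type this handler is responsible for, e.g. "regex".
    String type();

    // Name of the single parameter whose value is stored in an external file.
    String fileParameterName();

    // Performs the actual transformation of the input document.
    JsonNode handle(JsonNode input, StitchProcessor processor,
                    DeploymentContext deploymentContext, Map<String, Object> params) throws JsonProcessingException;

    // Validates the processor during synchronization of the Stitch repository.
    void validate(StitchProcessorDto processor);
}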

Here is an example of a processor that replaces text matching a regular expression in the input file. This is how the processor definition would look:

    processor:
      - type: regex
        description: "replace numbers with X"
        weight: 10
        parameters:
          regex: "[0-9]"
          replaceBy: "X"

or with the external regex file:

    processor:
      - type: regex
        description: "replace numbers with X"
        weight: 10
        parameters:
          regexFile: regexFile.txt
          replaceBy: "X"
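
With the second definition, the external regexFile.txt would contain just the regular expression itself (the handler below reads the file content and uses it directly as the pattern), for example:

    [0-9]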

Here is how a ProcessorHandler implementation that handles the processors in the examples above would look:

package com.xebialabs.deployit.plugin.stitch.service.engine.processor.handler;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.xebialabs.deployit.plugin.stitch.model.StitchParameter;
import com.xebialabs.deployit.plugin.stitch.model.StitchProcessor;
import com.xebialabs.deployit.plugin.stitch.service.engine.context.DeploymentContext;
import com.xebialabs.deployit.plugin.stitch.service.engine.index.dto.StitchProcessorDto;
import com.xebialabs.deployit.plugin.stitch.service.engine.index.dto.StitchProcessorParameterDto;
import com.xebialabs.deployit.plugin.stitch.service.engine.processor.InvalidProcessorException;
import org.springframework.stereotype.Component;

import java.util.Map;
import java.util.Optional;

@Component
public class RegexProcessorHandler implements ProcessorHandler {

    public static final String HANDLER_TYPE = "regex";
    public static final String REPLACE_BY_PARAMETER = "replaceBy";
    public static final String REGEX_PARAMETER = "regex";
    public static final String REGEX_FILE_PARAMETER = "regexFile";

    @Override
    public String type() {
        return HANDLER_TYPE;
    }

    @Override
    public String fileParameterName() {
        return REGEX_FILE_PARAMETER;
    }

    @Override
    public JsonNode handle(JsonNode input, StitchProcessor processor, DeploymentContext deploymentContext, Map<String, Object> params) throws JsonProcessingException {
        String replaceBy = getReplaceBy(processor);
        String regex = getRegex(processor);
        String result = input.toPrettyString().replaceAll(regex, replaceBy);
        return new ObjectMapper().readTree(result);
    }

    private String getReplaceBy(StitchProcessor processor) {
        Optional<? extends StitchParameter> replaceBy = processor
                .getParameters()
                .stream()
                .filter(param -> REPLACE_BY_PARAMETER.equals(param.getName()))
                .findAny();
        if (replaceBy.isPresent()) {
            return replaceBy.get().getValue();
        } else {
            return failWhenReplaceByNotPresent(processor.describeAsString());
        }
    }

    private String getRegex(StitchProcessor processor) {
        Optional<? extends StitchParameter> regexParam = processor
                .getParameters()
                .stream()
                .filter(param -> REGEX_PARAMETER.equals(param.getName()))
                .findAny();
        Optional<? extends StitchParameter> regexFileParam = processor
                .getParameters()
                .stream()
                .filter(param -> REGEX_FILE_PARAMETER.equals(param.getName()))
                .findAny();
        if (regexParam.isPresent()) {
            return regexParam.get().getValue();
        } else if (regexFileParam.isPresent()) {
            return regexFileParam.get().getFileContent().orElseThrow(() -> new InvalidProcessorException(String.format(
                    "%s has no fileContent for '%s' parameter saved in the database!",
                    processor.describeAsString(), REGEX_FILE_PARAMETER
            )));
        } else {
            return failWhenRegexNotPresent(processor.describeAsString());
        }
    }

    @Override
    public void validate(StitchProcessorDto processor) {
        Optional<? extends StitchProcessorParameterDto> replaceByParam = processor.getProcessorParameters().stream()
                .filter(param -> REPLACE_BY_PARAMETER.equals(param.getName())).findAny();
        Optional<? extends StitchProcessorParameterDto> regexParam = processor.getProcessorParameters().stream()
                .filter(param -> REGEX_PARAMETER.equals(param.getName())).findAny();
        Optional<? extends StitchProcessorParameterDto> regexFileParam = processor.getProcessorParameters().stream()
                .filter(param -> REGEX_FILE_PARAMETER.equals(param.getName())).findAny();
        if (!replaceByParam.isPresent()) {
            failWhenReplaceByNotPresent(processor.describeAsString());
        }
        if (regexFileParam.isPresent() && regexParam.isPresent()) {
            throw new InvalidProcessorException(String.format(
                    "%s shouldn't have both '%s' and '%s' parameters defined!",
                    processor.describeAsString(), REGEX_PARAMETER, REGEX_FILE_PARAMETER
            ));
        } else if (!regexFileParam.isPresent() && !regexParam.isPresent()) {
            failWhenRegexNotPresent(processor.describeAsString());
        }
    }

    private String failWhenReplaceByNotPresent(String processorDescription) {
        throw new InvalidProcessorException(String.format(
                "%s has no '%s' parameter!", processorDescription, REPLACE_BY_PARAMETER)
        );
    }

    private String failWhenRegexNotPresent(String processorDescription) {
        throw new InvalidProcessorException(
                String.format("%s has no '%s' or '%s' parameter!",
                        processorDescription, REGEX_PARAMETER, REGEX_FILE_PARAMETER
                ));
    }
}
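
Note that this handler applies the regular expression to the pretty-printed JSON text of the input and then re-parses the result, so the replacement must leave the document syntactically valid. For the first processor definition above, a value such as "version": "build-42" would become "version": "build-XX", whereas rewriting the digits of an unquoted number would make the result unparseable.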

Implementing custom SpEL utility class

You can create a new SpEL utility class by extending the following Java abstract class: com.xebialabs.deployit.plugin.stitch.service.engine.processor.handler.expression.resolver.AbstractContextResolver. The new utility class must also be annotated with the following annotation: com.xebialabs.deployit.plugin.stitch.service.engine.processor.handler.expression.annotation.ExpressionUtility.

The utility class should have a constructor that accepts two parameters: com.xebialabs.deployit.plugin.stitch.service.engine.context.DeploymentContext and com.xebialabs.deployit.plugin.stitch.service.engine.context.InputContext. The abstract class AbstractContextResolver provides two getters, getDeploymentContext() and getInputContext(), that return the DeploymentContext and InputContext received through the constructor, respectively.

All annotated utility classes are loaded during startup, and their utility methods are then ready to use.

Here is an example of how to create a custom "env" utility class:

@ExpressionUtility("env")
public class EnvResolver extends AbstractContextResolver {

    public EnvResolver(final DeploymentContext deploymentContext, final InputContext inputContext) {
        super(deploymentContext, inputContext);
    }

    public String id() {
        return getEnvironment().getId();
    }

    public String name() {
        return getEnvironment().getName();
    }

    private Environment getEnvironment() {
        DeployedApplication currentDeployedApplication = getDeploymentContext().getCurrentDeployedApplication();
        if (currentDeployedApplication != null) {
            return currentDeployedApplication.getEnvironment();
        } else {
            return getDeploymentContext().getPreviousDeployedApplication().getEnvironment();
        }
    }
}

This is an example of how to use the above utility class in one of the processors:

    processor:
      - type: freemarker
        description: "Adding the labels to the resulting YAML file"
        parameters:
          template: |
            { "metadata": {
                "environment": {
                  "id": "${environmentId}",
                  "name": "${environmentName}"
                }
              }
            }
          variables:
            environmentId: "Environment ID is: #{@env.id}"
            environmentName: "Environment name is: #{@env.name}"
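
Assuming, purely for illustration, an environment with ID Environments/Dev and name Dev, the SpEL expressions in the variables resolve through the env utility first and the FreeMarker template is then rendered with those values, so the processor output would look roughly like this:

    { "metadata": {
        "environment": {
          "id": "Environment ID is: Environments/Dev",
          "name": "Environment name is: Dev"
        }
      }
    }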

Important: The implementations of InputAdapter, OutputConsumer, and ProcessorHandler, along with any dependencies that are not already part of the xl-deploy server, must be packaged into a JAR and placed inside the lib directory of the xl-deploy server.