This document describes how to extend the Slacker functionality. Slacker is divided into three parts - the synchronous API, the command API, and the Kafka-PubSub convertor. The first two are handled by the slacker-manager service; the convertor is handled by Kafka Connect with plugins.

Adding a new synchronous endpoint

This section provides a high-level overview of the steps needed to add a new synchronous Slacker endpoint. The technical details are documented at How to create sync endpoints in slacker manager service.

The estimation for adding a synchronous endpoint is 2 story points. The main part of the task is defining the DTOs and the validation of the input and output data.

  1. Create DTOs describing input and output data.

    • The data are described using Kotlin data classes with validation annotations.

  2. Add the Risk cloud function URL to the service configuration.

    • The configuration is defined in application.yml and read via the SlackerManagerConfiguration configuration property.

    • The configuration for different environments is stored in values.yaml in ArgoCD.

  3. Add a call to the Risk microservice.

    • The call should be implemented as a simple call to RestCallService.reactivePost with the configuration and DTOs as parameters.

  4. Add a controller method for accepting the request.

    • This method executes the Risk microservice call (see the combined sketch after this list).
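
The steps above can be sketched in one place. The following is an illustrative sketch only, assuming Spring WebFlux and Jakarta validation; the class names, the `slacker-manager.risk` configuration prefix, the `/api/v1/risk-check` path, and the `reactivePost` signature are assumptions, not the actual definitions in slacker-manager.

```kotlin
import jakarta.validation.Valid
import jakarta.validation.constraints.NotBlank
import org.springframework.boot.context.properties.ConfigurationProperties
import org.springframework.web.bind.annotation.PostMapping
import org.springframework.web.bind.annotation.RequestBody
import org.springframework.web.bind.annotation.RestController
import reactor.core.publisher.Mono

// Assumed shape of the internal RestCallService; the real signature may differ.
interface RestCallService {
    fun <T : Any> reactivePost(url: String, body: Any, responseType: Class<T>): Mono<T>
}

// 1. DTOs describing input and output data, with validation annotations.
data class RiskCheckRequest(
    @field:NotBlank val userId: String,
    @field:NotBlank val channelId: String,
)

data class RiskCheckResponse(
    val allowed: Boolean,
    val reason: String? = null,
)

// 2. Configuration property read from application.yml
//    (per-environment values come from values.yaml in ArgoCD).
@ConfigurationProperties(prefix = "slacker-manager.risk")
data class RiskEndpointProperties(
    val riskCheckUrl: String,
)

// 3 + 4. Controller method that accepts the request and delegates the call
//        to the Risk cloud function via RestCallService.reactivePost.
@RestController
class RiskCheckController(
    private val restCallService: RestCallService,
    private val properties: RiskEndpointProperties,
) {
    @PostMapping("/api/v1/risk-check")
    fun riskCheck(@Valid @RequestBody request: RiskCheckRequest): Mono<RiskCheckResponse> =
        restCallService.reactivePost(properties.riskCheckUrl, request, RiskCheckResponse::class.java)
}
```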

Adding a new command to the Command API

The Slacker Command API commands are described at GCP Command Pub/Sub subscribe logic for POC purpose.

The estimation for adding a new command is 1-3 story points, depending on the complexity of the target command. For most commands it is 1-2 story points.

A new command is created by adding a new subclass of SlackerCommandExecutor, which has to implement the supports and execute methods. The supports method determines whether the executor should process a given command, usually by just testing the command name. The execute method runs the actual command.
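
A minimal sketch of a new executor, assuming SlackerCommandExecutor and the command model look roughly like the interfaces below; the actual definitions live in slacker-manager and may differ, and the echo command is purely illustrative.

```kotlin
// Assumed shapes of the command model and executor base; real definitions live in slacker-manager.
data class SlackerCommand(val name: String, val arguments: List<String>)

interface SlackerCommandExecutor {
    fun supports(command: SlackerCommand): Boolean
    fun execute(command: SlackerCommand)
}

// Hypothetical executor handling an "echo" command.
class EchoCommandExecutor : SlackerCommandExecutor {
    // supports: decide by testing the command name only.
    override fun supports(command: SlackerCommand): Boolean = command.name == "echo"

    // execute: run the actual command logic.
    override fun execute(command: SlackerCommand) {
        println("echo: ${command.arguments.joinToString(" ")}")
    }
}
```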

Adding a new Kafka topic to the PubSub convertor

The process is described at How to create Kafka connectors. Right now it amounts to sending a single HTTP request; this might change once the final Kafka Connect deployment on the Confluent platform is in place.

Adding a new topic is a simple process; adding several topics is estimated at 1 story point.
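
For illustration only, a sketch of what the single HTTP request could look like, assuming it targets the standard Kafka Connect REST API (`POST /connectors`); the connector class, topic names, and Pub/Sub settings are placeholders that depend on the plugin actually deployed.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Placeholder connector configuration; the real connector class and
    // settings depend on the Kafka-PubSub plugin installed in Kafka Connect.
    val connectorConfig = """
        {
          "name": "orders-to-pubsub",
          "config": {
            "connector.class": "<pubsub-sink-connector-class>",
            "topics": "orders",
            "cps.project": "<gcp-project>",
            "cps.topic": "<pubsub-topic>"
          }
        }
    """.trimIndent()

    // Register the connector via the Kafka Connect REST API.
    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://kafka-connect:8083/connectors"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(connectorConfig))
        .build()

    val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    println("Kafka Connect responded: ${response.statusCode()} ${response.body()}")
}
```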