
Using Code Generation to Maintain the Microservices Architecture


In today’s software development world, microservices have become a de facto industry standard, especially as an evolution of service-oriented approaches. And while they’re meant to be decoupled by architectural means, in most cases there still remains a layer of common code, libraries, and configuration shared between services and specific to each solution: authentication, logging, OpenAPI, external (particularly cloud) services, storage connections, distributed messaging, and others. Furthermore, this affects more than the code itself: services may also need to be configured in docker-compose or nginx.

Scale Software: The Traditional Approach to Adding a New Service

So, how do we deal with adding a new service? Let’s consider a typical case: by the time a new service is needed, we already have an existing solution and an established way of configuring it.

In the best-case scenario, we have relevant documents that define what should be done to comply with currently used solution-wide approaches. So we can manually add a service and follow internal docs to achieve the goal.

In the worst-case scenario (and, unfortunately, rather common), we have either outdated or no documentation at all, so we go manual all the way. We start by adding files for the new service. Then comes the most unpleasant part: manual inspection of existing services and iterative modifications to achieve the working state, all while fixing errors until they’re finally gone with lucky copy-pasting.

Adding a New Service — The Better Way

Considering how time-consuming and tedious the process may be, and that it may complicate the software development lifecycle, are there any ways to make it better? Sure enough, and in this article we’d like to share our best practices and knowledge of creating templates for .NET solutions and services, with applications and examples. Even though the idea of templating and code generation isn’t new, there is little information on the Internet about the approach we will review here, namely dotnet templating.

Solution template

This template holds a solution with a service, common solution configuration, and tools — a set that is also shared by all services.

The contents can basically be divided into two parts: the configuration, located at solution/.template.config/template.json, and the other files, which are included in the resulting template and modified according to the configuration rules.

Let’s take a brief look at what’s inside. The image below shows the raw template controller with a solution content tree.

If we run it (of course, even a template can be run, because, first of all, it’s a working solution), we get a simple response:

And the next image shows us the docker-compose YAML:

Template configuration

Let’s go through the template engine config and look at the sections that are of our main interest and actually do everything we need.

"shortName": "dashdevs_sln_standard" - this is the name to be used as an alias for your template after the installation, i.e., here it will be dotnet new dashdevs_sln_standard.

In this template, we use the TemplateCompany/templatecompany, TemplateProduct/templateproduct, and TemplateService/templateservice names for a company, a product, and a service, respectively. The next section will serve as an example of how we modify the default company name (TemplateCompany/templatecompany).

"symbols": {
    ...
    "company": {
        "type": "parameter",
        "datatype": "string",
        "defaultValue": "TemplateCompany",
        "replaces": "TemplateCompany",
        "fileRename": "TemplateCompany"
    },
    "companyLowerCase": {
       "type": "generated",
       "generator": "casing",
       "parameters": {
         "source": "company",
         "toLower": true
       },
       "replaces": "templatecompany",
       "fileRename": "templatecompany"
    }
    ...
}

Here we first introduce a template string parameter symbol named company, with the default value TemplateCompany. The replaces field instructs the engine to substitute all text occurrences of its value, which is TemplateCompany (it’s case-sensitive). The fileRename field uses similar logic, but applies to file and folder names rather than file contents. Thus, if the parameter is not set, nothing changes, because we’re already using TemplateCompany across the template files.

Then we process the lowercase occurrences of the company parameter by introducing a new companyLowerCase symbol of the generated type and specifying casing as the generator. Generated symbols use other symbols’ values as a source, so we point it at the company parameter and set the toLower field to true because we want to generate the lowercase string. The logic of the replaces and fileRename fields is described above.
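The product and service parameters referenced later in the usage command follow the same pattern. As a hedged sketch (the actual template.json may differ in details), the service pair could look like this:

```json
"service": {
    "type": "parameter",
    "datatype": "string",
    "defaultValue": "TemplateService",
    "replaces": "TemplateService",
    "fileRename": "TemplateService"
},
"serviceLowerCase": {
    "type": "generated",
    "generator": "casing",
    "parameters": {
        "source": "service",
        "toLower": true
    },
    "replaces": "templateservice",
    "fileRename": "templateservice"
}
```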

The next section allows defining the options for template sources.

"sources": [
  {
    "exclude": [ "**/[Bb]in/**", "**/[Oo]bj/**", ".template.config/**/*", "**/*.filelist", "**/*.user", "**/*.lock.json", ".git/**", ".vs/**", "_ReSharper*/", "*.[Rr]e[Ss]harper" ]
  }
]

Here we use the exclude option, which can be easily modified according to your needs.
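For example, if your repository also contains local documentation or helper scripts that shouldn’t end up in generated solutions, the list can be extended with additional patterns (the docs/** and scripts/local/** entries below are hypothetical):

```json
"sources": [
  {
    "exclude": [ "**/[Bb]in/**", "**/[Oo]bj/**", ".template.config/**/*", "**/*.user", ".git/**", ".vs/**", "docs/**", "scripts/local/**" ]
  }
]
```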

Template installation and usage

To install the template locally from the folder, run dotnet new -i pathToSolutionTemplateFolder. Once it’s done, you will see the list of all templates, including the new one.

To use the template, run dotnet new templateShortName --company YourCompanyName --product YourProductName --service YourServiceName. Here, templateShortName is the value of the shortName field; product, company, and service are parameter symbols from solution/.template.config/template.json (the company example is described above).

Thus, the current example would require running dotnet new dashdevs_sln_standard --company YourCompanyName --product YourProductName --service YourServiceName. Here’s how it looks:


Now let’s compare what we’ve got with the raw template. The controller and solution tree:

Running a service:

And a docker-compose YAML:

As you can see, all text and folders have been renamed as expected.

Tools folder

The tools folder contains scripts and utilities used for service template post-processing, which will be described below.

Service template

This template is used in conjunction with the solution template to add new services to existing solutions. In essence, it’s derived from the solution template (naturally, the services should be identical) and slightly modified.

Template configuration

In general, the service template is very similar to the solution one, though the former comprises some important additions.

The first is the choice parameter type, where we can specify the list of available values. Here it’s used to distinguish the OS, which is required for the specific post-processing tasks described later.

"symbols": {
    ...
    "OS": {
       "type":"parameter",
       "datatype": "choice",
       "defaultValue":"nix",
       "choices": [
         {
           "choice": "win"
         },
         {
           "choice": "nix"
         }
       ]
    }
    ...
}

The other is the use of the post-actions concept: actions executed after the main template engine work is finished. Our interest lies in running external scripts by specifying the actionId value “3A7C4B45-1F5D-4A30-959A-51B88E82B5D2” and other parameters.

The reason to use custom scripts is quite trivial: unfortunately, the dotnet template engine can’t modify existing files. Here you can see how the action for adding a newly created service from a template to a solution is defined.

"postActions": [
   ...
   {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "condition": "(OS == \"win\")",
    "args": {
      "executable": "./tools/add_projects.bat",
      "args": ""
    },
    "continueOnError": false
  },
  {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "condition": "(OS == \"nix\")",
    "args": {
      "executable": "./tools/add_projects.sh",
      "args": ""
    },
    "continueOnError": false
  },
  ...
]

Depending on the OS parameter, either add_projects.bat or add_projects.sh is called. Please note that these files are located in the solution template because they’re not service-specific and depend on a solution. They actually search the ./src/services folder for all projects to be added.
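A minimal sketch of what such a script might do (hypothetical; the real add_projects.sh may differ). It is written as a dry run that only prints the dotnet CLI commands, against an illustrative fake layout:

```shell
# Demo setup (illustrative only): a fake solution layout with one service.
mkdir -p src/services/Service1
touch src/services/Service1/Service1.csproj

# Sketch of add_projects.sh: find every project under ./src/services and
# print the `dotnet sln add` command that would register it in the solution.
for proj in ./src/services/*/*.csproj; do
    echo "dotnet sln add $proj"
done | tee add_projects_dryrun.txt
```

In a real script, the echo would simply be dropped so that dotnet sln add runs directly.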

Problems with adding a new service

As you can see, this particular problem of adding projects to a solution can be solved quite simply using the dotnet CLI. But what about other use cases? Many of them imply the absence of a ready-to-use tool, and since the engine can’t modify existing files, the only option left is making a custom one for each use case.

For example, one of the most common needs is modifying the docker-compose configuration. To achieve this, we’ve implemented a custom template post-processor, which does the following:

  1. Accepts a custom template path (we put all such templates into the service’s postprocessing folder). This is a text file that is subject to template source modification like any other, so the in-text replacement is made by the engine, and we can read the already-modified file.
  2. Accepts path to the existing file to be modified (docker-compose YAML in our case) and reads it.
  3. Searches for a place to insert text from a template and embeds it.
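To illustrate step 3, here is a hedged shell sketch of the splice itself (the real tool is a .NET project; the file names and contents below are made up). sed’s r command inserts the rendered fragment right after the line matching the services: key.

```shell
# Illustrative inputs: a fragment already rendered by the template engine
# and an existing docker-compose file.
printf '  service2:\n    build: ./src/services/service2\n' > fragment.txt
printf "version: '3'\nservices:\n  service1:\n    build: ./src/services/service1\n" > docker-compose.yml

# Splice the fragment into the compose file right after the `services:` line.
sed '/^services:/r fragment.txt' docker-compose.yml > merged.yml
cat merged.yml
```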

This tool is simple and extendable, so it’s easy to add new post-processors. Let’s see how it correlates with service/.template.config/template.json:

"postActions": [
  ...
  {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "args": {
      "executable": "dotnet",
      "args": "publish ./tools/DashDevs.TemplatePostProcessor/DashDevs.TemplatePostProcessor.csproj -o ./tools/DashDevs.TemplatePostProcessor/publish"
    },
    "continueOnError": false
  },
  {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "args": {
      "executable": "dotnet",
      "args": "./tools/DashDevs.TemplatePostProcessor/publish/DashDevs.TemplatePostProcessor.dll --docker-compose ./postprocessing/docker-compose/docker-compose.txt ./docker-compose.yml"
    },
    "continueOnError": false
  },
  {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "args": {
      "executable": "dotnet",
      "args": "./tools/DashDevs.TemplatePostProcessor/publish/DashDevs.TemplatePostProcessor.dll --docker-compose ./postprocessing/docker-compose/docker-compose.Development.txt ./docker-compose.Development.yml"
    },
    "continueOnError": false
  },
  {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "args": {
      "executable": "dotnet",
      "args": "./tools/DashDevs.TemplatePostProcessor/publish/DashDevs.TemplatePostProcessor.dll --docker-compose ./postprocessing/docker-compose/docker-compose.Production.txt ./docker-compose.Production.yml"
    },
    "continueOnError": false
  },
  {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "condition": "(OS == \"nix\")",
    "args": {
      "executable": "rm",
      "args": "-drf ./postprocessing"
    },
    "continueOnError": false
  },
  {
    "actionId": "3A7C4B45-1F5D-4A30-959A-51B88E82B5D2",
    "condition": "(OS == \"win\")",
    "args": {
      "executable": "powershell",
      "args": "-command Remove-Item -Recurse -Force \"postprocessing\""
    },
    "continueOnError": false
  }
]

The flow is as follows:

  1. Publishing the template post-processor;
  2. Applying the templates to relevant existing files;
  3. Cleanup. By the way, the tool or script execution context may get broken for some reason, so in the Windows environment we had to use a PowerShell workaround to avoid additional scripts.

Template installation and usage

The template can be installed following the same procedure as with the solution one, but since we’ve reviewed two templates already, here’s a tip: you can run dotnet new -i pathToRootTemplateFolder against the root folder, and both of them will be installed at once.

To use the template, run dotnet new templateShortName --company YourCompanyName --product YourProductName --service YourServiceName --OS yourOS. Another tip is to add --allow-scripts yes for templates that use scripts in the post-actions, to prevent the engine from asking for your confirmation before each separate script is run.

Let’s see how it looks by adding a new service to an existing solution:


Now the solution includes the second service:


Which can be run just like the first one:

And it’s added to the docker-compose YAML by our custom post-processing tool:

NuGet

In a real working environment, you may want to distribute your templates as a package. This is easy to do, and you may refer to the official docs for more detail.
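A hedged sketch of such a template package project, following the pattern from Microsoft’s template-package tutorial (the PackageId, version, and templates folder are placeholders):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <PackageType>Template</PackageType>
    <PackageVersion>1.0.0</PackageVersion>
    <PackageId>YourCompany.Templates</PackageId>
    <TargetFramework>netstandard2.0</TargetFramework>
    <IncludeContentInPack>true</IncludeContentInPack>
    <IncludeBuildOutput>false</IncludeBuildOutput>
    <ContentTargetFolders>content</ContentTargetFolders>
  </PropertyGroup>

  <ItemGroup>
    <Content Include="templates\**\*" Exclude="templates\**\bin\**;templates\**\obj\**" />
    <Compile Remove="**\*" />
  </ItemGroup>

</Project>
```

Running dotnet pack on it produces a .nupkg that can then be installed with dotnet new -i.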

Customization

Given the current information, you can easily prepare your own solutions and services, add or remove symbol parameters, extend the post-processing, or exclude additional unwanted sources according to the specific needs.

One may ask if there’s ready-to-use software for modifying existing files instead of writing a new post-processor for each new format. Well, it exists and can be found here: StringTemplate/ANTLR. However, in our opinion, it requires too much effort to get the desired results, since it’s a rather sophisticated instrument. You’d have to spend significant time putting it to use, and for more or less simple use cases we suggest that custom post-processors are more cost-efficient. Moreover, using such a tool in the long run requires other developers to dig into it too, so it’s not just a one-time effort but a support cost to consider. Still, if your use case implies text transformation with complex formats and rules, StringTemplate may come in handy.

Why use code generation techniques

Code generation and dotnet templating can be applied to both large and small applications, as well as various industries and business types. So let’s cut to the chase: what are the benefits of such an approach to custom software development? Among the most significant are:

  • Consistency: The system follows the pre-configured rules and principles, thus improving code quality and ensuring consistency within the entire application. With the manual approach we still have to account for the human factor, but with automatic code generation tools the output is always as expected.
  • Productivity: In fact, you write a generator once and reuse it as much as necessary. Moreover, it eliminates the need to re-create the infrastructure features — like authentication or logging — manually, again and again. Consequently, such an approach saves development time, accelerates a product’s time-to-market, and allows developers to focus on more complicated and critical tasks.
  • Facilitation: Since code is generated from an abstract description, the latter actually serves as the main source of truth. Consequently, in the case of a mismatch, it’s much easier and faster to check the description than dozens of lines of generated code.

However, despite all the advantages of dotnet core code generation, it can hardly be called a silver bullet, and, as we’ve already mentioned above, the approach to software application development is highly dependent on the project or use case.

The benefits of microservices-based architecture

The use of microservices design patterns is one of the major underlying techniques in agile software development. It’s widely adopted by such giants as BBC, Netflix, Amazon, Twitter, and many others.

In a nutshell, microservices development means building a software solution as a set of small services, each running in its own process. So why are microservices design principles so popular not only among developers but also among business owners?

  • Easy co-development and maintenance: The adoption of microservices architecture patterns implies a modular app structure, which is particularly helpful for large and distributed teams. Engineers can build or scale services independently, thus ensuring faster time-to-market and consistent code quality.
  • Fast deployment: Smaller components are easier to test and deploy, and since they are autonomous, you can release each independently of the others. As a result, releases are safer, quicker, and more frequent, so businesses keep their customers engaged and happy.
  • Better focus on business needs: One of the central benefits of microservices is that they encourage teams to create products rather than just projects. In fact, engineers are centered around business needs and capabilities, since services should be adaptable and reusable. Consequently, this contributes to the development of a cross-functional, capable team with a keen insight into business operations and offerings.

Summary

.NET templating is a powerful tool that can streamline developers’ working process, enabling them to define standard execution units and then reuse them, thus reducing the time required to add subsequent units such as services. The remaining complexity lies in existing configuration files, which can’t be modified by the .NET template engine and thus require custom post-processing for the time being.

The full source code of examples described above is available on GitHub.
