Written by Filip Skibiński
Published December 21, 2022

Developing inside a box

The idea of an executable “box” – a container encapsulating application code and system-related dependencies – seems to be growing increasingly popular, and is even perceived as a standard nowadays.

Utilising the Remote – Containers extension for VSCode

Such a container is easy to recreate if something goes wrong. It’s also much easier to keep secure, because the application itself is enclosed in a strictly specified environment with only a few access points allowing it to communicate with the “world” outside. Depending on the user’s needs, the “embedded” system can be customised easily – from choosing the OS, through downloading the necessary packages and installing certificates, to launching the application.

A developer can benefit from most of these advantages not only when deploying an app but also while developing it – for example, by using the Remote – Containers extension for Visual Studio Code.

This article describes some of the benefits of using a development container and walks you through a basic configuration.

💻 Operating system

Firstly, the extension is designed to work only with a selection of *nix-family systems – both on the host (the OS of the machine you are working on) and inside the container. Of course, macOS (as a popular *nix system) is also supported on the host.

According to the official documentation, the supported distributions include the most popular ones, such as Ubuntu, Debian, and CentOS. The extension also works with the Alpine distro, so you can quickly get a lightweight environment. However, an Alpine-based environment can be more challenging to configure.

Windows containers are not supported at all. Moreover, when your host is a Windows machine, it’s necessary to use the WSL 2 backend for Docker. You can read more about this in the following article. At the end of the day, you end up with a *nix system either way.

For the record, WSL 2 can achieve near-native performance (source). Despite the additional virtualisation layer, the development process should remain smooth – especially when using the Ubuntu distribution (thanks to the close cooperation between Microsoft and Canonical).

🧰 Required software

To build a Docker image and run a container, you need a local Docker installation. If you use a Windows or Mac machine, you can install, for example, the Docker Desktop app.

On a Linux machine, it’s necessary to install the Docker Engine. The process is described in detail, using the Ubuntu distro as an example, in the following article from the official Docker documentation. According to it, the Linux version of Docker Desktop should be available soon as well.

Moreover, to utilise the Remote – Containers extension, you need to install Visual Studio Code and, of course, the extension itself. On Windows, you can also install the Remote – WSL extension to make navigating inside WSL easier.

It’s not required, but a local Git (or another VCS) installation would be helpful, too.

To summarise: you need a local installation of Docker (or the Docker Engine), VSCode with the extension(s), and a VCS of your choice (e.g. Git). SDKs (in the appropriate versions) will be installed automatically inside the container while (re-)building it.

This should make onboarding new team members much easier and quicker – they only have to install the latest versions of the software listed above.

Also, upgrading dependencies such as SDKs comes down to rebuilding the container after pulling the newest changes from the master branch.

⚙️ Container configuration

Let’s get our hands dirty. Open your project’s root directory in VSCode. If you don’t have a project yet, create a new one in a web-related technology of your preference.

The Remote – Containers extension reads files placed in the .devcontainer folder in the root directory – similarly to how workspace-specific configuration is read from the .vscode folder.

Here’s a simple example of a project’s structure with all files supported by the extension included:

📦sample-project
┣ 📂.devcontainer
┃ ┣ 📜Dockerfile
┃ ┣ 📜devcontainer.json
┃ ┗ 📜docker-compose.yml
┗ 📜index.js


In this case, the index.js file contains a simple Node.js server setup:

const http = require('http');

// Respond to every request with a plain "Hello, World!"
const requestListener = (_, res) => {
  res.writeHead(200);
  res.end('Hello, World!');
};

const server = http.createServer(requestListener);
server.listen(8080);


The devcontainer.json file is mandatory and can contain various options determining how the containerised environment gets built and started. You can find the detailed list under this link.

The options include passing docker run arguments, running a script on the host before the container is created, or inside it when finalising the whole process. It’s also possible to specify the ports that should be forwarded, mount additional volumes into the container, and much more.
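To give a rough idea, a devcontainer.json combining a few of these options might look something like the sketch below. It uses the same ready-made image as the minimal configuration shown later in this article, and the property names come from the devcontainer reference – but the concrete values (the environment variable, the commands, the forwarded port, and the named volume) are only illustrative assumptions, not part of the sample project:

{
  // Ready-made image to base the environment on
  "image": "mcr.microsoft.com/vscode/devcontainers/javascript-node:14",
  // Extra arguments passed to docker run (illustrative environment variable)
  "runArgs": ["--env", "ENVIRONMENT=development"],
  // Script executed on the host before the container is created
  "initializeCommand": "echo 'Preparing the host...'",
  // Command executed inside the container once it has been created
  // (assumes the project has a package.json)
  "postCreateCommand": "npm install",
  // Ports forwarded from the container to the host
  "forwardPorts": [8080],
  // Additional named volume mounted into the container
  "mounts": ["source=sample-project-cache,target=/home/node/.cache,type=volume"]
}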

What’s also essential: the project’s root directory is mounted into the container after building it. As a result, all changes performed on files inside the container also affect the host (i.e. the local disk).

Also, if you have any Git credentials configured on the host (e.g. stored locally or via a credential manager), they are shared with the container, so you won’t have to authenticate again every time the container is rebuilt.

Here’s a sample configuration that enables running the server specified above:

{
  "name": "sample-project",
  "image": "mcr.microsoft.com/vscode/devcontainers/javascript-node:14",
  "remoteUser": "node"
}


This configuration downloads the javascript-node:14 Docker image provided by Microsoft in its container registry. This image (as well as the others in the registry) is intended for general use and includes many pre-configured optimisations and facilities, offering us a good development experience. It’s also being actively maintained, so it should stay up-to-date.

If you want to customise the containerised environment further, you can create your own configuration in a Dockerfile. Of course, you can use Microsoft’s images as a base, or even run multiple containers at once by providing an appropriate docker-compose.yml file. Both files are optional – you only need them if the configuration of a ready-made image doesn’t suit you.
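As a rough sketch of that approach, a .devcontainer/Dockerfile could extend the image used above (the additional apt package here is purely an illustrative assumption), and the devcontainer.json could reference it via the build property instead of image:

# .devcontainer/Dockerfile
# Start from the general-purpose image used earlier in this article
FROM mcr.microsoft.com/vscode/devcontainers/javascript-node:14

# Example customisation: install an additional system package
RUN apt-get update \
    && apt-get install -y --no-install-recommends postgresql-client \
    && rm -rf /var/lib/apt/lists/*

{
  "name": "sample-project",
  "build": { "dockerfile": "Dockerfile" },
  "remoteUser": "node"
}

After changing the configuration, the container has to be rebuilt for the changes to take effect.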

After you finish setting up your environment (as described above, or adapted to your needs), you can open the project in a container. To do that, click the Reopen in Container option in the notification that should pop up:

or select the corresponding option in VSCode’s Command Palette:

As a result, the extension will build your container based on the provided configuration. When it’s finished, you can enjoy your containerised environment:

📝 Conclusion

This article is only a short introduction to the subject of containerised development environments. Its purpose was to give you some insight into the topic, show several advantages of such a solution, and demonstrate how to get started. Hopefully, it will make your work easier and encourage you to explore the topic further 😉

