Redefining Development Environments

“It works on my machine” sums up a longstanding challenge in software development: the quest for consistent environments across development teams. This challenge has led to countless hours of troubleshooting and has often complicated the onboarding of new team members. Docker provided a groundbreaking pathway to standardizing environments. However, those environments were typically built for production or general use cases rather than the specific needs of a project under active development.

Certainly, the issue of environment inconsistency isn’t new, nor has it been without workarounds. Developers have long had the option of working directly inside Docker containers to mirror production settings, or of distributing a preconfigured virtual machine image to the team. While effective, these approaches often require a steep learning curve, significant setup time, and considerably more effort to maintain, making them less accessible for many development teams.

This is where VSCode Devcontainers push the boundaries of environment setup: they address the consistency challenge in a way that significantly enhances the user experience and lets the environment be tailored at the project level. They offer a streamlined, accessible, and convenient way to set up development environments that everyone on the team can easily adopt and use.

The consistency brought forth by Devcontainers extends beyond the replication of environments. They simplify project documentation and significantly ease the onboarding process. The moment new team members step into a project within a Devcontainer, they are presented with a system that is fully equipped and configured, markedly reducing the potential for bugs and discrepancies caused by environment differences.

Devcontainers introduce unprecedented flexibility, allowing for project-specific setups. This granular approach enables developers to seamlessly transition between projects with diverse requirements without the headache of reconfiguring their local development setup for each project’s peculiarities.

If you’ve felt the friction of maintaining consistent development environments or sought a smoother way to welcome new team members into your projects, you’re in the right place. In this article, we’ll explore how VSCode Devcontainers are making significant strides in addressing the age-old consistency challenge. We’ll discuss the practicalities of setting up and using Devcontainers, the benefits they offer to team collaboration and project onboarding, and their pivotal role in fostering a more unified development experience.

Devcontainers are development environments encapsulated within containers. Defined by a Docker image that sets up the environment’s base and a devcontainer.json file that tailors the development experience, Devcontainers orchestrate everything required for a project—from the operating system to programming languages, software dependencies, external tools, databases, and anything else needed for development.
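In practice, this usually amounts to a small amount of configuration committed alongside your code. A typical, though not mandatory, layout looks something like this (the Dockerfile is optional, since you can also reference an image directly):

my-project/
├── .devcontainer/
│   ├── devcontainer.json   # tailors the development experience
│   └── Dockerfile          # optional: defines the base image
├── src/
└── ...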

Visual Studio Code (VSCode) integrates these Devcontainers through the Dev Containers extension (formerly Remote - Containers). This powerful integration allows developers to immerse themselves in a development environment that is consistent across contributors, mitigating environment inconsistency and setup issues.

Integrating Devcontainers into Visual Studio Code introduces several significant benefits, each contributing to a more streamlined and efficient software development process:

One of the key advantages of using Devcontainers is the consistency they bring to development environments. Let’s be honest: in traditional setups, most teams’ onboarding guides are not thorough enough to avoid the infamous “it works on my machine” scenario. Devcontainers eliminate this problem by providing a uniform development environment replicated across all team members’ setups. This uniformity helps ensure that code behaves the same on every setup, cutting down on the time devoted to troubleshooting environment-specific problems, regardless of whether your team members prefer an obscure Linux distribution or, dare we say, Windows.

This consistency is crucial not just for current team members but also for maintaining the integrity of the project over time. As dependencies update and development tools evolve, Devcontainers help ensure that all team members are working with the same versions, further enhancing the stability and reliability of the development process.

Another key benefit of Devcontainers is the isolation they provide. Each Devcontainer is isolated from the host machine and other containers, ensuring that dependencies and configurations for one project do not interfere with those of another. This isolation is particularly important in a landscape where a developer may be working on multiple projects with differing requirements. It prevents “dependency hell”, where conflicting dependencies can cause projects to break unexpectedly.

Isolation also means that developers can experiment with new tools, libraries, and frameworks without the risk of destabilizing their main development environment. This freedom to experiment is crucial for innovation and learning, allowing developers to explore new technologies in a safe, controlled environment.

We’ve all been on that team where it takes a week to properly set up your development environment. The initial setup of a development environment can be a daunting task, particularly for complex projects or when new team members join. Devcontainers simplify this process significantly. With a Devcontainer, new team members can have their development environment up and running in minutes after cloning the project repository. This ease of setup allows developers to focus on becoming familiar with the project itself rather than spending time configuring their tools and environment.

This streamlined setup process is equally beneficial for setting up new projects. Developers can quickly bootstrap new projects with all the necessary configurations and dependencies predefined, ensuring that the project starts on the right foot with a consistent environment from day one.

Think of your project as a gourmet dish, and DevOps has prepped your kitchen with all the essentials for baking. But what if you’re in the mood to grill? Here’s where Devcontainers come in, offering you the specialized tools to ensure your culinary masterpiece doesn’t end up half-baked.

With Devcontainers, you’re no longer limited to the utensils and ingredients chosen by DevOps. If your project needs some features of the latest version of Go that DevOps hasn’t upgraded to yet, or if you’re looking to swap out Node.js for Bun, Devcontainers make those swaps easy, giving you the power to buy your own groceries.

This is the essence of Devcontainers—providing the flexibility to customize your development environment to fit the unique tastes of your project. It’s about enabling your team to cook up the right product for their use case instead of fitting it into a one-size-fits-all development environment. If you still need certain standard tools across your company, remember that Devcontainers are built on ordinary Docker images: you can publish a company base image and let each project extend it, balancing the need for organizational standards with the freedom to experiment and innovate.

A standout feature of Devcontainers is their configuration setup, neatly integrated right alongside your project code. This organization not only makes it easy to access and adjust your environment setup but also invites all project contributors to explore the setup details. They can grasp the environment’s intricacies or propose enhancements, benefiting everyone involved. By embedding your Devcontainer configuration in source control, you directly link your development environment to your project, making the fully configured setup accessible to all. This setup empowers contributors to immediately start adding value as soon as they clone the repository. Moreover, it enables tracking the environment’s evolution over time, providing transparency and the option to revert changes if necessary.

This methodology ensures your development environment adapts and grows with your project. Updates to the .devcontainer configuration become instantly available to the team upon repository synchronization, encouraging collaboration. This approach not only strengthens team dynamics but also promotes the sharing and adoption of new practices discovered by other team members. Discovered a tool that’s become essential to your workflow but isn’t in your current setup? Incorporating it into your Devcontainer lets the entire team share in your discovery.

Devcontainers address the crucial need for environment consistency during product development while offering the flexibility to tailor to project-specific demands.

Transitioning to Devcontainers involves a few straightforward steps, starting with ensuring you have all necessary tools installed, followed by configuring your first Devcontainer to utilize a basic Docker image of your choice.

Before embarking on your Devcontainer journey, ensure you have:

  • Visual Studio Code (VSCode): The core editor offering seamless integration with Devcontainers.
  • Docker: The backbone of Devcontainers, used to create and manage the containerized environments. Install and run Docker Desktop (for Windows and Mac) or Docker Engine (for Linux) on your machine.
  • Dev Containers extension: This essential extension connects VSCode with Docker, simplifying the management and use of Devcontainers. Available in the VSCode Marketplace.

Creating your first Devcontainer is straightforward with these steps:

Create a Devcontainer Configuration: At the root of your project, add a .devcontainer directory. Inside this directory, you’ll create a devcontainer.json file. This file will directly reference an Ubuntu image, allowing you to bypass the need for a custom Dockerfile initially.

Sample devcontainer.json Configuration:

{
  "name": "Rust Devcontainer",
  "image": "ubuntu:latest",
  "settings": {
    "terminal.integrated.defaultProfile.linux": "bash"
  },
  "extensions": ["rust-analyzer.rust-analyzer"],
  "forwardPorts": [8000],
  "postCreateCommand": "apt-get update && apt-get install -y curl ca-certificates build-essential && curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y && . $HOME/.cargo/env && rustup toolchain install stable && cargo --version",
  "remoteUser": "root"
}

This configuration utilizes the ubuntu:latest image to establish a fundamental Linux environment. It automatically installs the Rust extension for VSCode, configures port forwarding for local web server testing, and runs a post-create command that installs the build prerequisites and Rust itself, sets up the environment, and prints the cargo version to confirm everything is ready. Microsoft also maintains a set of prebuilt images intended specifically for use in Devcontainers, which make a convenient starting point.
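For example, instead of ubuntu:latest you could point at one of those prebuilt images. The exact image name and tag below are illustrative, so check the current list of maintained images for what’s available:

{
  "name": "Rust Devcontainer",
  "image": "mcr.microsoft.com/devcontainers/rust:latest"
}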

Open Your Project in the Devcontainer: With the devcontainer.json in place, use the Dev Containers extension by opening the Command Palette in VSCode (Ctrl+Shift+P, or Cmd+Shift+P on Mac) and selecting “Dev Containers: Reopen in Container.” VSCode will retrieve the specified Ubuntu image and arrange your development environment within this container.

Begin Development: Now that you’re inside your Rust Devcontainer, you’re all set to dive into development. Go ahead and run cargo new hello_world to create a Rust project. This containerized setup guarantees that you and your team work in a consistent, isolated environment, fully prepared with the tools detailed in your configuration.

You’ve now set up a Devcontainer, showcasing both the straightforward nature and the robust capabilities of containerized development environments. While this example provides a foundational glimpse, in practice, your Devcontainer will likely be more tailored, moving beyond a basic Ubuntu image to include the specific tools and configurations your project needs. This simple example illustrates some of the key pieces of a Devcontainer. Building on this basic configuration, you can enhance your Devcontainer with the precise tools and settings required for your development work.

Devcontainers revolutionize the development environment setup by going beyond the Docker model, which traditionally focuses on mirroring the runtime environment. Instead, Devcontainers enrich how contributors interact with a project, incorporating crucial development tools and settings directly into the workspace configuration. At the core of this approach is the devcontainer.json file, which facilitates a deeper integration with Visual Studio Code, enabling teams to specify necessary extensions and settings that align with the project’s technical requirements and coding standards.

This shift signifies a move away from a generic container setup to a more nuanced and project-specific configuration. Devcontainers are not just about ensuring compatibility across different machines; they’re about creating an optimized and unified development experience. By leveraging the devcontainer.json configuration, projects can achieve a level of consistency and efficiency that supports high-quality contributions right from the project’s source control, making it straightforward for any team member to immediately contribute without worrying about additional setup.

Let’s explore essential elements of automating your Devcontainer setup: choosing extensions for team-wide standards, setting up port forwarding, standardizing VSCode settings for project consistency, and securely managing environment variables. Each plays a crucial role in elevating your Devcontainer from a basic setup to a fully customized development workspace.

The selection of VSCode extensions within your Devcontainer configuration should focus on enhancing the collective efficiency and quality of the team’s work rather than catering to individual preferences. Extensions play a pivotal role in ensuring that all team members are equipped with tools that support the project’s specific requirements and adhere to its coding standards.

Key Considerations for Extension Selection:

  • Common Use Cases: Include extensions that address the needs common to your project, such as language support or integrated development tools that align with your tech stack.
  • Team or Company-Developed Extensions: Leverage custom extensions developed in-house for specialized tools or technologies unique to your project, ensuring seamless integration and functionality.
  • Coding Standards and Best Practices: Adopt plugins that aid in maintaining your project’s coding standards, such as code formatters and linters. These tools help keep the codebase clean and consistent, which is crucial for collaborative development.

Practical example:

To incorporate extensions into your devcontainer.json, you simply add them to the extensions list. For instance, to include support for Go development and ensure consistent code formatting with Prettier, your configuration would look something like this:

{
  "extensions": ["golang.go", "esbenp.prettier-vscode"]
}

This strategic approach to extension selection ensures that the Devcontainer serves as a comprehensive, shared development workspace optimized for both productivity and quality.

Port forwarding is a key feature in Devcontainer setups that facilitates direct access to services and tools running within the container, such as web servers, databases, or debuggers. This capability is crucial for several reasons:

  • Debugging: When debugging applications, especially those running on the Java Virtual Machine (JVM) or developed in languages like Go, developers often need to attach debuggers that listen on specific ports. Port forwarding allows these tools to connect seamlessly to the application as if it were running on the host machine, providing an integral part of the development workflow.
  • Testing: Accessing your application through a forwarded port mimics how users interact with it in production, offering a realistic environment for running functional or integration tests. This setup is particularly useful for web development projects that require interaction with APIs or third-party services.
  • Tool Integration: Many development tools and IDEs offer features that rely on accessing the application over a network, such as performance monitoring tools or API testing utilities. Forwarding ports from the Devcontainer ensures these tools can connect to your application without additional configuration.

Practical Example:

Configuring port forwarding within your Devcontainer involves specifying the required ports in the forwardPorts array of your devcontainer.json. For example, if you need to forward the port for a web application running on port 3000 and a debugger attached to port 5005, your configuration would be as follows:

{
  "forwardPorts": [3000, 5005]
}

Forwarding ports allows the tools you rely on to integrate cleanly with your development workflow, and devcontainer.json makes it easy to configure.

Implementing Devcontainers within your project provides an opportunity to set default VSCode settings through the devcontainer.json file. This approach enables project teams to define a standard environment that sets up all of the necessary tools to get started right away. Setting defaults, rather than imposing rigid rules, allows for a foundation upon which individual contributors can build and customize according to their preferences.

Purpose of Setting Defaults:

The aim of setting default settings in your Devcontainer configuration is to establish coding standards and practices effortlessly across the team. For example, if your team chooses to keep a consistent ordering of imports, you can enable the corresponding setting just as you would in a standard VSCode settings.json file. This helps to ensure that there is a working option built into the environment for each of the tools your project uses.

While it’s crucial to maintain a cohesive coding environment, recognizing the difference between enforceable project standards and individual developer preferences is key. Default settings should focus on enhancing project cohesion—like formatting rules or linting—leaving room for developers to adjust their personal workflow preferences, such as themes or keyboard shortcuts, locally.

Providing a Solid Foundation:

Setting thoughtful defaults in devcontainer.json aims to equip contributors with a development environment that minimizes setup time and maximizes compliance with project standards. This strategy ensures that developers can focus on contributing to the project from the outset.

By now, you might be thinking of the Neovim enthusiast arguing for individual environment setup, suggesting that instead of imposing your settings, you should allow users to customize their editor to their liking. The goal of default settings is to establish a foundation to build on, not to restrict contributors from customizing their editor as they wish. Contributors retain the ability to personalize their setup by adding a local .vscode/settings.json file for their specific tweaks, which should remain untracked by source control. This level of customization lets developers modify their environment as needed while preserving the project-wide standards defined in the Devcontainer. Because the Devcontainer stores your code within a volume, these personalized settings remain intact on your local disk across container sessions, ready for the next time you open the project.
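As a sketch of what that personalization might look like, a contributor could keep purely personal preferences in a local, untracked .vscode/settings.json; the specific settings below are just examples:

{
  "workbench.colorTheme": "Default Dark+",
  "editor.fontSize": 14,
  "editor.minimap.enabled": false
}

A matching .vscode/settings.json entry in the project’s .gitignore keeps these personal tweaks out of source control.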

Practical example:

Settings in your devcontainer.json use the same keys as the VSCode settings.json file. For example, to apply a project-wide standard such as organizing imports on save, your devcontainer.json could include the following:

{
  "settings": {
    "editor.codeActionsOnSave": {
      "source.organizeImports": true
    }
  }
}

Oftentimes, your project or environment will make use of various environment variables. These often contain sensitive information, so it’s crucial to adopt strategies that protect those details while keeping them accessible to the container.

Secrets and Authentication:

Directly incorporating secrets or tokens into the devcontainer.json file or Dockerfile is a security risk, as these files are often checked into source control. Instead, prefer using authentication tokens that can be passed into the container at runtime, ensuring that sensitive information remains secure. For services requiring authentication, consider implementing tokens or other mechanisms that don’t rely on user interaction to log in to specific tools within the container. Ideally, when the contributor opens the container, they have immediate access to all the tools included in the project.
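One way to do that, sketched below under the assumption that the token already exists on the contributor’s machine as GITHUB_TOKEN, is to forward it from the host environment into the container using remoteEnv:

{
  "remoteEnv": {
    "GITHUB_TOKEN": "${localEnv:GITHUB_TOKEN}"
  }
}

The value is resolved from the host at container startup, so the token never has to be written into the configuration that gets committed.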

Using a .env File:

A .env file, not tracked by source control, is an effective way to manage environment variables locally. This file can be referenced by your Devcontainer configuration to dynamically inject variables into your container at startup. Ensure that your .env file is included in your .gitignore to prevent accidental exposure of secrets.

To incorporate the .env file into your Devcontainer, specify it in your devcontainer.json. This is essential for projects using a Dockerfile or a direct image reference. Include the runArgs property to point to your .env file:

{
  "runArgs": ["--env-file", ".devcontainer/devcontainer.env"]
}

This configuration tells Docker to use the environment variables defined in your .env file when creating the Devcontainer, ensuring your development environment is configured with the necessary variables right from the start.
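For reference, the devcontainer.env file itself is just a plain list of key=value pairs; the variable names below are placeholders:

DB_PASSWORD=devpass
API_TOKEN=replace-me-locally
FEATURE_FLAGS=debug,verbose-logging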

Error Handling for Missing Variables:

To enhance the developer experience, configure your environment to generate informative error messages if required environment variables are missing, rather than failing silently. This practice aids in quicker troubleshooting and ensures that developers are aware of any missing configuration needed for the application to run properly.
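A minimal sketch of such a check, assuming a hypothetical .devcontainer/check-env.sh that you call from your postCreateCommand or application startup (the variable names are examples):

#!/bin/sh
# Fail fast with a clear message when a required variable is missing.
for var in DB_HOST MARIADB_USER MARIADB_PASSWORD; do
  eval "value=\${$var}"
  if [ -z "$value" ]; then
    echo "ERROR: required environment variable $var is not set. See the README for how to configure it." >&2
    exit 1
  fi
done
echo "All required environment variables are present."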

Documenting Environment Variables:

Documentation is key to managing environment variables effectively. Consider the following approaches:

  • Link to Secure Key Storage: For projects utilizing cloud services or external key management systems, link directly to these services in your documentation, ensuring that access to these keys is tightly controlled and only available to authorized personnel.
  • Detailed README: Include a README file that outlines each environment variable, its purpose, and instructions for obtaining necessary values. This README can then be referenced from the main project documentation to guide developers in setting up their local environment.

By adopting these strategies for managing environment variables within Devcontainers, teams can ensure that their development environments are both secure and functional, facilitating smooth project setup and ongoing development work.

While devcontainer.json provides various hooks for executing commands during the Devcontainer’s lifecycle, I would generally recommend putting scripts in the Dockerfile whenever possible.

Why Prefer Dockerfile Scripts:

Scripts included in the Dockerfile are executed during the build process, which means they run only once, making the build slightly longer but the container startup quicker. This approach not only streamlines the container initialization but also ensures that all necessary configurations and dependencies are baked into the image itself, leading to a more predictable and stable development environment. Furthermore, Dockerfile scripts can leverage Docker’s layer caching to minimize rebuild times, enhancing overall efficiency.

In contrast, initialization commands in devcontainer.json are better suited for adjustments or setups that specifically relate to the Devcontainer’s lifecycle, complementing the foundational configurations established in the Dockerfile.
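As a rough sketch of what baking tooling into the image can look like (the base image, versions, and tools here are illustrative, not prescriptive):

# Development image: tooling is installed at build time so container
# startup stays fast, and each RUN step is cached as a Docker layer.
FROM golang:1.22

# System packages contributors commonly need during development.
RUN apt-get update && apt-get install -y --no-install-recommends \
        git make curl \
    && rm -rf /var/lib/apt/lists/*

# Project tooling baked into the image rather than installed on startup.
RUN go install golang.org/x/tools/gopls@latest

WORKDIR /workspace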

Example of a Lifecycle Hook Command:

Despite the preference for Dockerfile scripts, there are scenarios where devcontainer.json hooks are necessary for the final setup stages. For instance, you might want to signal the completion of the setup process:

{
  "postCreateCommand": "echo 'Devcontainer setup complete!'"
}

This postCreateCommand provides a straightforward way to execute tasks right after the container is created, offering a flexible mechanism for handling setups that cannot be pre-configured in the Dockerfile.

When developing applications that depend on multiple services, such as databases, caching systems, or additional microservices, Docker Compose becomes invaluable. It allows you to define and run multi-container Docker applications, configuring each service within a single docker-compose.yml file for simplicity and coherence.

Key Advantages of Using Docker Compose:

  • Simplified Configuration: Docker Compose consolidates the configuration of multiple containers into a single file, making it easier to manage complex environments.
  • Networked Containers: Automatically sets up a network that allows containers to communicate with each other, crucial for applications that rely on interconnected services like web applications with databases.
  • Volume Management: Simplifies data persistence and sharing between containers and the host machine, ensuring that your development environment and data remain intact across container rebuilds.

Practical Example:

To get a better understanding of the differences between using a Dockerfile or image versus a Docker Compose setup, let’s look at an example setup for a project. Let’s assume we have a Go website that uses a MariaDB database; Docker Compose can concisely describe the entire environment. We’ll start with an overview of the docker-compose file setup.

docker-compose.yml:

Our Docker Compose file will include two services. The first, which we will call dev, hosts our Go website code and is intended to be our main environment for development. The second is our MariaDB database, which we will connect to from our application.

For our Golang service, several crucial configurations are necessary. To start, we point to our dev.Dockerfile to specify the image the container is built from. This file includes all necessary tools for development, building, testing, and project maintenance, similar to what we’d have in a standalone Dockerfile setup. We also set up the port bindings for our application; in the following example, we use port 8080. Our environment variables (which we will move to a .env file later in the setup) are added, and we mount our source code as a volume into the /workspace folder of our Devcontainer.

services:
  dev:
    build:
      context: .
      dockerfile: docker/dev.Dockerfile
    # Keep the container running so VSCode can attach to it.
    command: sleep infinity
    ports:
      - '8080:8080'
    depends_on:
      - db
    environment:
      MARIADB_USER: devuser
      MARIADB_PASSWORD: devpass
      MARIADB_DATABASE: example_db
      DB_HOST: db
    volumes:
      - ./:/workspace

Our db service specifies the MariaDB database, including version, environment variables for setup (e.g., MARIADB_USER), and volume configuration for data persistence.

services:
  db:
    image: mariadb:11.4
    environment:
      MARIADB_USER: devuser
      MARIADB_PASSWORD: devpass
      MARIADB_ROOT_PASSWORD: super_secret_password
      MARIADB_DATABASE: example_db
    volumes:
      - mariadb_data:/var/lib/mysql

volumes:
  mariadb_data:

As we discussed before, we do not want to keep the environment variables in our source control. Because of this, we will move the environment variables to a .env file instead of specifying environment variables directly in the docker-compose.yml. For this example, we will use a single shared .env file to store all environment variables used by each service.

MARIADB_USER=devuser
MARIADB_PASSWORD=devpass
MARIADB_DATABASE=example_db
MARIADB_ROOT_PASSWORD=super_secret_password
DB_HOST=db

Once we create the .env file, we can reference it in our Docker Compose configuration for both services by adding the following key to each:

env_file: .env

Docker Compose automatically creates a network for the defined services, allowing the Golang application to communicate with MariaDB using the service names as hostnames.
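Putting those pieces together, the resulting docker-compose.yml might look roughly like this (paths and the named volume are illustrative):

services:
  dev:
    build:
      context: .
      dockerfile: docker/dev.Dockerfile
    # Keep the container running so VSCode can attach to it.
    command: sleep infinity
    ports:
      - '8080:8080'
    depends_on:
      - db
    env_file: .env
    volumes:
      - ./:/workspace

  db:
    image: mariadb:11.4
    env_file: .env
    volumes:
      - mariadb_data:/var/lib/mysql

volumes:
  mariadb_data: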

devcontainer.json:

The dockerComposeFile key within our devcontainer.json identifies the Docker Compose file we’re utilizing. Because this file defines multiple services, we must also specify which service VSCode should attach to: the one hosting our project’s code. For our purposes, this is the dev service, where our working code is situated.

{
  "name": "Golang + MariaDB",
  "dockerComposeFile": "docker-compose.yml",
  "service": "dev",
  "workspaceFolder": "/workspace",
  "forwardPorts": [8080],
  "postCreateCommand": "echo 'Devcontainer setup complete!'"
}

Remote workstations can be necessary for projects or companies that require secure development environments, especially in cases where source code should not be downloaded onto a contributor’s local machine. GitHub Codespaces is an excellent option for such cases. Leveraging Devcontainers as its foundation, Codespaces takes the concept to the cloud, providing a fully configured development environment accessible from anywhere via a web browser. With Codespaces, you can be confident that you won’t have to worry about maintaining and configuring your development environment. Instead, you can focus on your work and collaborate with others without compromising the security of your source code.

GitHub Codespaces is a cloud-based development environment powered by Visual Studio Code, enabling you to develop within Devcontainers in the cloud. It integrates seamlessly with GitHub, allowing you to launch a development environment directly from a repository with the same configurations and tools you would use locally. This means you can write, run, and debug your code without needing to set up anything on your local machine.

The introduction of GitHub Codespaces brings several key advantages to the software development process, including:

  • Accessibility: Access your development environment from any device, anywhere, without the need for local setup. All you need is a web browser.
  • Consistency: Just like local Devcontainers, Codespaces ensures that your development environment is consistent with your team’s configurations, mitigating the “works on my machine” problem.
  • Speed: Spin up development environments in seconds, ready to code, without the overhead of local environment setup.
  • Resource Efficiency: Offload the resource demands of your development environment to the cloud, keeping your local machine free and responsive.

Getting started with GitHub Codespaces for your project involves a few straightforward steps:

  • Prepare Your Repository: Ensure your repository has a .devcontainer directory with a devcontainer.json file, just as you would for local Devcontainers. This file will determine the setup of your Codespace environment.
  • Launch Codespaces: Navigate to your GitHub repository in a web browser and click the “Code” button, then open the “Codespaces” tab and create a new codespace. GitHub will automatically configure your Codespace based on the settings defined in your .devcontainer directory.
  • Develop in the Cloud: Once your Codespace is ready, you’ll be presented with a VSCode environment in your browser that is fully configured and ready to code. You can start developing immediately, with access to all the features and extensions of VSCode, just as you would locally.
  • Commit and Collaborate: Work within your Codespace just like any local development setup. Commit changes, push to your repository, and collaborate with your team, all directly from your browser.
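One Codespaces-specific knob worth knowing about when preparing your repository: the devcontainer.json spec includes a hostRequirements section that Codespaces uses when selecting a machine type. The values below are examples, and availability depends on your plan:

{
  "hostRequirements": {
    "cpus": 4,
    "memory": "8gb",
    "storage": "32gb"
  }
}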

While Devcontainers and GitHub Codespaces can streamline and enhance development workflows, being aware of their limitations and how to navigate them is essential for maximizing their potential.

  • Resource Usage: Devcontainers require a considerable amount of system resources, which can affect your local machine’s performance, especially when running multiple or complex environments. This is because they are built on top of Docker containers, which use your local machine’s resources to host the containerized environments.
  • Lock-in to VSCode: Devcontainers offer seamless integration within Visual Studio Code, potentially limiting users of other editors or IDEs. Connecting to Devcontainers through Docker for use with other platforms requires a manual setup, potentially detracting from the convenience offered by VSCode.
  • GitHub Codespaces Free Tier Limitations: The GitHub Codespaces free tier provides limited compute resources and usage hours per month. This constraint can impact the performance of demanding applications and might not accommodate extensive development work, presenting a challenge for developers requiring more robust resources.

Fortunately, there are ways to work around some of these limitations:

  • Optimizing Resource Consumption: For those experiencing performance issues with Devcontainers, adjusting Docker’s resource allocation settings may help improve your system’s performance.
  • Navigating Editor Lock-in: For developers using editors and IDEs other than VSCode, Docker integration plugins for their preferred platform can provide alternative ways to connect to and use Devcontainers; most modern editors now offer support for working within containerized or remote environments. Learning Docker’s command-line interface also offers a versatile way to manage containers outside of VSCode, for example with the commands sketched below. Consider using a Docker Compose setup even if you only have one service: it lets you configure volumes and environment variables that can be shared between the Devcontainer setup and direct Docker usage from your editor. If you’re using Codespaces, however, I’m not aware of a comparable workaround. If you use a terminal-based text editor like Neovim, you can install and configure it inside the Codespace or Devcontainer itself.
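For instance, with the Docker Compose setup from earlier, a contributor on another editor could build and enter the same environment straight from the terminal:

# Build the images and start the services in the background
docker compose up -d --build

# Open a shell inside the running dev container
docker compose exec dev bash

# Tear everything down when finished
docker compose down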

Understanding these limitations can help you decide whether Devcontainers, and possibly Codespaces, fit the needs of your company or team and your development goals.

I hope the insights in this article have helped you discover some use cases for Devcontainers in your own projects. If not, I hope you enjoyed learning about a tool that can make your team’s development environment more consistent and make it easier for new contributors to set up their environment and quickly contribute. I believe many projects can benefit from Devcontainers, or even simply from developing in project-specific containerized environments.

If you enjoyed the content and want to keep up to date on new articles, consider subscribing, and I’ll let you know when content is fresh off the press. To make sure every new article finds its way to your inbox without the worry of excessive spam, you’ll receive at most one email per week, and only when there is new content for you to enjoy!

If you’re looking to further support this work or have a specific topic you’re itching to see explored in depth, consider checking out my Patreon. It’s a space where I not only take article suggestions but also engage in different conversations. Additionally, patrons gain early access to articles, offering a sneak peek into the latest developments and insights before they go public.

Your support and curiosity drive the content and discussions that unfold, helping to shape a community rich in knowledge and shared exploration. If you’re interested in becoming a part of this journey, contributing ideas, or simply want to show your support, visit my Patreon.

See you next time, happy coding!