Today, I would like to discuss an issue many developers face when working with multiple computers. Let’s say one for personal use and one provided by the employer.

Developers usually put a lot of care into setting up their environment, whether that means configuring tools like their shell or git, or adding custom scripts and bash/zsh/fish aliases. In both cases, this is usually achieved through files present in the user’s $HOME directory.

Why should those files be synchronized?

One of the main reasons developers pay so much attention to those files is the productivity they bring. Working in a familiar environment, tailored to your needs, helps you be more efficient.

In my experience, the cost of not having a consistent experience between my personal machine and my work one was high: different user experiences and expectations due to different shell prompts, different or missing aliases and scripts, and so on.

The cost of maintaining those two environments in sync by hand was just too high and led to many mistakes.

Time to automate this!

The Cloud file sync options

In order to synchronize those files between the two machines, the first option that came to my mind was to use a file synchronization mechanism like those offered by Tresorit, Dropbox, or Google Drive.

Yet, this has some major inconveniences.

  • Configuring synchronization tools for a sparse file tree is, at the very least, extremely cumbersome.

  • Installing, configuring, or using a cloud synchronization tool might not be allowed on company-provided devices.

  • It is also not possible to back up files that should not be synchronized because they are device-, OS-, or context-dependent (personal versus professional), like a ~/.ssh/config or ~/.gitconfig.

Rethinking the problem

The main problem was the sparse file tree to keep in sync. So what if there were a way to centralize those files in a dedicated folder, which we will call a repository?

At this stage, there are two options to install those files at the desired location, which we will call the directory: either copy them to their destination with cp, rsync, or any other tool, or install them by creating a symbolic link. For example: $HOME/.bashrc → $HOME/config/.bashrc.
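
The two options can be sketched in a few shell commands. This is a minimal illustration using a temporary sandbox instead of the real $HOME; the file names are just examples.

```shell
# Sandbox standing in for $HOME, with a "repository" folder inside it
home=$(mktemp -d)
repo="$home/config"
mkdir -p "$repo"
echo 'alias ll="ls -l"' > "$repo/.bashrc"

# Option 1: install by copying the file to its destination
cp "$repo/.bashrc" "$home/.bashrc.copy"

# Option 2: install by creating a symbolic link back to the repository
ln -s "$repo/.bashrc" "$home/.bashrc"

ls -l "$home/.bashrc"   # shows: .bashrc -> .../config/.bashrc
```

With option 1 the two files are now independent; with option 2 there is a single file, reachable from both paths.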

Let’s take a look at the two strategies: copying versus linking.

Copying strategy

The copying strategy creates a copy of the repository file in the directory. This means the workflow is to:

  • Modify a file in the repository

  • Trigger the synchronization from repository to directory

This strategy has the following limitations:

  • The visibility of where the file is coming from is limited.

  • Changes made by tools modifying the files directly in the directory will not be visible in the repository. A tool would be necessary to compare the directory and repository contents and perform a merge.

  • It is hard to find out that a file was deleted from the repository and should thus be deleted from the directory on the next synchronization. A history of synchronizations would need to be kept to handle such cases.
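
The deletion limitation is easy to demonstrate. This sketch uses a temporary sandbox instead of $HOME; the file name is illustrative.

```shell
# Sandbox standing in for $HOME, with a "repository" folder inside it
home=$(mktemp -d); repo="$home/config"; mkdir -p "$repo"
echo 'set -o vi' > "$repo/.bashrc"

# Workflow: modify a file in the repository, then sync it to the directory
cp "$repo/.bashrc" "$home/.bashrc"

# Later, the file is deleted from the repository...
rm "$repo/.bashrc"
# ...but the copy in the directory remains, and nothing records
# that it should now be deleted on the next synchronization:
[ -f "$home/.bashrc" ] && echo "stale copy left behind"
```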

Linking strategy

The linking strategy creates a symbolic link from the directory file to the file in the repository. This means the workflow is to:

  • Modify a file in the repository or in the directory; changes are immediately effective

  • Only when adding new files or deleting old ones is it necessary to trigger the synchronization from repository to directory

This strategy has the following advantages:

  • The visibility of where the file is coming from is explicit and held by the symbolic link.

  • Changes done by tools modifying the files in the directory will be immediately visible in the repository.

  • It is possible to create symbolic links to directories and not only to files.

  • It is easy to find out that a file was deleted from the repository as there would be a dangling symbolic link in the directory pointing to the repository. No synchronization log is needed as the links hold all the information themselves.

It is possible to use hard links instead of symbolic links, but then the limitations of the copying strategy regarding deletions would kick back in.
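
Both advantages — immediate visibility of edits and self-describing deletions — can be seen in a short sketch, again in a temporary sandbox instead of $HOME:

```shell
# Sandbox standing in for $HOME, with a "repository" folder inside it
home=$(mktemp -d); repo="$home/config"; mkdir -p "$repo"
echo 'set -o vi' > "$repo/.bashrc"
ln -s "$repo/.bashrc" "$home/.bashrc"

# An edit through the link lands directly in the repository file
echo 'alias g=git' >> "$home/.bashrc"
grep -q 'alias g=git' "$repo/.bashrc" && echo "change visible in repository"

# A deletion in the repository leaves a dangling link that exposes it;
# no synchronization log is needed
rm "$repo/.bashrc"
[ -L "$home/.bashrc" ] && [ ! -e "$home/.bashrc" ] && echo "orphan link detected"
```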

Introducing Symly: from a sparse file tree to a centralized one

Symly is a tool I created that replicates the repository file tree structure in the directory. It applies the linking strategy to create links in the desired directory pointing to the files in a repository.

Symly is free and Open Source released under the Apache License Version 2.0.

It provides the following features:

  • Replication of the repository file tree structure in the directory

  • Symbolic link creation/update/deletion for files (default mode)

  • Symbolic link creation/update/deletion for directories (on demand only)

  • Orphan symbolic links deletion

  • Support for multiple repositories

  • Layering of repositories, allowing for custom files and default ones

  • Not limited to dotfiles; it can be used for any other directory/file types.

It is based on the following principles:

The repository file tree is the state

This principle has the following implications:

  • No command to add a file to a repository, just drop it there!

  • No command to delete a file from a repository, just delete it!

  • No command to edit a file in a repository, just edit it directly or through its symbolic link in the directory! This allows for seamless integration with tools modifying dotfiles directly in the directory (like git config …)

  • Immediate visibility on modifications made on the directory files

Symly is not a synchronization tool

Symly enables any synchronization tool by providing it a single centralized folder to work on: the repository. Whether you choose a cloud-based solution, a developer-oriented solution like git, or a simple rsync is up to you. Updates, diffs, and conflict management are thus the responsibility of those tools.

Here is a simple example where we consider the following ~/config/dotfiles repository:

 |-- .bashrc
 |-- .gitconfig
 \-- .config
     |-- starship.toml
     \-- fish

Let’s call Symly with the ~/config/dotfiles repository and use the user’s home folder as the directory.

> symly link --dir ~ --repositories ~/config/dotfiles

This will create the following symbolic links:

$HOME/.bashrc                ->  $HOME/config/dotfiles/.bashrc
$HOME/.gitconfig             ->  $HOME/config/dotfiles/.gitconfig
$HOME/.config/starship.toml  ->  $HOME/config/dotfiles/.config/starship.toml
$HOME/.config/fish/          ->  $HOME/config/dotfiles/.config/fish/

Combining Symly and a synchronization tool

One of Symly’s strengths, or weaknesses, depending on the point of view, is that it is not a synchronization tool. Hence, you can use the tool of your choice, or the one best suited for the job.

You do not need multiple computers to experience the diffs, backups, and other benefits of using the tool of your choice for this task.

In my case, even if I could use any cloud file synchronization tool like Tresorit, Dropbox, or Google Drive, or even a simple rsync, I decided to use git.

Why git? Well, git is not a file synchronization tool, but I realized that the way I work with my dotfiles is actually more like a development workflow, hence git.

With the Symly + git combination I get:

  • Overview of current changes with git diff/git status: no risk of a silent modification of my config files by some external tool going unnoticed.

  • History of changes with git log and revert to older versions in case of issues.

  • Branches for experimental changes.

In short, all the power of git for managing my dotfiles without the need to learn a new tool.
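
The workflow can be sketched as follows. This is a minimal example in a sandboxed path; the file content and commit message are just illustrations.

```shell
# Hypothetical setup: the dotfiles repository is also a git repository
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.name "Example"            # local identity for this sandbox
git config user.email "me@example.com"

echo 'alias ll="ls -l"' > .bashrc
git add .bashrc
git commit -qm "Add .bashrc"

# Some tool (or an edit through a symlink in $HOME) modifies the file...
echo 'alias g=git' >> .bashrc
git status --short                        # ...and the change cannot go unnoticed
git diff                                  # review it, then commit or revert
```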

Going further

In this article, I explained the basic capabilities of the tool. In the next article, I will walk you through what I consider the major strength of Symly: repository layering. This amazing feature allows the linking to be performed differently depending on the context (work/personal, macOS/Linux, …).

Until then, you can take a look at the documentation, installation instructions, and roadmap on GitHub.

I’d be happy to hear your feedback and opinion.