Making an exact copy of an object isn’t an everyday use case for most programming APIs, but there are certain tasks where the ability to do so is vital. It’s not an easy task per se, and it requires a lot of thought when executed on a larger scale. When designing a modular framework like Duality, where every user can easily add custom classes into the realm of conveniently automated behavior, things get even worse. Fingers crossed, I might finally have found a solution.

Different Concepts

Before we begin, let’s talk about cloning in general. It is a complex topic that deserves some thought. Let’s say you encounter the following code snippet:
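In C#, it might be as simple as the following few lines (the type and method names here are purely illustrative):

```csharp
// Create a copy of an existing object. What exactly happens
// here depends entirely on how Clone is implemented.
MyObject original = new MyObject();
MyObject copy = original.Clone();

// Use the copy independently of the original... or so we hope.
copy.DoSomething();
```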

Although these few lines seem fairly straightforward, the actual result could vary greatly, depending on the implementation of the clone method in question. Cloning as such isn’t strictly bound to one definition and what exactly it means often depends on use case requirements. In my case, I was searching for a cloning algorithm that satisfies the following definition:

A clone is a new object that, upon creation, equals the original in data and behavior, without being bound to the context in which it was created or the object it was initialized from.

As it turns out, this goal is surprisingly hard to reach. There are several common methods for implementing cloning algorithms, but unfortunately, most of them break fairly quickly, even if they looked promising at first.

Method 1: Shallow Copy

The target object can be created by instantiating the appropriate type and assigning each of its fields the same value as in the source object. This is what's called a flat, or shallow copy. It is what you get when calling MemberwiseClone, and it is very rarely what you really want. Just imagine the object internally holding a List of strings: after a shallow copy, both source and target refer to the same list, and modifying it through either the original or the cloned object now affects both.
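A minimal sketch of the problem, using a hypothetical container class:

```csharp
using System.Collections.Generic;

Inventory source = new Inventory();
source.Items.Add("Sword");

Inventory target = source.ShallowClone();
target.Items.Add("Shield");

// Both objects share one and the same list instance:
// source.Items now contains both "Sword" and "Shield".

public class Inventory
{
    public List<string> Items = new List<string>();

    // MemberwiseClone copies each field by value; for reference
    // types like List<string>, that "value" is the reference itself.
    public Inventory ShallowClone()
    {
        return (Inventory)this.MemberwiseClone();
    }
}
```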


Method 2: Deep Copy

All target fields could be assigned a value that is equal to the one from the source object, but not the same instance. This is what's called a deep copy. It can be achieved using a reflection-driven algorithm, or by exploiting serialization to write an object to memory and read it back. If the source object internally holds a List of strings, its clone holds a new List of strings with equal contents, which is a lot better than a shallow copy – but just as likely to break your stuff: if there is an internal reference to a singleton class, there are now two instances of that singleton, and the cloned object has its own, private one.
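A naive reflection-driven deep copy might look like this sketch. Note that it happily duplicates everything it finds, including objects that were never meant to be duplicated, and ignores cycles, arrays, and types without parameterless constructors:

```csharp
using System;
using System.Reflection;

public static class NaiveDeepCopy
{
    // A (very) naive deep copy: it duplicates every reference it
    // encounters, with no notion of ownership. Even a referenced
    // singleton gets its own private duplicate.
    public static object Clone(object source)
    {
        if (source == null) return null;
        Type type = source.GetType();
        if (type.IsValueType || type == typeof(string)) return source;

        object target = Activator.CreateInstance(type);
        foreach (FieldInfo field in type.GetFields(
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic))
        {
            field.SetValue(target, Clone(field.GetValue(source)));
        }
        return target;
    }
}
```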

Method 3: Manual Copy

Examining these options, none of them seems like a suitable candidate for a system that should be able to clone any kind of object. In most cases, the easiest way to clone an object is to let it implement its own custom method. After all, each object knows best how it wants to be handled, right? Let's call this approach the manual copy. It can do just about anything, and you simply have to trust its documentation regarding internal mechanisms and public results, but it has by far the highest success rate of all the above methods.


For the longest time, Duality has relied on the manual copy approach, combined with a shallow copy fallback for user-defined classes that handled all ICollection types in a hardcoded deep copy special case. The fallback worked reasonably well, and in case something was a little off, the user could always implement an explicit cloning method using the manual copy approach to save the day. The biggest drawback was the maintenance burden of all those copy methods, where each new field had to be copied using hand-written lines of code. I wasn't very happy with it, but I accepted the fact that it was the only cloning method that worked reliably.

That was until the day it broke.

The Use Case That Broke It

As it turns out, even when doing a manual copy for every single object and completely relying on user-written code, cloning can still break under certain circumstances, and the only practicable way around it is an automated cloning system. In order to explain why, let’s see what happened first:


The image above shows the (simplified) object graph that first exposed problems with the previous system. When cloning the Scene at the root of that graph, its manual copy method will be invoked, and being the owner of its GameObjects, the Scene obviously needs to clone them as well, instead of just assigning their references. Being a good object-oriented fellow, it doesn't attempt to do so itself, but calls each GameObject's cloning method. The GameObject acts in the same way with its Components, so their manual cloning methods will be called as well. So far, so good.

Our problem arises when the Component encounters a reference to a different Component: Since it doesn’t own that other Component, it will choose to simply reference it instead – we don’t want to end up with additional Components after all! However, this reference now points to an original Component, not the cloned one, which is obviously not what we want!


This cannot be fixed from within a custom Clone method: the one provided by Component doesn't have the required overview to handle problems like this, and the Scene's Clone method doesn't have the required knowledge about the internals of each Component. To fix this, there needs to be a cloning system that keeps track of the operation as a whole. A manual copy approach alone can't solve this.

That said, a cloning system as such isn't something new in the realms of the Duality framework, but despite certain mechanics intended to solve problems like the one above, it still failed in the example case. It maintained a mapping between objects from the source graph and objects from the target graph, so each object's Clone method could request the target equivalent of an arbitrary source reference. This is cool so far, and it works in a lot of cases. However, as soon as there is a circular reference like the one between the two Components, one of them is going to be copied first – and at that point, the cloning system simply hasn't seen the other one yet, let alone its target equivalent. This problem was only the tip of the iceberg, so I finally decided to completely rewrite the cloning system in Duality and replace it with a better one.
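The old approach can be sketched roughly like this (hypothetical names, not the actual Duality API):

```csharp
using System.Collections.Generic;

public class CloneRegistry
{
    private readonly Dictionary<object, object> map =
        new Dictionary<object, object>();

    // Invoked by each Clone method after creating its target object.
    public void Register(object source, object target)
    {
        this.map[source] = target;
    }

    // Invoked by each Clone method to resolve a source reference.
    // With a circular reference, one of the two objects is always
    // cloned first, and its lookup of the other one fails here,
    // because that other object hasn't registered a target yet.
    public object GetTargetOf(object source)
    {
        object target;
        return this.map.TryGetValue(source, out target) ? target : null;
    }
}
```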

A Better Cloning System

Over the course of the last months, I've given a lot of thought to what an »ideal« cloning system could look like, and came to the conclusion that the question of ownership is central to its implementation: if object A has a field that references object B, does A own B, or does it simply refer to B? The answer to that question makes the difference between deep-cloning a child object and simply referring to it – and in order to find the correct answer, cloning needs to be a context-dependent, two-step process:

  1. Explore the source graph and allocate the target graph.
  2. Copy data from source to target using the previously gathered ownership and mapping information.

In the first step, no data is copied at all. Its purpose is to determine which objects are in a direct or indirect ownership relation to the cloning operation's source object, and to provide default-initialized target instances for all of them. Exploring the source graph can be done by using reflection to recursively follow all references of the root object.

By default, when an object references another one, ownership is assumed. Keep in mind that ownership is only relevant to us with regard to the cloning operation's root object: if root object R owns object A, which owns object B, then ownership of both A and B will be assumed. Objects that are not assumed to be owned by the root object will not be investigated further, since they are potentially mighty goblin gods and definitely none of the cloning system's business.
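The two steps can be sketched like so. This is a hypothetical, heavily simplified version: arrays, collections, value types with nested references, and types without parameterless constructors are not handled, and every reference is treated as an ownership reference.

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

public class CloneOperation
{
    private readonly Dictionary<object, object> targetMap =
        new Dictionary<object, object>();

    public object CloneRoot(object root)
    {
        this.PrepareCloneGraph(root);        // step 1: allocate targets
        foreach (var pair in this.targetMap) // step 2: copy data
            this.CopyFields(pair.Key, pair.Value);
        return this.targetMap[root];
    }

    // Step 1: walk the source graph and create a default-initialized
    // target instance for every owned object. No data is copied yet,
    // so circular references are harmless here.
    private void PrepareCloneGraph(object source)
    {
        if (source == null || this.targetMap.ContainsKey(source)) return;
        this.targetMap[source] = Activator.CreateInstance(source.GetType());
        foreach (FieldInfo field in source.GetType().GetFields(
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic))
        {
            object value = field.GetValue(source);
            if (value != null && !value.GetType().IsValueType && !(value is string))
                this.PrepareCloneGraph(value); // assume ownership by default
        }
    }

    // Step 2: copy field values. References to owned objects are
    // redirected to the target equivalents allocated in step 1, which
    // resolves circular references correctly.
    private void CopyFields(object source, object target)
    {
        foreach (FieldInfo field in source.GetType().GetFields(
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic))
        {
            object value = field.GetValue(source);
            object mapped;
            if (value != null && this.targetMap.TryGetValue(value, out mapped))
                value = mapped;
            field.SetValue(target, value);
        }
    }
}
```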

Note that without any additional knowledge specified by the programmer, all object references are ownership references, and thus the whole cloning process equals a deep copy operation as described above. The trick is that there is a special CloneBehavior attribute that can be applied to classes and fields in order to change how certain references are handled. When applied to a class, the attribute is treated like a global setting for all instances of that type. When applied to a field, the attribute is treated like a local override for the reference represented by that field. These local overrides can also target specific Types in order to correctly handle objects stored within collections.
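Usage might look something like this. The classes are made up and the behavior names are illustrative, so they may differ from the actual Duality API:

```csharp
// Class-level setting: every reference to a Texture is just that,
// a reference; Textures are never deep-cloned along with their owner.
[CloneBehavior(CloneBehavior.Reference)]
public class Texture
{
    /* ... */
}

public class SpriteRenderer
{
    // Inherits the class-level behavior: referenced, not cloned.
    private Texture sharedTexture;

    // Local override: this particular Texture is owned by the
    // renderer and will be deep-cloned along with it.
    [CloneBehavior(CloneBehavior.ChildObject)]
    private Texture privateTexture;
}
```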

Additionally, there is the concept of weak references: Imagine a tree structure of Nodes, where each Node can contain N child Nodes and 1 parent Node:

When cloning a certain Node from a Node tree, we expect all of its child Nodes to be cloned as well, but not its parent Node. However, we don't even want to reference that parent Node, since this would introduce an inconsistency into the hierarchy, where our cloned Node sees itself as a child of a (source) object that doesn't even know it. At the same time, we can't simply ignore the parent field in general, because the cloned child Nodes should definitely refer back to their cloned parent Node. This is where weak references can help:

A weak reference behaves exactly the same as a regular reference, as long as at least one ownership relation to the object in question is detected. If the object isn't owned by any object in the source graph, weak references to it will simply be skipped during cloning, and the affected fields retain their default value. In this case, it means that the parent field of a cloned Node will be null if the parent isn't part of the cloned object hierarchy itself.
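The Node class from above could then be annotated like this, assuming the same illustrative CloneBehavior attribute as before:

```csharp
using System.Collections.Generic;

public class Node
{
    // Child Nodes are owned by default and are cloned along
    // with this Node.
    private List<Node> children = new List<Node>();

    // Weak reference: mapped to its clone when the parent is part
    // of the cloned graph, reset to null (the field's default value)
    // when it isn't.
    [CloneBehavior(CloneBehavior.WeakReference)]
    private Node parent;
}
```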

In addition to these attributes, it is also possible to ignore certain fields completely using a special flag, and to provide a manual implementation of cloning behavior by implementing a certain interface. However, thanks to the available field and class attributes, custom implementations have rarely been necessary so far.

I could probably spend a lot more time talking about automated cloning using reflection, but this is what's left when boiled down to its essence. Feel free to take a look at it on GitHub, but keep in mind that this is still a work-in-progress topic: you'll find it in the cloning branch, subfolder Duality/Cloning.

Due to the ongoing transition from the old system, that specific Duality branch shouldn’t be used in production. I’ll merge it back as soon as the transition is complete.