Performance in .NET – Part 1
Updated: thanks, Paulo Morgado!
Updated: see the second post here and the third here.
Introduction
Over the years I have written a few posts about performance in the .NET world. Some were tied to specific frameworks, such as NHibernate or Entity Framework, while others focused on the generic bits. In this series of posts I will summarize my findings on .NET in general, namely:
- Object creation (this post)
- Object cloning
- Value Types versus Reference Types
- Collections
- Possibly other stuff
I won’t be talking about object serialization, as there are lots of serializers out there, each with its pros and cons. In general, I’d say serializing to and from JSON or to and from a binary format seem to be the most in demand, and each has quite a few options, either provided by Microsoft or by third parties. The actual usage also affects what we want – is it a general-purpose serializer or one for a particular use case, which needs classes prepared accordingly? Let’s keep it out of this discussion.
As always, feel free to reach out to me if you want to discuss any of these! So, let’s start with object creation.
Object Creation
Let’s start by defining our purpose: we want to be able to create object instances of a given type as fast as possible. We have a number of strategies:
- Using the new operator
- Using Reflection
- Using FormatterServices.GetUninitializedObject
- Using System.Reflection.Emit code generation
- Using Activator.CreateInstance
- Using LINQ expressions
- Using delegates
- Using Roslyn
Let’s cover them all one by one.
Using the new Operator
This is the most obvious (and fastest) option, but it does not play well with dynamic instantiation, meaning the type to instantiate needs to be hardcoded. I call it direct instantiation, and it goes like this (you know, you know…):
var obj = new Xpto();
This should be the baseline for all comparisons, as it offers the best possible performance.
Using Reflection
Here I’m caching the public parameterless constructor and invoking it, then casting the result to the target type:
var ci = typeof(Xpto).GetConstructor(Type.EmptyTypes);
var obj = ci.Invoke(null) as Xpto;
Just avoid getting the constructor over and over again: do it once for each type and cache it somewhere, as in the sketch below.
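A minimal caching sketch, assuming a static helper class of your own (the names CreateWithReflection and _constructors are mine, not part of any framework), and requiring using System.Collections.Concurrent and System.Reflection:

// cache the public parameterless constructor per type
private static readonly ConcurrentDictionary<Type, ConstructorInfo> _constructors = new ConcurrentDictionary<Type, ConstructorInfo>();

public static object CreateWithReflection(Type type)
{
    // GetOrAdd looks up the constructor only once per type
    var ci = _constructors.GetOrAdd(type, t => t.GetConstructor(Type.EmptyTypes));
    return ci.Invoke(null);
}

// usage
var obj = CreateWithReflection(typeof(Xpto)) as Xpto;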
Using FormatterServices.GetUninitializedObject
The GetUninitializedObject method is used internally by some serializers. What it does is merely allocate memory for the target type and zero all of its fields, without running any constructor. As a consequence, any field or property initializers will not run, so use it with care. It is available in .NET Core:
var obj = FormatterServices.GetUninitializedObject(typeof(Xpto)) as Xpto;
Keep in mind that none of your type’s constructors are executed and no fields or properties get their initial values; everything starts at the default for its type (null for reference types, zero for value types).
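To make this caveat concrete, here is a small example with a hypothetical Sample class (not from the benchmark) that has both a field initializer and a constructor; it requires using System.Runtime.Serialization:

public class Sample
{
    public string Name = "initialized";           // field initializer
    public Sample() { Name = "from constructor"; } // constructor
}

var s1 = new Sample();
var s2 = FormatterServices.GetUninitializedObject(typeof(Sample)) as Sample;

Console.WriteLine(s1.Name); // "from constructor"
Console.WriteLine(s2.Name); // null – neither the initializer nor the constructor ran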
Using System.Reflection.Emit code generation
This one uses the code generation API that is built into .NET (but not .NET Core, for the time being):
var m = new DynamicMethod(string.Empty, typeof(object), null, typeof(Xpto), true);
var ci = typeof(Xpto).GetConstructor(Type.EmptyTypes);
var il = m.GetILGenerator();
il.Emit(OpCodes.Newobj, ci);
il.Emit(OpCodes.Ret);
var creator = m.CreateDelegate(typeof(Func<object>)) as Func<object>;
var obj = creator() as Xpto;
As you can see, we are generating a dynamic method whose body simply does “new Xpto()”, turning it into a delegate and executing it.
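Since building the DynamicMethod has a cost of its own, this approach only pays off if the generated delegate is cached per type. A possible sketch, under that assumption (the _factories and CreateWithEmit names are mine), requiring using System.Collections.Concurrent and System.Reflection.Emit:

// cache one emitted factory delegate per type
private static readonly ConcurrentDictionary<Type, Func<object>> _factories = new ConcurrentDictionary<Type, Func<object>>();

public static object CreateWithEmit(Type type)
{
    // assumes a reference type with a public parameterless constructor
    var creator = _factories.GetOrAdd(type, t =>
    {
        var m = new DynamicMethod(string.Empty, typeof(object), null, t, true);
        var il = m.GetILGenerator();
        il.Emit(OpCodes.Newobj, t.GetConstructor(Type.EmptyTypes));
        il.Emit(OpCodes.Ret);
        return (Func<object>)m.CreateDelegate(typeof(Func<object>));
    });
    return creator();
}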
Using Activator.CreateInstance
This is essentially a wrapper around the reflection code I’ve shown earlier, with the drawback that it does not cache each type’s public parameterless constructor:
var obj = Activator.CreateInstance(typeof(Xpto)) as Xpto;
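There is also a generic overload that avoids the cast when the type is known at compile time (it still goes through the same Activator machinery, so don’t expect it to change the picture much):

var obj = Activator.CreateInstance<Xpto>(); // no cast needed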
Using LINQ expressions
The major drawback of this approach is the time it takes to build the actual code (the first call to Compile). After that, it should be fast:
var ci = typeof(Xpto).GetConstructor(Type.EmptyTypes);
var expr = Expression.New(ci);
var del = Expression.Lambda(expr).Compile();
var obj = del.DynamicInvoke() as Xpto;
Of course, if you are going to call this a number of times for the same type, it is worth caching the compiled delegate for each type, as in the sketch below.
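A possible refinement is to compile to a strongly typed Func<object> instead of calling DynamicInvoke, and to cache the compiled delegate per type; a sketch under those assumptions (CreateWithExpression and _compiled are my names), requiring using System.Collections.Concurrent and System.Linq.Expressions:

private static readonly ConcurrentDictionary<Type, Func<object>> _compiled = new ConcurrentDictionary<Type, Func<object>>();

public static object CreateWithExpression(Type type)
{
    // works for reference types; a value type would need an Expression.Convert to object
    var creator = _compiled.GetOrAdd(type, t =>
        Expression.Lambda<Func<object>>(Expression.New(t.GetConstructor(Type.EmptyTypes))).Compile());
    return creator();
}

// usage
var obj = CreateWithExpression(typeof(Xpto)) as Xpto;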
Using Delegates
The LINQ expressions approach essentially compiles down to this, but here the delegate is strongly typed and written by hand:
Func<Xpto> del = () => new Xpto();
var obj = del();
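A related generic variant uses the new() constraint; note that, as far as I know, the compiler turns this into a call to Activator.CreateInstance<T>() under the hood, so it should measure closer to the Activator row than to the hand-written delegate:

public static T Create<T>() where T : new()
{
    return new T(); // compiled as a call to Activator.CreateInstance<T>()
}

// usage
var obj = Create<Xpto>();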
Using Roslyn
This one is relatively new in .NET. As you may know, Microsoft now uses Roslyn to both parse and generate code dynamically. The scripting capabilities are made available through the Microsoft.CodeAnalysis.CSharp.Scripting NuGet package. The actual code for instantiating a class (or actually executing any code) dynamically goes like this:
var obj = CSharpScript.EvaluateAsync("new Xpto()").GetAwaiter().GetResult() as Xpto;
Do keep in mind that the Roslyn scripting API is asynchronous by nature, so you need to wait for the result. Also, remember to use the full namespace of your type (or add the proper imports), which I omitted for brevity. There are other APIs that allow you to compile the code once and reuse the compilation:
var script = CSharpScript.Create<Xpto>("new Xpto()", ScriptOptions.Default.AddReferences(typeof(Xpto).Assembly));
var runner = script.CreateDelegate();
var obj = runner().GetAwaiter().GetResult();
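If you would rather not fully qualify the type inside the script, ScriptOptions also lets you add imports; a sketch along those lines, reusing the compiled runner in a loop (replace “My.Namespace” with the namespace where Xpto actually lives):

var options = ScriptOptions.Default
    .AddReferences(typeof(Xpto).Assembly)
    .AddImports("My.Namespace"); // placeholder namespace

var script = CSharpScript.Create<Xpto>("new Xpto()", options);
var runner = script.CreateDelegate();

for (var i = 0; i < 10; ++i)
{
    var obj = runner().GetAwaiter().GetResult(); // the compilation is reused, only the delegate runs
}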
Conclusion
Feel free to run your own tests, with a few iterations, and look at the results. Always compare against the normal way to create objects, the new operator. Do not forget the caveats of each approach, like the need to cache something or the limitations on the instantiated object.
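For reference, this is roughly the kind of loop I mean; a minimal Stopwatch-based sketch (Measure is my name; something like BenchmarkDotNet would be a more rigorous option), requiring using System.Diagnostics:

public static double Measure(Func<object> creator, int iterations = 1000)
{
    creator(); // warm up (JIT, internal caches)
    var watch = Stopwatch.StartNew();
    for (var i = 0; i < iterations; ++i)
    {
        creator();
    }
    watch.Stop();
    return (double) watch.ElapsedTicks / iterations; // average ticks per instantiation
}

// usage
Console.WriteLine(Measure(() => new Xpto()));
Console.WriteLine(Measure(() => Activator.CreateInstance(typeof(Xpto))));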
On my machine, for 1000 iterations, run a couple of times, I get these average results (elapsed ticks):
| Technique | Elapsed ticks |
| --- | --- |
| Direct | 0.148 |
| FormatterServices.GetUninitializedObject | 0.324 |
| Activator.CreateInstance | 0.296 |
| Reflection | 0.6 |
| IL | 0.557 |
| LINQ Expression | 4.085 |
| Delegate | 0.109 |
| Roslyn | 2400.796 |
Some of these may be surprising to you, as they were to me! It seems reflection is not as much slower than direct instantiation as one might think… hmmm…
As usual, I’d love to hear your thoughts on this! More to come soon!