I thought I would share a few video/podcast training resources that I use from time to time.
PDC 2010: http://player.microsoftpdc.com/session
PDC 2009 : http://www.microsoftpdc.com/2009/KEY01
DEVDAYS 2010: http://channel9.msdn.com/Tags/devdays-2010-nl
DEVDAYS 2011: http://channel9.msdn.com/Events/DevDays/DevDays-2011-Netherlands
NDC 2011: http://www.ndc2011.no/agenda.aspx?cat=1071
NDC 2010: http://www.ndc2010.no/agenda.aspx?cat=1071
Silverlight Videos
Silverlight TV : http://channel9.msdn.com/Shows/SilverlightTV
Expression Blend Videos
Expression Blend 4 Videos : http://expression.microsoft.com/en-us/cc197141.aspx
General
Dot Net Rocks TV: http://www.dnrtv.com/
Channel 9: http://channel9.msdn.com/
Podcasts
Dot Net Rocks: http://www.dotnetrocks.com/
Wednesday, 14 September 2011
Monday, 13 June 2011
Strategy Pattern
In this post I thought I would share my thoughts on a design pattern that has changed the way I write code. The beauty is in its simplicity: it's easy to implement and doesn't unnecessarily complicate the problem domain; in fact, it simplifies it and leads you gently towards the wonderful world of Test Driven Development (TDD). The pattern is the strategy pattern.
The Strategy Pattern is also known as "constructor injection" (or simply called "plug-ins" by those who haven't read Design Patterns).
The strategy pattern is taken to the next level by the Dependency Injection pattern and can be implemented programmatically (or declaratively) using an IoC (Inversion of Control) container. IoC containers are powerful in the sense that every service in your application that has access to the container has access to all the services registered with it. These patterns also make it easier to write tests for the services in your application because they rely heavily on interfaces.
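To make the idea concrete, here is a minimal sketch of the strategy pattern implemented through constructor injection. The interface and class names are hypothetical and purely illustrative, not from any specific framework:

// The "strategy" contract the consuming service depends on.
public interface IShippingCostStrategy
{
decimal Calculate(decimal orderTotal);
}
// Two interchangeable strategies.
public class FlatRateShipping : IShippingCostStrategy
{
public decimal Calculate(decimal orderTotal) { return 10m; }
}
public class FreeShippingOverThreshold : IShippingCostStrategy
{
public decimal Calculate(decimal orderTotal) { return orderTotal >= 100m ? 0m : 10m; }
}
// The consumer receives its strategy through the constructor,
// so a test (or an IoC container) can plug in any implementation.
public class OrderService
{
private readonly IShippingCostStrategy _shippingStrategy;
public OrderService(IShippingCostStrategy shippingStrategy)
{
_shippingStrategy = shippingStrategy;
}
public decimal CalculateTotal(decimal orderTotal)
{
return orderTotal + _shippingStrategy.Calculate(orderTotal);
}
}

An IoC container simply automates the last step: it inspects OrderService's constructor and supplies whichever IShippingCostStrategy has been registered.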
How Dependency Injection is "abused":
The problem is that your code is no longer self-documenting. Just by looking at a method that pulls services out of the container, you can never be sure which concrete service it will receive. This makes your code feel a little like "black magic". Extracting services from a container also carries more of a performance hit than wiring your services up programmatically.
TDD (Test Driven Development):
This concept is about writing a programmatic test for a method/function call to ensure that the expected result is returned for the provided parameters, be it an exception or a concrete data value. Unit tests also allow for behavioural testing by recording the expected inner service calls using a mocking framework such as Rhino Mocks.
This can be extremely useful if you have a deep framework with multiple layers of functionality where a change in one area can have a ripple effect on other areas.
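As an illustration, here is a minimal NUnit-style test sketch that reuses the hypothetical OrderService and IShippingCostStrategy from the sketch above. It uses a hand-rolled fake rather than a generated Rhino Mocks mock, and checks both the returned value and the behaviour expected of the injected strategy:

using NUnit.Framework;
[TestFixture]
public class OrderServiceTests
{
// Hand-rolled fake standing in for a generated mock.
private class FakeShippingStrategy : IShippingCostStrategy
{
public int CallCount;
public decimal Calculate(decimal orderTotal) { CallCount++; return 5m; }
}
[Test]
public void CalculateTotal_AddsShippingAndCallsStrategyOnce()
{
FakeShippingStrategy fake = new FakeShippingStrategy();
OrderService service = new OrderService(fake);
decimal total = service.CalculateTotal(50m);
Assert.AreEqual(55m, total); // state-based assertion
Assert.AreEqual(1, fake.CallCount); // behavioural assertion
}
}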
Misconceptions, and mistakes developers make with regards to TDD:
1. Writing/generating unit tests that test every single property of every object in the system. This is a waste of time and causes code bloat.
2. Unit tests are generated/written for the system AFTER the application has been completed. This is useful, although only from a "white box" testing perspective.
The key ideas to keep in mind are:
1. What are the root or core features of my application that I need unit tests for? These services will normally contain the more complex code.
2. Unit tests are more useful if they are written BEFORE the system has been built. This will facilitate the design of your services and give you a deeper insight into their future extensibility requirements.
3. The unit tests serve as "specifications" for future developers who would like to use your libraries. They also provide insight into the behaviour of the APIs your libraries expose.
TDD should be used to drive out the flow and design of the system you are working on.
Labels:
TDD
WCF ProtectionLevel performance test
Normally you should have your security settings turned on, even within an intranet environment.
There are times, though, when performance is critical and you may want to tweak the relevant WCF security settings.
I thought I would have a look at the effect of the ProtectionLevel security setting, which can be set imperatively or declaratively, so I wrote the following performance test to evaluate the three possible settings.
using System;
using System.Linq;
using System.ServiceModel;
using System.Net.Security;
using System.Diagnostics;
namespace ProtectionLevel1
{
//This is the minimum protection level that the binding must comply with.
[ServiceContract(ProtectionLevel = ProtectionLevel.None)]
public interface IConvertString
{
[OperationContract]
string ConvertString(string input);
}
// Simple service which reverses a string and returns the result
public class ConvertStringService : IConvertString
{
#region IConvertString Members
public string ConvertString(string input)
{
char[] chars = input.ToCharArray();
// Array.Reverse reverses in place; Enumerable.Reverse would only return a new sequence.
Array.Reverse(chars);
// new string(chars) returns the reversed text; chars.ToString() would return "System.Char[]".
return new string(chars);
}
#endregion
}
class Program
{
// Setup the Service host
private static ServiceHost StartServer<T>(string Uri, NetTcpBinding binding)
{
Uri uri = new Uri(Uri);
ServiceHost host = new ServiceHost(typeof(ConvertStringService));
host.AddServiceEndpoint(typeof(T), binding, uri);
host.Open();
Console.WriteLine("Service opened using NetTcpBinding with {0} protection level.", Enum.GetName(typeof(ProtectionLevel), binding.Security.Transport.ProtectionLevel));
return host;
}
// Create channel using NetTcp binding with Uri and execute performance test.
private static double RunPerformanceTestOnNetTcpBinding(NetTcpBinding binding, string endpointUri)
{
const int iterations = 300;
const string stringToConvert = "12";
binding.Security.Mode = SecurityMode.Transport;
Stopwatch watch = new Stopwatch();
watch.Start();
for (Int32 i = 0; i < iterations; i++)
{
IConvertString channel = ChannelFactory<IConvertString>.CreateChannel(binding,
new EndpointAddress(endpointUri));
channel.ConvertString(stringToConvert);
// Close the channel so each iteration does not leak an open connection.
((IClientChannel)channel).Close();
}
watch.Stop();
Console.WriteLine("ConvertString with ProtectionLevel {2} : {0} times; Time taken : {1} ",
iterations.ToString(),
watch.Elapsed.TotalSeconds.ToString(),
Enum.GetName(typeof(ProtectionLevel), binding.Security.Transport.ProtectionLevel));
return watch.Elapsed.TotalSeconds;
}
static void Main(string[] args)
{
string uri = "net.tcp://localhost:7000/IConvertString";
NetTcpBinding bindingNoEncryption = new NetTcpBinding();
bindingNoEncryption.Security.Transport.ProtectionLevel = ProtectionLevel.None;
NetTcpBinding bindingWithSign = new NetTcpBinding();
bindingWithSign.Security.Transport.ProtectionLevel = ProtectionLevel.Sign;
NetTcpBinding bindingWithEncryption = new NetTcpBinding();
bindingWithEncryption.Security.Transport.ProtectionLevel = ProtectionLevel.EncryptAndSign;
Console.WriteLine("Run performance test or press q to Quit...");
while (Console.ReadLine() != "q")
{
double noEncryptionTime;
double signTime;
double encryptionTime;
using (ServiceHost host = StartServer<IConvertString>(uri, bindingNoEncryption))
{
noEncryptionTime = RunPerformanceTestOnNetTcpBinding(bindingNoEncryption, uri);
host.Close();
}
using (ServiceHost host = StartServer<IConvertString>(uri, bindingWithSign))
{
signTime = RunPerformanceTestOnNetTcpBinding(bindingWithSign, uri);
host.Close();
}
using (ServiceHost host = StartServer<IConvertString>(uri, bindingWithEncryption))
{
encryptionTime = RunPerformanceTestOnNetTcpBinding(bindingWithEncryption, uri);
host.Close();
}
Console.WriteLine("Run performance test again or press 'q' to Quit...");
}
}
}
}
The results from my performance test show that, when using Transport security with the NetTcpBinding, ProtectionLevel.Sign performs much the same as EncryptAndSign, while there is a definite performance improvement when using ProtectionLevel.None.
Additional resources:
Understanding Protection Level
Labels:
WCF
Sunday, 12 June 2011
Text search and replace in large files using the .Net Framework
When performing search and replace in large text files you should avoid loading the entire file into memory and rather make use of the StreamReader/StreamWriter classes provided by the .Net Framework. I thought I would share this short code snippet below which makes use of these two useful classes.
using System.IO;
namespace LoadXmlTestConsole
{
public class TextReplaceConsoleApp
{
public static void Main()
{
// File to perform search and replace on
string inputFileLocation = @"D:\largeTextFileInput.xml";
// File to create with replaced text
string outputFileLocation = @"D:\largeTextFileOutput.xml";
using (StreamReader reader = new StreamReader(File.Open(inputFileLocation, FileMode.Open)))
{
using (StreamWriter writer = new StreamWriter(File.Open(outputFileLocation, FileMode.Create)))
{
ReplaceInStreams(reader, writer);
}
}
}
private static void ReplaceInStreams(StreamReader reader, StreamWriter writer)
{
while (reader.Peek() != -1)
{
string line = reader.ReadLine();
// If search text found
if (line.IndexOf("Male") > -1)
{
//Perform text replace on line
line = line.Replace("Male", "Alien");
writer.WriteLine(line);
}
else
{
writer.WriteLine(line);
}
}
}
}
}
Labels:
.Net Framework
Saturday, 11 June 2011
The importance of simplicity and common sense
I would like to share a short letter I sent to the .NetRocks show many years ago which they read out:
"I find that the two most coveted attributes in a programmer is the appreciation for simple elegant solutions and plain common sense.
It seems as though there are too many of us that enjoy making life unnecessarily complicated within the systems we write/design.
It’s easy to complicate things nowadays with the wide array of abstraction mechanisms we have at our disposal. I always respect programmers/architects who can come up with simple and scalable solutions. This inevitably pushes the system's complexity into the business rules, where it belongs.
Unfortunately, I still find that a lot of programmers/architects make life unnecessarily complicated for the sake of job security. I feel that one of the most overlooked issues by non-technical/technical management is the responsibility of the programmer (on top of getting the work done on time and within budget) to write self-documenting and maintainable code."
END
“I value simplicity over everything; I always look for simplicity.” - Quote from Anders Hejlsberg
Opportunities to reduce technical debt (or delete code) occur when I can identify the following:
a) Code that has been duplicated rather than properly encapsulated into re-usable units.
b) Areas where the wheel has been re-invented. An API or set of tools/libraries already exists that does the job in a more elegant (and probably more efficient) manner.
c) Solutions where there are unnecessary abstractions or layers of indirection.
I do believe that if a software solution is working correctly and extensibility is not an immediate requirement there is no reason to refactor the code. "If it works, don't change it."
"I find that the two most coveted attributes in a programmer is the appreciation for simple elegant solutions and plain common sense.
It seems as though there are too many of us that enjoy making life unnecessarily complicated within the systems we write/design.
It’s easy to complicate things nowadays with the wide array of abstraction mechanisms we have at our disposal. I always respect programmers/architects that can come up with a simple and scalable solutions. This inevitably pushes the system's complexity into the business rules where it belongs.
Unfortunately, I still find that a lot of programmers/architects make life unnecessarily complicated for the sake of job security. I feel that the one of the most overlooked issues by non-technical/technical management is the responsibility of the programmer (on top of getting the work done on time and within budget) to write self documenting and maintainable code."
END
“I value simplicity over everything; I always look for simplicity.” - Quote from Anders Hejlsberg
Opportunities to reduce technical debt (or delete code) occur when I can identify the following:
a) Code that has been is duplicated, not properly encapsulated into re-usable units.
b) Areas where the wheel has been re-invented. An API or set of tools/libraries already existed that do the job in a more elegant (and probably more efficient) manner.
c) Solutions where there are unnecessary abstractions or layers of indirection.
I do believe that if a software solution is working correctly and extensibility is not an immediate requirement there is no reason to refactor the code. "If it works, don't change it."
Labels:
Daily thoughts
Thursday, 9 June 2011
Strong references - Reactive Extensions
I thought I would have a quick look to determine whether Reactive Extensions (Rx) creates a strong reference under the hood when one object subscribes to another object's events.
In the example below the Consumer object subscribes to the SampleData object's SampleDataChanged event. This type of strong reference causes memory leaks in applications because the garbage collector will not clean up the event consumer, even though the consumer may be out of scope. The result of my testing indicates that unfortunately there is still a strong reference: once the Consumer object has gone out of scope and garbage collection has been forced, raising the event on SampleData still causes the handler to run in the Consumer, meaning the consumer has not been garbage collected.
Can I use the System.WeakReference object to resolve this issue? I will follow up on this shortly...
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
// NOTE: Observable.FromEvent below requires a reference to the Reactive Extensions (Rx)
// assembly and the matching Rx using directive for the version you have installed.
namespace StrongReferenceTestConsole
{
class Program
{
static void Main(string[] args)
{
SampleData sampleData = new SampleData();
for (int i = 0; i < 3; i++)
{
// Create a consumer which registers a method with the sample data objects SampleDataChanged event
using (Consumer consumer = new Consumer(sampleData))
{
consumer.Name = "Consumer " + i.ToString();
//Raise the sample data object's data changed event
sampleData.RaiseSampleDataChanged("Work on data - iteration : " + i.ToString());
}
}
sampleData.RaiseSampleDataChanged("Sample Data change - Consumer outside scope");
// Force garbage collection
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();
Console.ReadLine();
}
}
public class Consumer : IDisposable
{
IDisposable _subscription;
public Consumer(SampleData sampleData)
{
// The usual method for event subscription
//sampleData.SampleDataChanged += new SampleData.SampleDataChangedEventHandler(Consumer_SampleDataChanged);
//USE RX to subscribe to events
_subscription = Observable.FromEvent<SampleDataChangedEventArgs>(sampleData, "SampleDataChanged").Subscribe(
args =>
Consumer_SampleDataChanged(sampleData, args.EventArgs)
);
}
public string Name { get; set; }
void Consumer_SampleDataChanged(object sender, SampleDataChangedEventArgs e)
{
Console.WriteLine("Sample Data Changed : {0} - Notified Consumer : {1}", e.Message, this.Name);
}
~Consumer()
{
Console.WriteLine("Consumer finalizer");
}
#region IDisposable Members
public void Dispose()
{
//UNCOMMENT TO REMOVE MEMORY LEAK - IDisposable is used in Rx to release strong reference.
//if (_subscription != null) _subscription.Dispose();
Console.WriteLine("Consumer Dispose");
}
#endregion
}
/// <summary>
/// SampleData object which exposes event SampleDataChanged
/// </summary>
public class SampleData
{
public event EventHandler<SampleDataChangedEventArgs> SampleDataChanged;
public void RaiseSampleDataChanged(string message)
{
if (SampleDataChanged != null)
{
Console.WriteLine("SampleDataChanged event has " + SampleDataChanged.GetInvocationList().Count() + " delegates wired up.");
SampleDataChangedEventArgs e = new SampleDataChangedEventArgs(message);
SampleDataChanged(this, e);
}
}
}
public class SampleDataChangedEventArgs : EventArgs
{
public SampleDataChangedEventArgs(string message)
{
this.Message = message;
}
public string Message { get; set; }
}
}
Console output with memory leak:
Console output without memory leak:
Labels:
Reactive Extensions
Type discovery in Assemblies using Attributes
I thought I would share a useful class I wrote a long time ago. It loads a file from disk and, assuming it is a .Net assembly, identifies all the types in the file that are decorated with a specific attribute.
Unfortunately, the Assembly.ReflectionOnlyLoadFrom(string filename) method also locks files, which was a surprise, but Jeffrey Richter explains the reasoning behind this in his book "CLR via C#": the loaded assembly may contain types which derive from types in a different assembly. So I had to fall back on the strategy of loading the assembly into a temporary application domain, reflecting over the assembly and then unloading the application domain, which unlocks the file and unloads it from memory.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Reflection;
namespace AssemblyDiscoveryConsole
{
// The attribute class you are looking for within external assemblies
public class SomeAttribute : Attribute
{
}
public class TypeFinder
{
public string[] LoadTypes(string assemblyFilename)
{
AppDomain appDomain = AppDomain.CreateDomain("TypeFinder");
try
{
// Load and query assemblies in the temporary domain and return the results
AssemblyReflector ar = (AssemblyReflector)appDomain.CreateInstanceAndUnwrap(
typeof(AssemblyReflector).Assembly.FullName,
typeof(AssemblyReflector).FullName);
return ar.LoadTypes(assemblyFilename);
}
finally
{
// unload the temporary domain
AppDomain.Unload(appDomain);
}
}
/// <summary>
/// Class used to load and query assemblies within "TypeFinder" Domain
/// </summary>
private class AssemblyReflector : MarshalByRefObject
{
private IEnumerable<string> EnumerateTypes(string assemblyFilename)
{
Assembly asm = Assembly.LoadFile(assemblyFilename);
if (asm == null) throw new Exception("Assembly not found");
foreach (Type t in asm.GetTypes())
{
if (t.GetCustomAttributes(typeof(SomeAttribute), true).FirstOrDefault() != null)
{
yield return t.AssemblyQualifiedName;
}
}
}
public string[] LoadTypes(string assemblyFilename)
{
return EnumerateTypes(assemblyFilename).ToArray();
}
}
}
}
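A minimal usage sketch, assuming it sits in the same project as TypeFinder (the assembly path shown is hypothetical): the call reflects over the file in a temporary AppDomain and returns the qualified names of matching types, leaving the file unlocked afterwards.

using System;
class Program
{
static void Main(string[] args)
{
TypeFinder finder = new TypeFinder();
// Hypothetical full path to the assembly you want to inspect.
string[] typeNames = finder.LoadTypes(@"C:\Temp\SomePluginAssembly.dll");
foreach (string typeName in typeNames)
{
Console.WriteLine(typeName);
}
}
}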
IIS7 Site Administration
IIS 7 site creation and administration has never been easier than with the new IIS7 management API. Like the managed Directory Services API (System.DirectoryServices.AccountManagement), this library has saved me a lot of time. Here is a code snippet to show you how to create a site, its application, application pool and root virtual directory.
Add a reference to the Microsoft.Web.Administration assembly.
Note:
1) Make sure you have IIS7 Features turned on for this library to be available (Windows features).
2) Ensure that you have a reference to System.ServiceModel.
3) Ensure that your target framework is .Net Framework 4.
Location:
C:\Windows\System32\inetsrv\Microsoft.Web.Administration.dll
Resources:
http://blogs.msdn.com/b/carlosag/archive/2006/04/17/microsoftwebadministration.aspx
http://learn.iis.net/page.aspx/165/how-to-use-microsoftwebadministration/
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.Web.Administration;
using System.IO;
using System.Diagnostics;
namespace IIS7Administration
{
class Program
{
static void Main(string[] args)
{
using (ServerManager manager = new ServerManager())
{
string siteName = "TestSite";
string poolName = "TestSitePool";
try
{
Site site = manager.Sites[siteName];
if (site == null)
{
//create the site
site = manager.Sites.CreateElement();
site.Id = 1;
if (manager.Sites.Count > 0)
site.Id = manager.Sites.Max(s => s.Id) + 1;
site.ApplicationDefaults.EnabledProtocols = "http,net.tcp";
site.SetAttributeValue("name", siteName);
manager.Sites.Add(site);
// create the site's application
Application app = site.Applications.CreateElement();
app.SetAttributeValue("path", "/");
app.SetAttributeValue("applicationPool", poolName);
app.EnabledProtocols = "http,net.tcp";
site.Applications.Add(app);
// create the sites root virtual directory
VirtualDirectory vdir = app.VirtualDirectories.CreateElement();
vdir.SetAttributeValue("path", "/");
string sitePath = Path.Combine(@"c:\inetpub\wwwroot", siteName);
if (!Directory.Exists(sitePath))
Directory.CreateDirectory(sitePath);
vdir.SetAttributeValue("physicalPath", sitePath);
app.VirtualDirectories.Add(vdir);
}
// Create the default site pool
ApplicationPool apppool = manager.ApplicationPools[poolName];
// Create the application pool if necessary
if (apppool == null)
{
apppool = manager.ApplicationPools.Add(poolName);
Debug.WriteLine("Creating new application pool : " + poolName);
}
string appPoolUserName = "jean";
string appPoolUserPass = "test";
// set the pool's identity if necessary
apppool.ProcessModel.IdentityType = ProcessModelIdentityType.SpecificUser;
apppool.ProcessModel.UserName = appPoolUserName;
apppool.ProcessModel.Password = appPoolUserPass;
// set the pool's runtime version & pipeline mode
apppool.ManagedRuntimeVersion = "v4.0";
apppool.ManagedPipelineMode = ManagedPipelineMode.Integrated;
apppool.AutoStart = true;
// setup the bindings for the site
Binding httpBinding = (from b in site.Bindings
where b.Protocol == "http"
select b).FirstOrDefault<Binding>();
if (httpBinding == null)
{
httpBinding = site.Bindings.CreateElement();
httpBinding.SetAttributeValue("protocol", "http");
httpBinding.BindingInformation = "*:" + Convert.ToString("9090") + ":";
site.Bindings.Add(httpBinding);
}
// commit the changes to IIS 7
manager.CommitChanges();
}
catch (Exception ex)
{
Debug.WriteLine("Unable to commit changes. Reason:" + ex.Message);
}
}
}
}
}
Wednesday, 25 May 2011
Silverlight 4 Memory Leaks
Although the .Net garbage collector reduces the need to concern yourself with memory management, you still need to pay attention to memory consumption and the causes of memory leaks or GC roots. This will ensure you have a well-behaved and efficient application.
Libraries/Controls which I recently used that caused memory leaks:
a) Silverlight 4 runtime:
Some major memory leaks were resolved with the runtime drop in early 2011. Make sure you are using the latest version.
b) Caliburn Micro:
Make sure you use the latest version of the source code from codeplex since a major memory leak was fixed on May 6 2011.
c) Telerik controls:
Silverlight Telerik Controls with memory leaks.
d) Silverlight 4 Toolkit Controls:
A good example of a control with memory leaks here was the ContextMenu control. I resolved the memory leak by applying the patch to the context menu source code and manually calling Dispose() on the ContextMenu control in the View's unload event. Keep your eye on the patches available for this control and the various other controls which also have memory leaks! (sigh)
Patches for the Silverlight 4 Toolkit.
Additional causes of memory leaks:
1) If you are using MVVM, shared resources declared in your Views will create GC roots. Make sure all your shared resources are declared in your App.xaml only. An example of this is when you merge your resources unnecessarily in your View, e.g.:
<UserControl.Resources>
<ResourceDictionary>
<ResourceDictionary.MergedDictionaries>
<ResourceDictionary Source="/Themes/XXX.xaml"/>
<ResourceDictionary Source="/Themes/YYY.xaml"/>
</ResourceDictionary.MergedDictionaries>
</ResourceDictionary>
</UserControl.Resources>
2) Timer callbacks. Creating a timer with an event that keeps ticking indefinitely will create this issue. Make sure you call Dispose() on the Timer when you unload the View.
3) Not unsubscribing from an Rx feed. Rx creates strong references under the hood. When one ViewModel subscribes to a feed from another ViewModel, a strong reference is created.
4) Strong references between shared objects and Service Models.
5) WPF Behaviours : By unwiring the behaviours for the Views in the Unload event using a helper class this issue was resolved.
6) WPF Fluid Move behaviours : Unfortunately there was no clear resolution to this.
7) Interaction Event Triggers which use Expression Blend's Interaction library to raise event triggers. IF these triggers use a ControlStoryBoardAction which references a static resource containing an animation that has a RepeatBehaviour set to Forever, this will cause a memory leak.
8) Each ViewModel needs to implement the Dispose pattern. When Dispose is called on a ViewModel it needs to call Clear() on all observable collections that have been bound to child views (user controls); a minimal sketch follows after this list. This will ensure that the Views are unloaded.
9) When using MEF, make sure you understand how your container setup affects the lifetime of your application's objects. There may be scenarios where you will manually need to release your Exports.
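Here is the minimal sketch mentioned in point 8, assuming a hypothetical ViewModel with collections bound to child views; the class and collection names are illustrative and not from any specific MVVM framework.

using System;
using System.Collections.ObjectModel;
public class OrdersViewModel : IDisposable
{
// Collections bound to child views/user controls.
public ObservableCollection<string> Orders { get; private set; }
public ObservableCollection<string> Alerts { get; private set; }
public OrdersViewModel()
{
Orders = new ObservableCollection<string>();
Alerts = new ObservableCollection<string>();
}
public void Dispose()
{
// Clearing the bound collections lets the child views unload
// instead of being kept alive by the bindings.
Orders.Clear();
Alerts.Clear();
}
}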
RedGate Memory profiler:
This is a great tool, although it only shows you the main memory leaks, so you may think you have resolved all of them only to find that more leaks pop out of the woodwork. The Red Gate ANTS Memory Profiler is able to provide a graphical display of strong references between objects on the managed heap.
Labels:
Silverlight 4