20 July 2012

Customizing a Sharepoint 2010 Search Center

Almost all Sharepoint sites make use of the Sharepoint search capabilities. Sharepoint also provides OOB (out-of-the-box) site templates for setting up search centers. Today I would like to show how we customized the OOB Basic Search Center site to match the overall site's look and feel and meet the customer's needs. All the customization is done programmatically in a feature event receiver, so it can be applied automatically when deploying to the different environments.

We are creating the Search Center as a subsite of the main site. The OOB Basic Search Center has three pages: Default, Advanced Search and SearchResults. The customization consists of the following steps:

  • Create Search Center subsite
  • Apply custom master page to search center
  • Configure the Web Parts on the Advanced Search and Search Results pages
  • Create custom Metadata Properties, Search Scopes and Search Rules

Search Center Creation

The search center creation is done programmatically on the feature activated event:

Code Snippet

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPSite site = properties.Feature.Parent as SPSite;

    //create search center subsite
    using (SPWeb web = site.OpenWeb())
    {
        web.AllowUnsafeUpdates = true;
        //delete the search center if it already exists, so the feature can be reactivated
        using (SPWeb existingSearchCenter = site.AllWebs["Search"])
        {
            if (existingSearchCenter != null && existingSearchCenter.Exists)
            {
                existingSearchCenter.Delete();
            }
        }
        using (SPWeb searchCenter = site.AllWebs.Add("Search", "Search", "Basic Search Center", 1033, "SRCHCENTERLITE#0", false, false))
        {
            //customize search center
            //set search center in root web:
            web.AllProperties["SRCH_ENH_FTR_URL"] = searchCenter.ServerRelativeUrl;
            //set search dropdown mode
            web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD"; //do not show the scopes dropdown, and default to the target results page
            web.Update();
        }
        web.AllowUnsafeUpdates = false;
    }
}


In order to redirect search results to the Search Results page on the Search Center subsite, some settings need to be configured on the main site. Under Site Settings - Site Collection Settings - Search Settings, the Search Center is connected to the main site:



This is what the following lines in the feature receiver do:

//set search center in root web:
web.AllProperties["SRCH_ENH_FTR_URL"] = searchCenter.ServerRelativeUrl;
//set search dropdown mode
web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD"; //do not show the scopes dropdown, and default to the target results page


Search Master Page

The Search Center has some particularities that prevent it from using the same master page as the main site, so a custom master page needed to be provisioned. In order to provide a master page compatible with the page layouts of the OOB Basic Search Center, we followed the instructions provided in this article: Converting a Custom SharePoint 2010 Master Page into a Search Center Master Page.

Advanced Search Page customization

The requirements for the Advanced Search web part were to hide the language picker, display the scopes picker (with custom scopes like Articles, News, Audio and Video) and add some custom properties to the “Add property restrictions” filter.


This customization is performed by the following code:

Code Snippet
private void EditAdvancedSearchBoxWebPart(SPLimitedWebPartManager manager)
{
    System.Web.UI.WebControls.WebParts.WebPart wp = GetWebPart(manager, "Advanced Search Box");
    if (wp != null)
    {
        AdvancedSearchBox webpart = wp as AdvancedSearchBox;
        if (webpart != null)
        {
            webpart.ShowLanguageOptions = false;
            webpart.ShowScopes = true;
            webpart.Properties = GetFromResources("SearchCenter.WebPartsConfig.AdvancedSearchBoxProperties.xml");
            //persist the changes to the page
            manager.SaveChanges(webpart);
        }
    }
}


Even if the code is not complete, you can see what we are doing: editing the page, getting the Advanced Search Box web part by name and modifying its properties. We load the Properties from an XML file, which includes the custom properties we added for filtering.
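For reference, this is roughly the shape of that Properties XML, following the schema the OOB Advanced Search Box uses; the Keywords entry stands in for our custom properties, and the other sections (options and language definitions) are omitted here:

```xml
<root xmlns="urn:AdvancedSearchBox">
  <!-- Options, LangDefs and Languages sections omitted for brevity -->
  <PropertyDefs>
    <!-- our custom property, mapped to a managed property of the same name -->
    <PropertyDef Name="Keywords" DataType="text" DisplayName="Keywords" />
  </PropertyDefs>
  <ResultTypes>
    <ResultType DisplayName="All Results" Name="default">
      <KeywordQuery />
      <PropertyRef Name="Author" />
      <PropertyRef Name="Keywords" />
    </ResultType>
  </ResultTypes>
</root>
```

Each PropertyRef inside a ResultType is what makes the property show up in the “Add property restrictions” dropdown for that result type.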

Custom Managed Properties and Scopes

In order to add custom properties and scopes to the Advanced Search web part, we first need to create them. The metadata properties are mapped to crawled properties. This means that when Sharepoint crawls the site content and finds data in a particular list and field, it will create a crawled property for that field. For example, in the image below we can see the ows_keywords crawled property mapped to the Keywords managed property on the admin site.


The creation and mapping of the managed property is done programmatically thru the feature activation event receiver. The crawled property is created by Sharepoint during a crawl, and must exist before activating the feature.

Here is the code for creating a managed property:

Code Snippet
private static void CreateManagedProperty(Schema schema, string managedPropertyName, string crawledPropertyCategory, string crawledPropertyName)
{
    if (!schema.AllManagedProperties.Contains(managedPropertyName))
    {
        Category category = schema.AllCategories[crawledPropertyCategory];
        var crawledProps = category.QueryCrawledProperties(crawledPropertyName, 1, Guid.NewGuid(), String.Empty, true).Cast<CrawledProperty>();
        var crawledProp = crawledProps.FirstOrDefault();
        if (crawledProp != null)
        {
            ManagedDataType managedPropertyType = GetManagedPropertyType(crawledProp);
            ManagedProperty managedProperty = schema.AllManagedProperties.Create(managedPropertyName, managedPropertyType);
            var mappings = managedProperty.GetMappings();
            mappings.Add(new Mapping(crawledProp.Propset, crawledProp.Name, crawledProp.VariantType, managedProperty.PID));
            //persist the new mapping
            managedProperty.SetMappings(mappings);
            managedProperty.Update();
        }
    }
}
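As a usage sketch, matching the Keywords example above (the property and category names are illustrative, and note that SearchContext is deprecated in SP2010 but still works; the Schema can also be built from the SearchServiceApplication):

```csharp
//Sketch: map the ows_keywords crawled property (in the "SharePoint"
//crawled property category) to a "Keywords" managed property
Schema schema = new Schema(SearchContext.GetContext(site));
CreateManagedProperty(schema, "Keywords", "SharePoint", "ows_keywords");
```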


For the custom scopes, we mostly use Property Query rules, with the contentclass property for the restriction:


The example in the figure is the rule for the News scope. In this case the news items are stored in a list with a custom list definition. Below is the code of the list definition; note the Type number: 10314

Code Snippet
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- most attributes trimmed for brevity; the relevant one is Type -->
  <ListTemplate Name="NewsList"
                Type="10314"
                BaseType="0"
                DisplayName="News List"
                Description="My List Definition" />
</Elements>


That is the same number that we are setting the contentclass property to match: STS_ListItem_10314. This means that only items from that list (the news list) will match the scope, which enables restricting the search to the News scope in the Advanced Search web part. For that to happen, we also need to include this scope in the Advanced Search display group.
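The steps above (creating the scope, adding the contentclass rule and adding the scope to the Advanced Search display group) could be sketched roughly like this; the scope description, results page and the way the Scopes object is obtained are assumptions, not the project's exact code:

```csharp
//Sketch (names assumed): create the News scope with a contentclass
//property rule and include it in the Advanced Search display group
private void CreateNewsScope(SPSite site, Schema schema)
{
    var remoteScopes = new RemoteScopes(SPServiceContext.GetContext(site));
    Scope newsScope = remoteScopes.AllScopes.Create("News", "News items only",
        null, true, "results.aspx", ScopeCompilationType.AlwaysCompile);

    //match only items whose contentclass is STS_ListItem_10314 (the news list)
    newsScope.Rules.CreatePropertyQueryRule(ScopeRuleFilterBehavior.Include,
        schema.AllManagedProperties["contentclass"], "STS_ListItem_10314");

    //the scope must belong to the Advanced Search display group to show up
    //in the Advanced Search web part's scopes picker
    ScopeDisplayGroup displayGroup = remoteScopes
        .GetDisplayGroupsForSite(new Uri(site.Url))
        .Cast<ScopeDisplayGroup>()
        .FirstOrDefault(g => g.Name == "Advanced Search");
    if (displayGroup != null)
    {
        displayGroup.Add(newsScope);
        displayGroup.Update();
    }

    //scopes only return results once compiled
    remoteScopes.StartCompilation();
}
```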


The complete customization code is too long to explain in full here, but the idea was to show how we can make use of the existing Search Center site templates and perform different customizations depending on the project's requirements. With little effort we can profit from the OOB advanced search and search results pages that Sharepoint provides.

09 June 2012

Where is my Sharepoint 2010 Custom Timer Job running?

When building a custom timer job for Sharepoint 2010, special attention should be paid to where we need this job to run. In a farm environment, we can choose to run the job on all servers, on one particular server, only on the front-end servers, etc. Depending on our requirements the timer job implementation and installation approach will change, so we should decide where we want it to run in the first place.

All Sharepoint timer jobs ultimately inherit from the SPJobDefinition class. This class provides 3 constructors:

  • SPJobDefinition(): default constructor, needed for serialization purposes.
  • SPJobDefinition(String, SPService, SPServer, SPJobLockType): instantiates a timer job associated with the given SPService.
  • SPJobDefinition(String, SPWebApplication, SPServer, SPJobLockType): instantiates a timer job associated with the given SPWebApplication.

The first constructor is required for serialization and is for internal use only. One of the other two constructors will be invoked from our custom timer job constructor. The parameters passed to it will define where the timer job will run.
Here is a sample code of a custom timer job definition:
class MyTimerJob1 : SPJobDefinition
{
    public MyTimerJob1()
        : base()
    { }

    public MyTimerJob1(string name, SPService service, SPServer server, 
        SPJobLockType lockType) : base(name, service, server, lockType)
    { }

    public MyTimerJob1(string name, SPWebApplication webApplication, SPServer server, 
        SPJobLockType lockType) : base(name, webApplication, server, lockType)
    { }

    public override void Execute(Guid targetInstanceId)
    {
        //Execute Timer Job Tasks
    }
}
Besides the required default constructor, we need to provide at least one of the other two constructors. Depending on which constructor we use, the timer job definition can be associated either with a service or a web application. It can also be associated with a particular server in the farm and a lock type. So, the first thing to note is that for a particular server to be eligible to run the job, it must be provisioned with the service or web app associated with the job. Then, if a particular server is passed to the constructor, the job will run only on that server (if it has the associated service or web app, otherwise the job won't run at all). If no server is associated, it will run on one or many servers depending on the lock type.

The SPJobLockType enumeration can take one of the following values:
  • None: Provides no locks. The timer job runs on every machine in the farm on which the parent service is provisioned, unless the job is associated with a specified server, in which case it runs on only that server (and only if the parent service is provisioned on the server).
  • ContentDatabase: Locks the content database. A timer job runs one time for each content database associated with the Web application.
  • Job: Locks the timer job so that it runs only on one machine in the farm.
So, if we instantiate a timer job passing null as the associated server and None as the lock type, we expect it to run on every machine in the farm on which the parent service is provisioned. If we pass an SPService to the constructor, we know which service we are talking about, and thus on which servers it is provisioned. But if we pass an SPWebApplication to the constructor, on which servers will the job run? The answer is: on every web front-end server, that is, the servers where the Web Application service is running.

Remember that the different server roles that we can find in a Sharepoint farm are:
  • Database Server: the server that hosts the Microsoft SQL Server database for the farm. Since Sharepoint Foundation is not typically installed on this server, no jobs will run here.
  • Web Front End Server: server where the Microsoft SharePoint Foundation Web Application service is running on.
  • Application Server: Any other Sharepoint server.
Here are a couple of examples on where the jobs will run depending on the parameters passed to the constructor:

//Job associated with a web app, no server in particular and None lock:
//  will run on all front-end servers.
var jobRunningOnAllFrontEndServers = new MyTimerJob1("mytimerjob", 
    SPWebApplication.Lookup(webAppURI), null, SPJobLockType.None);

//Job associated with a web app, one front-end server and Job lock:
//  will run only on the frontEndServer1 server.
var jobRunningOnAParticularFrontEndServer = new MyTimerJob1("mytimerjob", 
    SPWebApplication.Lookup(webAppURI), frontEndServer1, SPJobLockType.Job);

//Job associated with a web app, an app server and Job lock:
//  it won't run on any server, since the server specified is NOT running
//  the Web Application service.
var jobRunningOnNoServer = new MyTimerJob1("mytimerjob", 
    SPWebApplication.Lookup(webAppURI), appServer1, SPJobLockType.Job);

//Job associated with the timer service, a particular app server and None lock:
//  will run on the appServer1 server only.
var jobRunningOnAppServer = new MyTimerJob1("mytimerjob", 
    SPFarm.Local.TimerService, appServer1, SPJobLockType.None);

Using Subclasses

There are some other classes in the Sharepoint object model that inherit from SPJobDefinition and can be used as base classes for our custom timer jobs. For example:
  • SPContentDatabaseJobDefinition: This job is executed by all WFE servers in the farm. Each content database is processed by only one job so that work is distributed across all the running jobs.
  • SPFirstAvailableServiceJobDefinition: An abstract base class for a timer job that will be run on the first available server where the specified service is provisioned.
  • SPServerJobDefinition: This job definition is executed on a specific server within the SharePoint farm.
  • SPServiceJobDefinition: A timer job that runs on every server in the farm where the service is provisioned.
So, for example, if you need a job to run on all servers (including the app servers) it would be better to derive directly from the SPServiceJobDefinition class and, if you need a job to run on one particular app server, to derive from SPServerJobDefinition.
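The latter case could be sketched like this; the class name and body are illustrative, not taken from a real project:

```csharp
//Sketch: a job derived from SPServerJobDefinition runs only on the
//server passed to its constructor, whatever role that server has
class MyAppServerJob : SPServerJobDefinition
{
    //default constructor required for serialization
    public MyAppServerJob() : base() { }

    public MyAppServerJob(string name, SPServer server)
        : base(name, server) { }

    public override void Execute(SPJobState jobState)
    {
        //tasks that must run on that particular server
    }
}
```

Notice there is no lock type or service/web app parameter here: the target server is the whole story, which is exactly why this base class is the simplest choice for server-pinned work.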

27 May 2012

Azure Flavor for the Sharepoint Media Component

Some time ago I wrote about a Media Processing Component for Sharepoint that I was working on. It is a Media Assets list for Sharepoint that lets you choose where to store the blob files. It also provides intelligence for encoding videos, generating thumbnail and poster images, obtaining media metadata, etc. In that first post the component was explained in detail, with the original three storage flavors: Sharepoint list, Virtual Directory or FTP. The storage manager is extensible, so a new flavor was added later that enabled storing the media files in the cloud, in particular on Amazon S3. Today I'm going to talk about a new cloud flavor based on Azure Blob Storage.

Windows Azure Blob storage is a service for storing large amounts of unstructured data that can be accessed from anywhere in the world via HTTP or HTTPS. It is especially suited for large amounts of unstructured text or binary data such as video, audio and images. You can find documentation here on how to create a Windows Azure Storage Account and access it programmatically.

For the Media Processing Component I just needed to create a class (AzureStorageManager) that implements the IAssetStorageManager interface. The interface has the methods to save and delete files from the external storage. The class also needs four configuration parameters to be passed to its constructor:

  • AccountName: Name of the account that will be used to authenticate to the Blob Storage
  • AccountKey: Key that will be used to authenticate to the Blob Storage
  • BlobEndpoint: URL of the Blob endpoint, like http://someaccount.blob.core.windows.net
  • Container: Name of a container to store your application’s media files.

For example, this is the code required to upload a file to an Azure Blob Storage:

Code Snippet AzureStorageManager class - Save method
public string Save(System.IO.FileInfo file)
{
    //Create service client for credentialed access to the Blob service.
    CloudBlobClient blobClient = new CloudBlobClient(blobEndpoint, new StorageCredentialsAccountAndKey(accountName, accountKey));
    //Get a reference to a container, which may or may not exist.
    CloudBlobContainer container = blobClient.GetContainerReference(containerAddress);
    //Create a new container, if it does not exist
    container.CreateIfNotExist();
    //Get a reference to a blob, which may or may not exist.
    CloudBlob blob = container.GetBlobReference(file.Name);
    //Upload content to the blob, which will create the blob if it does not already exist.
    blob.UploadFile(file.FullName);
    return blob.Uri.ToString();
}
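IAssetStorageManager also requires a delete operation. A minimal sketch of the matching Delete method, assuming the same private fields and the same Azure SDK 1.x client as the Save method above, could look like this:

```csharp
//Sketch (field names assumed): remove a previously uploaded blob,
//given the URL that Save returned
public void Delete(string fileUrl)
{
    CloudBlobClient blobClient = new CloudBlobClient(blobEndpoint,
        new StorageCredentialsAccountAndKey(accountName, accountKey));
    CloudBlobContainer container = blobClient.GetContainerReference(containerAddress);
    //the blob name is the last segment of the asset URL
    string blobName = fileUrl.Substring(fileUrl.LastIndexOf('/') + 1);
    CloudBlob blob = container.GetBlobReference(blobName);
    //DeleteIfExists avoids an exception if the blob is already gone
    blob.DeleteIfExists();
}
```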

You can download the complete code for the Media Component with all the storage options from github.

20 March 2012

Sharepoint Guidance Logger: usage, setup and extension


Log records are essential to any application for troubleshooting problems. When beginning a new Sharepoint project, one of the first needs is to have a core logging component that can be used throughout the application code. In this post I will talk about the logging solution that we are using (based on the Patterns & Practices Sharepoint Logger), how to set it up, configure and read logs.

SharePoint 2010 includes enhanced functionality for logging and tracing. You can now throttle reporting to the Windows Event Log and the Office Server Unified Logging Service (ULS) trace logs by area and by category. Areas represent broad regions of SharePoint functionality, such as Access Services, Business Connectivity Services, and Document Management Server. Within each area, categories define more specific aspects of functionality. In our solution we use a single Area for the whole project’s custom code, and different categories inside it for the different components, like Caching, UserProfile, News, Workflows, etc…

In addition to areas and categories, SharePoint 2010 makes substantial improvements to event correlation. Every trace log entry now includes a correlation ID that identifies all the events that correspond to a particular action, such as a user uploading a document. Administrators can use tools such as the ULS Viewer to filter the trace logs by correlation ID. The Sharepoint error page shows the user the date and correlation ID. It is important to capture this information when an error occurs, to make it easier for administrators to track down a particular issue.

The Framework

In the project I’m working on we are using a common framework for logging, which is based on the Patterns & Practices Sharepoint Logger, included in the SharePoint Guidance Library. The P&P logger provides support for writing to the Windows Event log and the ULS trace logs, and for creating your own areas and categories and using them when you write to the event log or the trace logs. Two external assemblies are needed for using it: Microsoft.Practices.ServiceLocation.dll and Microsoft.Practices.SharePoint.Common.dll.

Examples of how to call the logger thru code can be found on the SP logger documentation, but just as a sample, it is something like this:

P&P Logger usage
// Log an event with a message, an event ID, a severity level, and a category.
string area = "Custom Area";
string category = "Execution";
string areaCategory = string.Format("{0}/{1}", area, category);
logger.LogToOperations(msg, (int) EventLogEventId.MissingPartnerID, EventSeverity.Error, areaCategory);

Our logging component adds a small utility layer above the P&P logger to simplify these lines a little more. For example, as we have our custom area and categories predefined, we included them in an enumeration, and use the enums as parameters to the LogToOperations method. This way developers don’t need to type the area and category names every time, which can lead to a lot of errors. It also provides the option to add parameters to the message; the String.Format that combines the message with the parameters happens inside the utility class, which saves some repeated code for the developers as well. Regarding the log event IDs, they are also set automatically by the utility class, using a different number for each category, starting at 5000.

This leads to the logger utility having two overloads of the LogToOperations method, one for logging messages and the other for logging exceptions:

Custom logger utility

public class Logger
{
    private ILogger logger = SharePointServiceLocator.GetCurrent().GetInstance<ILogger>();

    public void LogToOperations(MyCustomCategories category, EventSeverity severity, string message, params object[] args)
    {
        try
        {
            logger.LogToOperations(String.Format(message, args), GetEventId(category),
                severity, category.ToLoggerString());
        }
        catch
        {
            //don't want the app to fail because of failures in logging
        }
    }

    public void LogToOperations(Exception ex, MyCustomCategories category, EventSeverity severity, string message, params object[] args)
    {
        try
        {
            logger.LogToOperations(ex, String.Format(message, args), GetEventId(category),
                severity, category.ToLoggerString());
        }
        catch
        {
            //don't want the app to fail because of failures in logging
        }
    }

    private int GetEventId(MyCustomCategories category)
    {
        return 5000 + (int)category;
    }
}

And this is a simple example of usage:

Usage example

static void Main(string[] args)
{
    string user = "Peter John";
    int id = 3;

    Logger logger = new Logger();
    logger.LogToOperations(MyCustomCategories.Test, Microsoft.SharePoint.Administration.EventSeverity.Verbose,
        "Testing logging with user '{0}' and id '{1}'", user, id);

    try
    {
        string firstArg = args[0];
    }
    catch (Exception ex)
    {
        logger.LogToOperations(ex, MyCustomCategories.Test, Microsoft.SharePoint.Administration.EventSeverity.Error,
            "Error occurred in test program for user '{0}' and id '{1}'", user, id);
    }
}

Setup and Configuration

In order for all this to work, we first need to create our custom diagnostic areas and categories. We are doing it inside a feature event receiver. The code adds the custom area and then iterates thru the categories enum to add all the associated categories. The advantage is that when a developer needs to add a category, he just adds the value to the enum and the feature will automatically add it to Sharepoint. At the moment of creating the categories, the severity level must be set. We are setting all of them to Verbose, which is the most useful at the development stage, but this should be changed after deployment, especially in production. A level of Warning is recommended to save disk space and increase the general performance of the application.

Installer helper called by the feature event receiver

class LoggingInstallerHelper
{
    public static void AddCustomAreaAndCategories()
    {
        DiagnosticsArea customArea = new DiagnosticsArea(Areas.MyCustomArea.ToString());
        var categories = Enum.GetNames(typeof(MyCustomCategories));
        foreach (var category in categories)
        {
            customArea.DiagnosticsCategories.Add(new DiagnosticsCategory(category, EventSeverity.Verbose, TraceSeverity.Verbose));
        }
        AddArea(customArea);
    }

    public static void RemoveCustomAreaAndCategories()
    {
        RemoveArea(Areas.MyCustomArea.ToString());
    }

    #region Private Methods

    private static void AddArea(DiagnosticsArea newArea)
    {
        DiagnosticsAreaCollection areas = GetCurrentAreas();
        var existingArea = areas.FirstOrDefault(area => area.Name == newArea.Name);
        if (existingArea == null)
        {
            areas.Add(newArea);
        }
        else
        {
            //the area already exists: add only the missing categories
            int index = areas.IndexOf(existingArea);
            foreach (DiagnosticsCategory item in newArea.DiagnosticsCategories)
            {
                var existingCateg = areas[index].DiagnosticsCategories.FirstOrDefault(categ => categ.Name == item.Name);
                if (existingCateg == null)
                {
                    areas[index].DiagnosticsCategories.Add(item);
                }
            }
        }
        areas.SaveConfiguration();
    }

    private static DiagnosticsAreaCollection GetCurrentAreas()
    {
        IServiceLocator serviceLocator = SharePointServiceLocator.GetCurrent();
        IConfigManager config = serviceLocator.GetInstance<IConfigManager>();
        DiagnosticsAreaCollection areaCollection = new DiagnosticsAreaCollection(config);
        return areaCollection;
    }

    private static void RemoveArea(string areaName)
    {
        DiagnosticsAreaCollection areas = GetCurrentAreas();
        if (ExistArea(areas, areaName))
        {
            while (areas[areaName].DiagnosticsCategories.Count != 0)
            {
                areas[areaName].DiagnosticsCategories.RemoveAt(0);
            }
            areas.RemoveAt(areas.IndexOf(areas[areaName]));
        }
        areas.SaveConfiguration();
    }

    private static bool ExistArea(DiagnosticsAreaCollection collection, string areaName)
    {
        foreach (DiagnosticsArea item in collection)
        {
            if (item.Name.Trim().ToUpper() == areaName.Trim().ToUpper())
                return true;
        }
        return false;
    }

    #endregion
}

To verify the proper installation of the custom area and categories, and to modify their severity level, you can go to the Admin Site -> Monitoring -> (Reporting) Configure diagnostic logging. In the Event Throttling section, select the desired area or particular categories and change the event and trace levels, then click OK.


You can also configure the logs folder path (the default is c:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS\), the number of days to store log files and the maximum disk space they may use, on this page under the Trace Logs section.

Viewing logs

You can see the log messages on the Event Viewer and trace logs using the ULS Viewer.

In the Event Viewer you will find the custom events under Windows Logs/ Application. You can filter by Source = MyCustomArea, by severity and/or Event ID.


You can also find trace logs in the ULS log files, and use a tool, like the ULS Viewer for a friendlier interface and filtering capabilities. The ULS Viewer is very useful for filtering by correlation id.


These are the basic tools. There are also tools for viewing log information remotely, or for aggregating log entries in a farm environment.

17 March 2012

Sharepoint Expiration Policy not working

I have a Sharepoint 2010 list with a custom content type and an associated retention policy. The policy consists of a custom expiration formula followed by the Send to Recycle Bin action. However, I realized that the items were not being deleted. I verified the list settings and the retention policy was configured:


I ran the Expiration Policy job several times, but no items were deleted and no errors were found in the logs. I also added logging to the custom formula, but no log entries appeared either. Finally, I found out one thing that I didn’t know: there is another job that needs to run before the Expiration Policy job, the Information Management Policy job.

Both are Sharepoint OOB jobs:

  • Expiration Policy: this job processes items that are due for a retention action, such as deleting items past their expiration date.
  • Information Management Policy: this job performs background processing for information policies, such as calculating updated expiration dates for items with a new retention policy.

So, as in this case I had a new retention policy, I needed to run this job first. When I did, I got an error in the logs:

Error processing policy updates for site http://mysite.com for list MyList.

Error: Object reference not set to an instance of an object.

Source: Document Management Server

Event ID: 7997

Task Category: Information Policy Management

The error didn’t tell me much, but after digging a little more I realized that the problem was that the custom expiration formula was not installed, so the Information Management Policy job couldn’t calculate the expiration date.

22 January 2012

Media from Sharepoint to the Cloud

In my last post I talked about a media processing component developed for Sharepoint 2010. Besides the processing features (video encoding, thumbnail generation, validations, etc.) there was an asset storage manager that enabled us to store the files in a configurable place, inside or outside Sharepoint. I said we initially started with three storage flavors: a Sharepoint library, an FTP server or a shared folder/virtual directory.

My co-worker Dwight Goins has extended the options by adding a cloud-based storage manager. In this case he uses the Amazon Simple Storage Service (Amazon S3), which resulted in a great-performing storage solution. In his post, he digs deeper into the issues of storing BLOBs (Binary Large Objects) in Sharepoint and the solution that Sharepoint provides, SQL’s Remote BLOB Storage (RBS). Then he talks about our solution and the details of the Amazon S3 client. Check it out!

12 January 2012

Media Processing Component for Sharepoint

I have recently been working on building a media processing component for the Sharepoint project I am on. The requirements for the component are more or less the following:

  • The Site Content Creators must be able to upload media assets (images, audio and video) to a Sharepoint list, and these assets must accomplish certain validation rules.
  • The final destination where the assets are saved after upload must be configurable and extensible. To begin with, we are supporting saving to a Sharepoint library, a network share or an FTP server.
  • The videos must be encoded to MP4 format, and thumbnail and poster images must be generated. The encoding process must be run asynchronously and the user must be notified by email when it is finished.
Today I want to share the design of the component and the key pieces of code. I will focus on the video upload process, which is the most complex one because of the encoding. Audio and image uploads are more straightforward.
The main parts that build the solution are:
  1. Custom Upload Process: This is the front end of the solution. It consists of a custom list with a custom upload form. The list has the link to the media file and more metadata fields (title, author, date, keywords, etc). When you click on create a new item on the list the custom upload form is opened and you can browse for a file to upload. The form has the required validation logic and it serves to save the assets to the configured location, which can be a Sharepoint library or an external location, like File System or FTP server. When the upload finishes you are redirected to the list item edit form so you can enter the metadata. The experience is similar to uploading a file to a Sharepoint document library.
  2. Media Processing Backend Process: This consists of a timer job that queries the Media Assets list for items to process. It encodes the videos, generates thumbnail and poster images and uploads everything to the final destination. Finally, it notifies the user of the result of the process by email. For the video encoding we used the Microsoft Expression Encoder SDK. As I will explain later, this SDK cannot be used inside a Sharepoint process, so it runs in a separate process that is invoked from the timer job.
  3. Storage Manager: this is a flexible and extensible component that abstracts the logic of saving (and deleting) a file to the final location depending on the flavor chosen thru configuration (File System, Sharepoint library or FTP). This component is used both by the front end upload mechanism and the back end media processing job.
Here is a diagram of the overall design for the video processing:
Now I will explain in a little more detail each component:

1. Custom Upload Process

The Media Assets List
This is a Sharepoint list that stores the metadata of the media assets, but not the assets themselves (the assets are stored in the final storage, which can be a Sharepoint assets library, a network shared folder or an FTP server). The list is based on three custom content types, WebVideo, WebAudio and WebImage, all three inheriting from a base MediaAsset content type. This content type has the required fields for saving the asset metadata, the most important ones for processing being:
  • Location: the URL of the asset in its final location (in the example in the picture it is a Sharepoint library called MediaAssetsLib).
  • Temp Location: as videos need asynchronous processing, they are saved to a temporary location on upload. The timer job uploads them to the final location after encoding. The temp location is a shared folder on the network.
  • Processing Status: Success for assets successfully uploaded to the final storage, Pending for assets awaiting encoding in the back-end process, and Error in case encoding fails.
The list has an event receiver attached in order to delete the assets from the final destination or temporary folder when items are deleted from the list.
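That event receiver could look roughly like this; the receiver class, the "Location" field name and the storage-manager factory are assumptions for illustration, not the project's exact code:

```csharp
//Sketch: delete the stored media file when its list item is deleted
public class MediaAssetsEventReceiver : SPItemEventReceiver
{
    public override void ItemDeleting(SPItemEventProperties properties)
    {
        base.ItemDeleting(properties);
        //"Location" holds the URL of the asset in its final storage
        string location = properties.ListItem["Location"] as string;
        if (!String.IsNullOrEmpty(location))
        {
            //hypothetical factory that reads the configured storage flavor
            //(Sharepoint library, file system, FTP, cloud) and returns the
            //matching IAssetStorageManager implementation
            IAssetStorageManager storage = AssetStorageFactory.GetStorageManager();
            storage.Delete(location);
        }
    }
}
```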
To achieve the storage flexibility, a custom upload form was developed and hooked to the MediaAssets list. When you click on the “Add new item” link in the picture above, the custom upload form is launched. The form is attached to the base content type definition in the Elements file like this:
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Parent ContentType: Item (0x01) -->
  <ContentType ID="0x01004e4f21afc14c487892253cb129dd5001"
               Name="MyMediaAsset" Group="MyContent Types"
               Description="My Media Asset" Inherits="TRUE">
    <FieldRefs></FieldRefs>
    <XmlDocuments>
      <XmlDocument NamespaceURI="http://schemas.microsoft.com/sharepoint/v3/contenttype/forms/url">
        <FormUrls xmlns="http://schemas.microsoft.com/sharepoint/v3/contenttype/forms/url">
          <!-- the custom upload form in the Layouts folder -->
          <New>_layouts/Upload.aspx</New>
        </FormUrls>
      </XmlDocument>
    </XmlDocuments>
  </ContentType>
</Elements>
The Upload Form
The form was created as an Application Page (Upload.aspx) in the Layouts folder. It contains the browse control to upload the file.


A useful tip here is how to achieve the same look and feel as the Sharepoint OOB forms. The InputFormSection, InputFormControl and ButtonSection controls were used for that purpose.

In order to use these controls, you need to register the namespace on the top of the page:

<%@ Register TagPrefix="wssuc" TagName="ButtonSection" Src="/_controltemplates/ButtonSection.ascx" %>
<%@ Register TagPrefix="wssuc" TagName="InputFormSection" Src="/_controltemplates/InputFormSection.ascx" %>
<%@ Register TagPrefix="wssuc" TagName="InputFormControl" Src="/_controltemplates/InputFormControl.ascx" %>
And then include them in the page like this:

<wssuc:InputFormSection ID="InputFormSection1" runat="server"
      Title="Upload Document" Description="Browse to the media asset you intend to upload." >
    <wssuc:InputFormControl runat="server" LabelText="" >
        <div class="ms-authoringcontrols">
          Name: <br />
          <input id="FileToUpload" class="ms-fileinput" size="35" type="file" runat="server" />
          <asp:RequiredFieldValidator id="RequiredFieldValidator" runat="server" ErrorMessage="You must specify a value for the required field." ControlToValidate="FileToUpload"></asp:RequiredFieldValidator>
          <br />
          <asp:RegularExpressionValidator id="FileExtensionValidator" runat="server" ErrorMessage="Invalid file name." ControlToValidate="FileToUpload"></asp:RegularExpressionValidator>
        </div>
    </wssuc:InputFormControl>
</wssuc:InputFormSection>

You can read more about how to use these controls here.

So, what happens in the code behind?

On the OK button submit handler, the file to upload is processed. The logic is different depending on the asset type. Images are copied to the final destination by using the Storage Manager. A thumbnail is also generated and uploaded for them. Duration is calculated for audio and video files (the Microsoft.WindowsAPICodePack API is used for that). Audio files are also copied to the final destination. Videos, instead, are left in temporary storage (a network shared folder), because they need to be processed later by the timer job.

In all three cases, a list item is created with the asset metadata and inserted into the MediaAssets list. Then the user is redirected to the list item's Edit form, so they can fill in the rest of the metadata.


Since the upload process may take a long time, all of this happens in the context of an SPLongOperation. This is a Sharepoint class provided to display the “Processing…” dialog with the rotating gear image.

Here is part of the code:

protected void btnOk_Click(object sender, EventArgs e)
{
    //FileToUpload is the HtmlInputFile control
    if (FileToUpload.PostedFile == null || String.IsNullOrEmpty(FileToUpload.PostedFile.FileName))
        return;

    var originFileInfo = new FileInfo(FileToUpload.PostedFile.FileName);
    SPWeb web = SPContext.Current.Web;
    try
    {
        //create a MediaAsset object to save all asset metadata
        MediaAsset asset = MediaAsset.FromFile(originFileInfo, web.Url, mediaConfig);

        //start long operation to show the user the "Processing..." message
        using (SPLongOperation longOperation = new SPLongOperation(this.Page))
        {
            longOperation.Begin();

            string newFileUniqueName = String.Concat(Guid.NewGuid().ToString(), originFileInfo.Extension);

            //save to file system. Need to elevate privileges for that
            var tempFileInfo = new FileInfo(Path.Combine(mediaConfig.TempLocationFolder, newFileUniqueName));
            FileToUpload.PostedFile.SaveAs(tempFileInfo.FullName);

            asset.TempLocation = tempFileInfo.FullName;
            asset.Duration = MediaLengthCalculator.GetMediaLength(tempFileInfo);

            var list = web.Lists[mediaConfig.MediaAssetsListName];
            int id;
            string contentTypeId;
            //insert new item in the MediaAssets list
            mediaRepository.Insert(list, asset, out id, out contentTypeId);

            //build url of Edit Form to redirect to
            string url = String.Format("{0}?ID={1}&ContentTypeId={2}", list.DefaultEditFormUrl, id, contentTypeId);
            //long operation ends, redirecting to the Edit Form of the new list item
            longOperation.End(url);
        }
    }
    catch (ThreadAbortException) { /* Thrown when redirected */ }
    catch (Exception ex)
    {
        logger.LogToOperations(ex, Categories.Media, EventSeverity.Error,
            "Error uploading file to MediaAssets list. FileName: '{0}'.", originFileInfo.Name);
    }
}


Ok, now let’s see how the asynchronous processing part works.

2. Media Processing Backend Process

The backend process consists of a Sharepoint Timer Job that orchestrates the video processing and a console application that performs the actual encoding and generates the images. The console application is invoked by the timer job.

Encoder Console Application
The tool chosen for encoding videos was Microsoft Expression Encoder 4. We used the Pro version (paid), which includes support for H.264 (meaning it can encode videos to the mp4 format, as required by our client).

The encoder comes with an SDK, so you can encode your videos programmatically. The catch is that this API depends on .Net Framework 4.0, and it is also 32-bit only. This is incompatible with a Sharepoint process (either web or timer job), since Sharepoint relies on .Net 3.5 and runs in 64 bits. Hence the need to build a separate process outside of the Sharepoint Timer Job. A console application was a simple solution, and it could be configured to target .Net Framework 4.0 and the x86 platform.
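For reference, targeting .Net 4.0 and x86 comes down to two project settings; this is a sketch of the relevant .csproj fragment:

<!-- relevant settings in the encoder console app's .csproj -->
<PropertyGroup>
  <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
  <PlatformTarget>x86</PlatformTarget>
</PropertyGroup>

The same settings can be changed from Visual Studio in the project's Application and Build property pages.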

This application expects four input parameters: the path to the original video, and the desired paths for the thumbnail image, the poster image and the encoded video to generate.
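The entry point of the console application is not shown in the post; a minimal sketch, assuming the encoding methods shown below live in a class I'll call VideoEncoder (the class name and the image sizes are assumptions), could look like this:

//hypothetical sketch: parse the four expected arguments and run the encoding
static int Main(string[] args)
{
    if (args.Length != 4)
    {
        Console.Error.WriteLine("Usage: Encoder.exe <inVideo> <outThumbnail> <outPoster> <outVideo>");
        return 1;
    }
    try
    {
        var encoder = new VideoEncoder(); //assumed class exposing EncodeVideo and GenerateVideoImage
        var inputFile = new FileInfo(args[0]);
        encoder.GenerateVideoImage(inputFile, args[1], 120, 90);  //thumbnail
        encoder.GenerateVideoImage(inputFile, args[2], 640, 480); //poster
        encoder.EncodeVideo(inputFile, new FileInfo(args[3]));
        return 0;
    }
    catch (Exception ex)
    {
        //the timer job reads standard error when the exit code is non-zero
        Console.Error.WriteLine(ex.Message);
        return 1;
    }
}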

The Encoder SDK provides full flexibility for setting the encoding parameters (like format, size, bitrate, etc). It also provides a set of presets that let you implement the encoding very easily. For example, here is the code for encoding using the H264VimeoSD preset:

public void EncodeVideo(FileInfo inputFile, FileInfo outputFile)
{
    Microsoft.Expression.Encoder.MediaItem mediaItem = new MediaItem(inputFile.FullName);
    int bitrate = GetBitrate(mediaItem);

    using (Microsoft.Expression.Encoder.Job job = new Job())
    {
        job.OutputDirectory = outputFile.Directory.FullName;
        job.CreateSubfolder = false;
        //H264VimeoSD preset settings: Output Format: MP4. Container: MP4. Video Codec: H.264 - Main. 
        //Video size: 640, 480. Video Bitrate: 2500 Kbps. Video Encoding: CBR SinglePass. 
        //Audio Codec: AAC. Audio Channels: Stereo. Audio Bitrate: 128 Kbps. Audio Encoding: CBR Single Pass
        mediaItem.ApplyPreset(Presets.H264VimeoSD);
        job.MediaItems.Add(mediaItem);
        job.Encode();
    }
}

And here is the code for generating the thumbnail or poster images:

public void GenerateVideoImage(FileInfo mediaFile, string imageFilePath, int width, int height)
{
    var video = new MediaItem(mediaFile.FullName);
    //take a snapshot 5 seconds into the video
    using (var bitmap = video.MainMediaFile.GetThumbnail(
        new TimeSpan(0, 0, 5),
        new System.Drawing.Size(width, height)))
    {
        bitmap.Save(imageFilePath, ImageFormat.Jpeg);
    }
}
Media Processing Timer Job
Sharepoint supports asynchronous processing of data through Timer Jobs. These jobs run within the context of a Windows service, and are easily managed and deployed using the same tools as any other Sharepoint solution.

As the requirement was to run the job on only one of the application servers, it inherits from SPServerJobDefinition. Here is the Timer Job code:

public class MediaProcessingTimerJob : SPServerJobDefinition
{
    private Logger logger = new Logger();

    public MediaProcessingTimerJob() : base() { }

    public MediaProcessingTimerJob(string name, SPServer server) : base(name, server)
    {
        this.Title = "MediaProcessingTimerJob";
    }

    public override void Execute(SPJobState jobState)
    {
        string webUrl = String.Empty;
        try
        {
            webUrl = this.Properties["webUrl"].ToString();
            var mediaProcessor = new MediaProcessor(webUrl, MediaConfig.FromConfigRepository(webUrl));
            mediaProcessor.ProcessMedia();
        }
        catch (Exception ex)
        {
            logger.LogToOperations(ex, TRSCategories.Media, EventSeverity.Error,
                "Error executing MediaProcessingTimerJob in web '{0}'", webUrl);
        }
    }
}

During the deployment process, the MediaProcessingTimerJob is installed on the required server. The URL of the website whose assets it processes is passed through the job properties.

Here is part of the code of the helper tool that installs the job and sets it to run every 15 minutes:

private static void CreateMediaJob(string webUrl, SPServer server)
{
    var job = new MediaProcessingTimerJob("my-job-media-processing", server);
    job.Properties.Add("webUrl", webUrl);

    //schedule the job to run every 15 minutes
    var schedule = new SPMinuteSchedule();
    schedule.BeginSecond = 0;
    schedule.EndSecond = 59;
    schedule.Interval = 15;
    job.Schedule = schedule;
    job.Update(); //persist the job definition
}

The logic of the timer job resides in the MediaProcessor::ProcessMedia method. It essentially queries the Media Assets list for assets in “Pending” status; for each of these items, it invokes the encoder process and then uploads the resulting mp4 video and the generated thumbnail and poster images to the final destination. Finally, it notifies the user of the result by email.
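The query itself is not shown in the post; a minimal sketch of how the pending items could be retrieved with a CAML query (the internal field name "ProcessingStatus" and the surrounding variable names are assumptions):

//hypothetical sketch: fetch the assets waiting for encoding
var query = new SPQuery
{
    Query = "<Where><Eq><FieldRef Name='ProcessingStatus'/>" +
            "<Value Type='Text'>Pending</Value></Eq></Where>"
};
SPListItemCollection pendingItems = web.Lists[config.MediaAssetsListName].GetItems(query);
foreach (SPListItem item in pendingItems)
{
    //invoke the encoder process, upload the results, update the status, notify the user...
}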

This is the code that the job uses to call the console application. Since the job tells the console application the output parameters (the paths to the image and encoded video files), it doesn’t need to read any output from the console application. It only needs to read the standard error in case the console application fails.

private void ExecuteMediaProcess(string inVideoPath, string outThumbnailPath, string outPosterPath, string outVideoPath)
{
    string args = String.Format("\"{0}\" \"{1}\" \"{2}\" \"{3}\"", inVideoPath, outThumbnailPath, outPosterPath, outVideoPath);

    ProcessStartInfo startInfo = new ProcessStartInfo(config.EncoderExePath);
    startInfo.Arguments = args;
    startInfo.CreateNoWindow = true;
    startInfo.UseShellExecute = false;
    startInfo.RedirectStandardError = true;
    startInfo.RedirectStandardOutput = true;

    Process process = new Process();
    process.StartInfo = startInfo;
    process.Start();
    string error = process.StandardError.ReadToEnd();
    process.WaitForExit();

    if (process.ExitCode != 0)
    {
        //the application failed, get the error message from standard error
        throw new MediaProcessingException(String.Format(
            "Video encoder process returned with exit code '{0}'. Error was: '{1}'",
            process.ExitCode, error));
    }
}

3. Storage Manager

The Storage Manager is the piece of code used by both the upload media form and the backend process. It is simply a file manager that abstracts away the actual destination of the files, which is configurable. As I said, we started by supporting saving assets to the file system, to a Sharepoint library or to an FTP server, but this can be extended further to support other places, like storage in the Cloud.

The need for flexibility in where the files are stored may come from bandwidth or space limitations, from a need to share assets with other applications, or from a need to manage a centralized file store.

Anyway, the interface is very simple:

public interface IAssetStorageManager
{
    void Delete(string fileUrl);
    string Save(System.IO.FileInfo file);
    string Save(string fileName, System.IO.Stream fileStream);
}

There is a factory that gives you the particular storage manager implementation depending on configuration (all configuration is saved in a Sharepoint list, and the Sharepoint Config Store is used to retrieve it). Here is part of the factory code:

public class AssetStorageFactory
{
    //…
    static public IAssetStorageManager GetStorageManager(string configCategory, string webUrl)
    {
        var configHelper = new ConfigHelper(webUrl);
        string storageMethod = configHelper.GetValue(configCategory, StorageMethodConfigKey);
        if ("SPLibrary".Equals(storageMethod, StringComparison.InvariantCultureIgnoreCase))
            return new SPLibraryAssetStorageManager(webUrl, mediaLibraryName);
        else if ("FileSystem".Equals(storageMethod, StringComparison.InvariantCultureIgnoreCase))
            return new FileSystemAssetStorageManager(storageFolderPath, storageBaseAddress);
        else if ("FTP".Equals(storageMethod, StringComparison.InvariantCultureIgnoreCase))
            return new FTPAssetStorageManager(ftpServerUrl, ftpServerPullAdress, ftpServerUsername, ftpServerPassword);

        throw new ArgumentException(String.Format(
            "Incorrect configuration Value '{0}' in ConfigStore for category '{1}' and key '{2}'. Supported options are: '{3}'",
            storageMethod, configCategory, StorageMethodConfigKey, "FileSystem|FTP|SPLibrary"));
    }
}

The implementation for a particular flavor is simple. For example, this is how the FTPAssetStorageManager saves a file stream:

public string Save(string fileName, System.IO.Stream fileStream)
{
    string fileUrl = ftpServerUrl + fileName;
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(fileUrl);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential(username, password);
    using (Stream ftpStream = request.GetRequestStream())
    {
        FileUtils.CopyStream(fileStream, ftpStream);
    }
    return pullBaseAddress + fileName;
}

And this is how the storage manager is invoked from the Upload form or the Timer Job:

//save to final location
IAssetStorageManager storage = AssetStorageFactory.GetStorageManager("Media", web.Url);
asset.Location = storage.Save(newFileUniqueName, fileInputStream);


Having covered the most important parts of the Media Processing Component for Sharepoint, I think I’m done here. There is too much code to show all of it in one post, but I’ve chosen the most important parts. I might dig deeper in some other post. I will probably write about how to display the videos in a web page, too.

The interesting part is that even if this whole component doesn’t apply to another project, maybe some of its pieces can be reused, like the video encoding, or the custom upload form and storage manager for saving files outside Sharepoint. So I hope it proves useful to someone else!