20 July 2012

Customizing a Sharepoint 2010 Search Center

Almost all Sharepoint sites make use of Sharepoint's search capabilities, and Sharepoint also provides OOB (out-of-the-box) site templates for setting up search centers. Today I would like to show how we customized the OOB Basic Search Center site to match the overall site's look and feel and meet the customer's needs. All the customization is done programmatically in a feature event receiver, so it can be applied automatically when deploying to the different environments.

We are creating the Search Center as a subsite of the main site. The OOB Basic Search Center has three pages: Default, Advanced Search and SearchResults. The customization steps consist of:

  • Create Search Center subsite
  • Apply custom master page to search center
  • Configure the Web Parts on the Advanced and Search Results Page
  • Create custom Metadata Properties, Search Scopes and Search Rules

Search Center Creation

The search center creation is done programmatically on the feature activated event:

Code Snippet

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPSite site = properties.Feature.Parent as SPSite;

    //create search center subsite
    using (SPWeb web = site.OpenWeb())
    {
        try
        {
            web.AllowUnsafeUpdates = true;
            using (SPWeb existingSearchCenter = site.AllWebs["Search"])
            {
                if (existingSearchCenter != null && existingSearchCenter.Exists)
                {
                    existingSearchCenter.Delete();
                }
            }
            using (SPWeb searchCenter = site.AllWebs.Add("Search", "Search", "Basic Search Center", 1033, "SRCHCENTERLITE#0", false, false))
            {
                //customize search center
                cust.CustomizeEnterpriseSearchCenter(searchCenter);
                searchCenter.Update();
                //set search center in root web:
                web.AllProperties["SRCH_ENH_FTR_URL"] = searchCenter.ServerRelativeUrl;
                //set search dropdown mode: do not show the scopes dropdown, and default to the target results page
                web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD";
                web.Update();
            }
        }
        finally
        {
            web.AllowUnsafeUpdates = false;
        }
    }
}

 

In order to redirect search results to the Search Results page on the Search Center subsite, some settings need to be configured on the main site. Under Site Settings -> Site Collection Administration -> Search Settings, the Search Center is connected to the main site:

 

[Figure: Search Settings page connecting the Search Center to the main site]

This is what the following lines in the feature receiver do:

//set search center in root web:
web.AllProperties["SRCH_ENH_FTR_URL"] = searchCenter.ServerRelativeUrl;
//set search dropdown mode: do not show the scopes dropdown, and default to the target results page
web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD";

 

Search Master Page

The Search Center has some particularities that prevent it from using the same master page as the main site, so a custom master page needed to be provisioned. In order to provide a master page compatible with the page layouts of the OOB Basic Search Center, we followed the instructions provided in this article: Converting a Custom SharePoint 2010 Master Page into a Search Center Master Page.
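
Applying the master page programmatically can then be as simple as the following sketch (assuming the custom master page, here called CustomSearch.master, was already provisioned to the site collection's master page gallery; the file name is hypothetical):

Code Snippet

private void ApplySearchMasterPage(SPWeb searchCenter)
{
    string masterUrl = SPUrlUtility.CombineUrl(searchCenter.Site.ServerRelativeUrl, "/_catalogs/masterpage/CustomSearch.master");
    //use the custom master page for both site pages and the system master page
    searchCenter.MasterUrl = masterUrl;
    searchCenter.CustomMasterUrl = masterUrl;
    searchCenter.Update();
}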

Advanced Search Page customization

The requirements for the advanced search web part were to hide the language picker, display the scopes picker (with custom scopes like Articles, News, Audio, Video, etc.) and add some custom properties to the “Add property restrictions” filter.

[Figure: Customized Advanced Search Box web part]

This customization is performed by the following code:

Code Snippet
private void EditAdvancedSearchBoxWebPart(SPLimitedWebPartManager manager)
{
    System.Web.UI.WebControls.WebParts.WebPart wp = GetWebPart(manager, "Advanced Search Box");
    if (wp != null)
    {
        AdvancedSearchBox webpart = wp as AdvancedSearchBox;
        if (webpart != null)
        {
            webpart.ShowLanguageOptions = false;
            webpart.ShowScopes = true;
            webpart.Properties = GetFromResources("SearchCenter.WebPartsConfig.AdvancedSearchBoxProperties.xml");
            manager.SaveChanges(webpart);
        }
    }
}

 

Although the code is not complete, you can see what we are doing: editing the page, getting the Advanced Search Box web part by name and modifying its properties. We are loading the Properties value from an XML file, which includes the custom properties we added for filtering.
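
The GetWebPart and GetFromResources helpers are not included in the snippet either. They might look like this (a sketch; the implementations are assumptions consistent with how they are used above):

Code Snippet

private System.Web.UI.WebControls.WebParts.WebPart GetWebPart(SPLimitedWebPartManager manager, string title)
{
    //find the first web part on the page whose title matches
    return manager.WebParts.Cast<System.Web.UI.WebControls.WebParts.WebPart>()
        .FirstOrDefault(wp => wp.Title == title);
}

private string GetFromResources(string resourceName)
{
    //read an embedded resource file from the current assembly as a string
    var assembly = System.Reflection.Assembly.GetExecutingAssembly();
    using (var stream = assembly.GetManifestResourceStream(resourceName))
    using (var reader = new System.IO.StreamReader(stream))
    {
        return reader.ReadToEnd();
    }
}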

Custom Managed Properties and Scopes

In order to add custom properties and scopes to the Advanced Search web part, we first need to create them. The metadata properties are mapped to crawled properties. This means that when Sharepoint crawls the site content and finds data in a particular list and field, it will create a crawled property for that field. For example, in the image below we can see the ows_keywords crawled property mapped to the Keywords managed property on the Admin site.

[Figure: ows_keywords crawled property mapped to the Keywords managed property]

The creation and mapping of the managed property is done programmatically through the feature activation event receiver. The crawled property is created by Sharepoint during a crawl, and must exist before activating the feature.

Here is the code for creating a managed property:

Code Snippet
private static void CreateManagedProperty(Schema schema, string managedPropertyName, string crawledPropertyCategory, string crawledPropertyName)
{
    if (!schema.AllManagedProperties.Contains(managedPropertyName))
    {
        Category category = schema.AllCategories[crawledPropertyCategory];
        var crawledProps = category.QueryCrawledProperties(crawledPropertyName, 1, Guid.NewGuid(), String.Empty, true).Cast<CrawledProperty>();
        var crawledProp = crawledProps.FirstOrDefault();
        if (crawledProp != null)
        {
            ManagedDataType managedPropertyType = GetManagedPropertyType(crawledProp);

            ManagedProperty managedProperty = schema.AllManagedProperties.Create(managedPropertyName, managedPropertyType);
            var mappings = managedProperty.GetMappings();
            mappings.Add(new Mapping(crawledProp.Propset, crawledProp.Name, crawledProp.VariantType, managedProperty.PID));
            managedProperty.SetMappings(mappings);
            managedProperty.Update();
        }
    }
}
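
The GetManagedPropertyType helper maps the crawled property's variant type to a managed property type. A minimal sketch, covering only the common variant types (the exact mapping is an assumption):

Code Snippet

private static ManagedDataType GetManagedPropertyType(CrawledProperty crawledProp)
{
    switch (crawledProp.VariantType)
    {
        case 3:   //VT_I4
        case 20:  //VT_I8
            return ManagedDataType.Integer;
        case 5:   //VT_R8
            return ManagedDataType.Decimal;
        case 11:  //VT_BOOL
            return ManagedDataType.YesNo;
        case 64:  //VT_FILETIME
            return ManagedDataType.DateTime;
        default:  //treat everything else as text
            return ManagedDataType.Text;
    }
}

//The Schema instance passed to CreateManagedProperty can be obtained from the
//search context of the site, for example:
//  Schema schema = new Schema(SearchContext.GetContext(site));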

 

For the custom scopes, we are mostly using Property Query rules, with the contentclass managed property as the restriction:

[Figure: Property Query rule on contentclass for the News scope]

The figure above shows the rule for the News scope. In this case the news items are stored in a list with a custom list definition. Below is the code of the list definition; look at the Type number: 10314.

Code Snippet
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
    <ListTemplate
        Name="NewsItemList"
        Type="10314"
        BaseType="0"
        OnQuickLaunch="TRUE"
        SecurityBits="11"
        Sequence="410"
        DisplayName="NewsItemList"
        Description="My List Definition"
        Image="/_layouts/images/itgen.png"/>
</Elements>

 

That is the same number that we set the contentclass property to match: STS_ListItem_10314. This means that only items from that list, the news list, will match the scope, which makes it possible to restrict a search to the News scope in the Advanced Search web part. In order for that to happen, we also need to include this scope in the Advanced Search display group, as sketched below.
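
As a sketch, creating the News scope and its rule programmatically could look like this (the scope name, description, results page and display group lookup are assumptions based on the description above, not the project's actual code):

Code Snippet

SearchContext context = SearchContext.GetContext(site);
Scopes scopes = new Scopes(context);
Scope newsScope = scopes.AllScopes.Create("News", "News items", null, true, "results.aspx", ScopeCompilationType.AlwaysCompile);
//restrict the scope to items of the custom news list definition (Type 10314)
ManagedProperty contentClass = new Schema(context).AllManagedProperties["contentclass"];
newsScope.Rules.CreatePropertyQueryRule(contentClass, "STS_ListItem_10314");
//include the scope in the Advanced Search display group so the web part can offer it
foreach (ScopeDisplayGroup group in scopes.GetDisplayGroupsForSite(new Uri(site.Url)))
{
    if (group.Name == "Advanced Search")
    {
        group.Add(newsScope);
        group.Update();
    }
}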

Conclusion

The code for the complete customization is too extensive to explain everything here, but the idea was to show how we can make use of the existing Search Center site templates and perform different customizations depending on the project’s requirements. So with little effort we can profit from the OOB advanced search and search results pages that Sharepoint provides.

09 June 2012

Where is my Sharepoint 2010 Custom Timer Job running?

When building a custom timer job for Sharepoint 2010, special attention should be paid to where the job needs to run. In a farm environment, we can choose to run the job on all servers, on one particular server, only on the front-end servers, etc. The timer job implementation and installation approach will change depending on our requirements, so we should decide where we want it to run in the first place.

All Sharepoint timer jobs ultimately inherit from the SPJobDefinition class. This class provides 3 constructors:

  • SPJobDefinition(): Default constructor needed for serialization purposes.
  • SPJobDefinition(String, SPService, SPServer, SPJobLockType): Instantiates a timer job associated with the given SPService.
  • SPJobDefinition(String, SPWebApplication, SPServer, SPJobLockType): Instantiates a timer job associated with the given SPWebApplication.

The first constructor is required for serialization and is for internal use only. One of the other two constructors will be invoked from our custom timer job constructor. The parameters passed to it will define where the timer job will run.
Here is sample code for a custom timer job definition:
[Guid("{62FF3B87-654E-41B8-B997-A1EA6720B127}")]
class MyTimerJob1 : SPJobDefinition
{
    public MyTimerJob1()
        : base()
    { }

    public MyTimerJob1(string name, SPService service, SPServer server, 
        SPJobLockType lockType) : base(name, service, server, lockType)
    { }

    public MyTimerJob1(string name, SPWebApplication webApplication, SPServer server, 
        SPJobLockType lockType) : base(name, webApplication, server, lockType)
    { }

    public override void Execute(Guid targetInstanceId)
    {
        //Execute Timer Job Tasks
    }
}

Besides the required default constructor, we need to provide at least one of the other two constructors. Depending on which constructor we use, the timer job definition is associated either with a service or with a web application. It can also be associated with a particular server in the farm, and with a lock type. So, first of all, for a particular server to be eligible to run the job, it must be provisioned with the service or web application associated with the job. Then, if a particular server is passed to the constructor, the job will run only on that server (and only if that server has the associated service or web application; otherwise the job won’t run at all). If no server is associated, the job will run on one or many servers depending on the lock type.

The SPJobLockType enumeration can take one of the following values:
  • None: Provides no locks. The timer job runs on every machine in the farm on which the parent service is provisioned, unless the job is associated with a specified server, in which case it runs only on that server (and only if the parent service is provisioned on it).
  • ContentDatabase: Locks the content database. A timer job runs one time for each content database associated with the Web application.
  • Job: Locks the timer job so that it runs only on one machine in the farm.
So, if we instantiate a timer job passing null as the associated server and None as the lock type, we can expect it to run on every machine in the farm on which the parent service is provisioned. If we passed an SPService to the constructor, we know which service we are talking about, and therefore on which servers it is provisioned. But if we passed an SPWebApplication to the constructor, on which servers will the job run? The answer is on every web front-end server, that is, the servers where the Web Application service is running.

Remember that the different server roles that we can find on a Sharepoint farm are:
  • Database Server: the server that hosts the Microsoft SQL Server database for the farm. Since Sharepoint Foundation is not typically installed on this server, no jobs will run here.
  • Web Front End Server: server where the Microsoft SharePoint Foundation Web Application service is running on.
  • Application Server: Any other Sharepoint server.
Here are a couple of examples of where the jobs will run depending on the parameters passed to the constructor:

//Job associated with a web app, no server in particular and none lock:
//  will run on all front-end servers.
var jobRunningOnAllFrontEndServers = new MyTimerJob1("mytimerjob",
    SPWebApplication.Lookup(webAppURI), null, SPJobLockType.None);

//Job associated with a web app, one front-end server and job lock:
//  will run only on the frontEndServer1 server.
var jobRunningOnAParticularFrontEndServer = new MyTimerJob1("mytimerjob",
    SPWebApplication.Lookup(webAppURI), frontEndServer1, SPJobLockType.Job);

//Job associated with a web app, an app server and job lock:
//  it won't run on any server, since the server specified is NOT running
//  the Web Application Service.
var jobRunningOnNoServer = new MyTimerJob1("mytimerjob",
    SPWebApplication.Lookup(webAppURI), appServer1, SPJobLockType.Job);

//Job associated with the timer service, a particular app server and none lock:
//  will run on the appServer1 server only.
var jobRunningOnAppServer = new MyTimerJob1("mytimerjob",
    SPFarm.Local.TimerService, appServer1, SPJobLockType.None);
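
Finally, the job needs to be registered so the timer service picks it up. A minimal sketch of the installation from a web-application-scoped feature receiver (the job name follows the examples above, and the 15-minute schedule is just an example):

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPWebApplication webApp = properties.Feature.Parent as SPWebApplication;
    //delete any previous instance of the job (collect first, since we
    //cannot modify the collection while iterating it)
    var existingJobs = new List<SPJobDefinition>();
    foreach (SPJobDefinition job in webApp.JobDefinitions)
    {
        if (job.Name == "mytimerjob")
            existingJobs.Add(job);
    }
    foreach (SPJobDefinition job in existingJobs)
    {
        job.Delete();
    }
    //register the job to run on all front-end servers, every 15 minutes
    var newJob = new MyTimerJob1("mytimerjob", webApp, null, SPJobLockType.None);
    newJob.Schedule = new SPMinuteSchedule { BeginSecond = 0, EndSecond = 59, Interval = 15 };
    newJob.Update();
}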

Using Subclasses

There are some other classes in the Sharepoint Object Model that inherit from SPJobDefinition and that our custom timer jobs can derive from. For example:
  • SPContentDatabaseJobDefinition: This job is executed by all WFE servers in the farm. Each content database is processed by only one job so that work is distributed across all the running jobs.
  • SPFirstAvailableServiceJobDefinition: An abstract base class for a timer job that will be run on the first available server where the specified service is provisioned.
  • SPServerJobDefinition: This job definition is executed on a specific server within the SharePoint farm.
  • SPServiceJobDefinition: A timer job that runs on every server in the farm where the service is provisioned.
So, for example, if you need a job to run on all servers (including the app servers), it is better to derive directly from the SPServiceJobDefinition class, and if you need a job to run on one particular app server, to derive from SPServerJobDefinition.

27 May 2012

Azure Flavor for the Sharepoint Media Component

Some time ago I wrote about a Media Processing Component for Sharepoint that I was working on. It is a Media Assets list for Sharepoint that lets you choose where to store the blob files. It also provides intelligence for encoding videos, generating thumbnail and poster images, obtaining media metadata, etc. That first post explained the component in detail, with the original three storage flavors: Sharepoint list, Virtual Directory or FTP. The storage manager is extensible, so a new flavor was added later that allowed storing the media files in the cloud, in particular on Amazon S3. Today I’m going to talk about a new cloud flavor based on Azure Blob Storage.

Windows Azure Blob Storage is a service for storing large amounts of unstructured data that can be accessed from anywhere in the world via HTTP or HTTPS. It is especially suited for unstructured text or binary data such as video, audio and images. You can find documentation here on how to create a Windows Azure Storage Account and access it programmatically.

For the Media Processing Component I just needed to create a class (AzureStorageManager) that implements the IAssetStorageManager interface; a sketch of its shape follows the list below. The interface has the methods to save and delete files from the external storage. The class also needs four configuration parameters to be passed to its constructor:

  • AccountName: Name of the account that will be used to authenticate to the Blob Storage
  • AccountKey: Key that will be used to authenticate to the Blob Storage
  • BlobEndpoint: URL of the Blob endpoint, like http://someaccount.blob.core.windows.net
  • Container: Name of a container to store your application’s media files.
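
The original post does not show the interface itself, but from the description it could look like this minimal sketch:

Code Snippet IAssetStorageManager interface (sketch)

public interface IAssetStorageManager
{
    //saves the file to the external storage and returns its URL
    string Save(System.IO.FileInfo file);
    //deletes a previously saved file, given its URL
    void Delete(string fileUrl);
}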

For example, this is the code required to upload a file to an Azure Blob Storage:

Code Snippet AzureStorageManager class - Save method

public string Save(System.IO.FileInfo file)
{
    //Create service client for credentialed access to the Blob service.
    CloudBlobClient blobClient = new CloudBlobClient(blobEndpoint, new StorageCredentialsAccountAndKey(accountName, accountKey));
    //Get a reference to a container, which may or may not exist.
    CloudBlobContainer container = blobClient.GetContainerReference(containerAddress);
    //Create a new container, if it does not exist
    container.CreateIfNotExist();
    //Get a reference to a blob, which may or may not exist.
    CloudBlob blob = container.GetBlobReference(file.Name);
    //Upload content to the blob, which will create the blob if it does not already exist.
    blob.UploadFile(file.FullName);
    return blob.Uri.ToString();
}
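
The Delete counterpart would be similar; here is a sketch (assuming deletion by blob URL, matching the interface sketch above):

Code Snippet AzureStorageManager class - Delete method (sketch)

public void Delete(string fileUrl)
{
    //Create service client for credentialed access to the Blob service.
    CloudBlobClient blobClient = new CloudBlobClient(blobEndpoint, new StorageCredentialsAccountAndKey(accountName, accountKey));
    //Get a reference to the existing blob and delete it if present.
    CloudBlob blob = blobClient.GetBlobReference(fileUrl);
    blob.DeleteIfExists();
}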

You can download the complete code for the Media Component with all the storage options from GitHub.

20 March 2012

Sharepoint Guidance Logger: usage, setup and extension

Introduction

Log records are essential to any application for troubleshooting problems. When beginning a new Sharepoint project, one of the first needs is to have a core logging component that can be used throughout the application code. In this post I will talk about the logging solution that we are using (based on the Patterns & Practices Sharepoint Logger), how to set it up, configure and read logs.

SharePoint 2010 includes enhanced functionality for logging and tracing. You can now throttle reporting to the Windows Event Log and the Office Server Unified Logging Service (ULS) trace logs by area and by category. Areas represent broad regions of SharePoint functionality, such as Access Services, Business Connectivity Services, and Document Management Server. Within each area, categories define more specific aspects of functionality. In our solution we use a single Area for the whole project’s custom code, and different categories inside it for the different components, like Caching, UserProfile, News, Workflows, etc…

In addition to areas and categories, SharePoint 2010 makes substantial improvements to event correlation. Every trace log entry now includes a correlation ID that identifies all the events that correspond to a particular action, such as a user uploading a document. Administrators can use tools such as the ULS Viewer to filter the trace logs by correlation ID. The Sharepoint error page shows the user the date and correlation ID; it is important to capture this information when an error occurs, to make it easier for administrators to track down a particular issue.

The Framework

In the project I’m working on we are using a common framework for logging, which is based on the Patterns & Practices Sharepoint Logger, included in the SharePoint Guidance Library. The P&P logger provides support for writing to the Windows Event Log and the ULS trace logs, and for creating your own areas and categories and using them when you write to the event log or the trace logs. Two external assemblies are needed to use it: Microsoft.Practices.ServiceLocation.dll and Microsoft.Practices.SharePoint.Common.dll.

Examples of how to call the logger through code can be found in the SP logger documentation, but just as a sample, it looks something like this:

P&P Logger usage
// Log an event with a message, an event ID, a severity level, and a category.
string area = "Custom Area";
string category = "Execution";
string areaCategory = string.Format("{0}/{1}", area, category);
logger.LogToOperations(msg, (int) EventLogEventId.MissingPartnerID, EventSeverity.Error, areaCategory);


Our logging component adds a small utility layer on top of the P&P logger to simplify these lines a little more. For example, as we have our custom area and categories predefined, we included them in an enumeration and use the enum values as parameters to the LogToOperations method. This way developers don’t need to type the area and category names every time, which can lead to a lot of errors. It also provides the option to add parameters to the message; the String.Format that combines the message with the parameters happens inside the utility class, which saves the developers some repeated code as well. Regarding the log event IDs, they are also set automatically by the utility class, using a different number for each category, starting at 5000.
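
For reference, the area/category enumerations and the ToLoggerString helper used below might look like this (a sketch; the category names are just examples consistent with this post):

Custom categories sketch

public enum Areas { MyCustomArea }

public enum MyCustomCategories { Caching, UserProfile, News, Workflows, Test }

public static class MyCustomCategoriesExtensions
{
    //formats the category as "Area/Category", as expected by the P&P logger
    public static string ToLoggerString(this MyCustomCategories category)
    {
        return String.Format("{0}/{1}", Areas.MyCustomArea, category);
    }
}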



This leads to the logger utility having two overloads of the LogToOperations method, one for logging messages and the other for logging exceptions:

Custom logger utility

public class Logger
{
    private ILogger logger = SharePointServiceLocator.GetCurrent().GetInstance<ILogger>();

    public void LogToOperations(MyCustomCategories category, EventSeverity severity, string message, params object[] args)
    {
        try
        {
            logger.LogToOperations(String.Format(message, args), GetEventId(category),
                severity, category.ToLoggerString());
        }
        catch
        {
            //don't want the app to fail because of failures in logging
        }
    }

    public void LogToOperations(Exception ex, MyCustomCategories category, EventSeverity severity, string message, params object[] args)
    {
        try
        {
            logger.LogToOperations(ex, String.Format(message, args), GetEventId(category),
                severity, category.ToLoggerString());
        }
        catch
        {
            //don't want the app to fail because of failures in logging
        }
    }

    private int GetEventId(MyCustomCategories category)
    {
        return 5000 + (int)category;
    }
}



And this is a simple example of usage:

Usage example

static void Main(string[] args)
{
    string user = "Peter John";
    int id = 3;
    Logger logger = new Logger();
    logger.LogToOperations(MyCustomCategories.Test, Microsoft.SharePoint.Administration.EventSeverity.Verbose,
        "Testing logging with user '{0}' and id '{1}'", user, id);

    try
    {
        string firstArg = args[0];
    }
    catch (Exception ex)
    {
        logger.LogToOperations(ex, MyCustomCategories.Test, Microsoft.SharePoint.Administration.EventSeverity.Error,
            "Error occurred in test program for user '{0}' and id '{1}'", user, id);
    }
}

Setup and Configuration

In order for all this to work, we first need to create our custom diagnostic area and categories. We do it inside a feature event receiver. The code adds the custom area and then iterates through the categories enum to add all the associated categories. The advantage is that when a developer needs a new category, he just adds the value to the enum and the feature will automatically add it to Sharepoint. At the moment of creating the categories, the severity level must be set. We are setting all of them to Verbose, which is the most useful at the development stage, but this should be changed after deployment, especially in production. A level of Warning is recommended to save disk space and increase the general performance of the application.
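
The feature event receiver itself just delegates to the installer helper on activation and deactivation; a minimal sketch (the receiver class name is hypothetical):

Feature event receiver (sketch)

public class LoggingFeatureEventReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        LoggingInstallerHelper.AddCustomAreaAndCategories();
    }

    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        LoggingInstallerHelper.RemoveCustomAreaAndCategories();
    }
}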




Installer helper called by the feature event receiver

class LoggingInstallerHelper
{
    public static void AddCustomAreaAndCategories()
    {
        DiagnosticsArea customArea = new DiagnosticsArea(Areas.MyCustomArea.ToString());
        var categories = Enum.GetNames(typeof(MyCustomCategories));
        foreach (var category in categories)
        {
            customArea.DiagnosticsCategories.Add(new DiagnosticsCategory(category, EventSeverity.Verbose, TraceSeverity.Verbose));
        }
        AddArea(customArea);
    }

    public static void RemoveCustomAreaAndCategories()
    {
        RemoveArea(Areas.MyCustomArea.ToString());
    }

    #region Private Methods

    private static void AddArea(DiagnosticsArea newArea)
    {
        DiagnosticsAreaCollection areas = GetCurrentAreas();
        var existingArea = areas.FirstOrDefault(area => area.Name == newArea.Name);
        if (existingArea == null)
        {
            areas.Add(newArea);
        }
        else
        {
            int index = areas.IndexOf(existingArea);
            foreach (DiagnosticsCategory item in newArea.DiagnosticsCategories)
            {
                var existingCateg = areas[index].DiagnosticsCategories.FirstOrDefault(categ => categ.Name == item.Name);
                if (existingCateg == null)
                {
                    areas[index].DiagnosticsCategories.Add(item);
                }
            }
        }
        areas.SaveConfiguration();
    }

    private static DiagnosticsAreaCollection GetCurrentAreas()
    {
        IServiceLocator serviceLocator = SharePointServiceLocator.GetCurrent();
        IConfigManager config = serviceLocator.GetInstance<IConfigManager>();
        DiagnosticsAreaCollection areaCollection = new DiagnosticsAreaCollection(config);
        return areaCollection;
    }

    private static void RemoveArea(string areaName)
    {
        DiagnosticsAreaCollection areas = GetCurrentAreas();
        if (ExistArea(areas, areaName))
        {
            while (areas[areaName].DiagnosticsCategories.Count != 0)
            {
                areas[areaName].DiagnosticsCategories.Clear();
            }
            areas.RemoveAt(areas.IndexOf(areas[areaName]));
            areas.SaveConfiguration();
        }
    }

    private static bool ExistArea(DiagnosticsAreaCollection collection, string areaName)
    {
        foreach (DiagnosticsArea item in collection)
        {
            if (item.Name.Trim().ToUpper() == areaName.Trim().ToUpper())
                return true;
        }
        return false;
    }

    #endregion
}



To verify the proper installation of the custom area and categories, and to modify their severity levels, you can go to the Admin Site -> Monitoring -> (Reporting) Configure diagnostic logging. In the Event Throttling section, select the desired area or particular categories, change the event and trace levels, and click OK.

[Figure: Diagnostic logging configuration page]

You can also configure the trace log folder path (the default is C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS\), as well as the retention limits in days and disk space, on the same page under the Trace Log section.



Viewing logs

You can see the log messages in the Event Viewer, and the trace logs using the ULS Viewer.

In the Event Viewer you will find the custom events under Windows Logs / Application. You can filter by Source = MyCustomArea, by severity and/or by Event ID.

[Figure: Event Viewer showing events from the custom area]

You can also find the trace logs in the ULS log files, and use a tool like the ULS Viewer for a friendlier interface and filtering capabilities. The ULS Viewer is very useful for filtering by correlation ID.

[Figure: ULS Viewer displaying trace logs]

These are the basic tools. There are also tools for viewing log information remotely, or for aggregating log entries in a farm environment.