
Wednesday, November 20, 2013

Scheduled Task to Add Colleagues in SharePoint

Here is the use case in SharePoint 2010/2013: HR wants all employees to add some key people in the organization as colleagues, so that everyone inside the company will be able to see those people's posts and feeds. A simple PowerShell script can do the job:

## Add SharePoint Snap-in
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

## Get profiles
$site = Get-SPSite("https://my_company_portal")
$context = Get-SPServiceContext $site
$profilemanager = New-object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
 
## People who will be the common colleagues for all employees
$vipNames = "domain\user1", "domain\user2", "domain\user3"
$vipMembers = @()
foreach($vipName in $vipNames) {
    try {      
        $vipUser = $profilemanager.GetUserProfile($vipName)
        $vipMembers += $vipUser
    } catch {
        write-output $_.Exception.Message
    }
}
 
## Colleague setting
$group = "General"
$groupType = [microsoft.office.server.userprofiles.colleaguegrouptype]::General
$privacy = [microsoft.office.server.userprofiles.privacy]::Public
$isInWorkGroup = $false
 
## Go through all users and add colleagues
$profiles = $profilemanager.GetEnumerator()
foreach($user in $profiles) {
    foreach($vipMember in $vipMembers) {
        if (($user.ID -ne $vipMember.ID) -and !($user.Colleagues.IsColleague($vipMember.ID))) {
            try {      
                $user.Colleagues.CreateWithoutEmailNotification($vipMember, $groupType, $group, $isInWorkGroup, $privacy)
            } catch {
                write-output $_.Exception.Message
            }
        }
    }
}
How about new hires? We can set up a scheduled task to run the PowerShell script so the profile update is applied weekly or daily. Open Task Scheduler from Administrative Tools or Control Panel and create a Task with the following settings (a scripted alternative is shown after the list):
  • General tab:
    • Name: "Add colleagues"
    • Make sure the account running the task has sufficient permission to update the SharePoint Profile database
    • Check (enable) "Run whether user is logged on or not"
    • Check (enable) "Run as highest privileges"
  • Triggers tab:
    • Create a new trigger: Weekly or daily at midnight
  • Actions tab:
    • Select "Start a program" type
    • Program/script: powershell
    • Add arguments (optional): -File "C:\SharePointScheduledTasks\addColleagues.ps1"
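
If you'd rather script the task creation than click through the UI, schtasks.exe can register the same task from an elevated prompt. A sketch, where the account name and script path are placeholders:

    schtasks /Create /TN "Add colleagues" /SC WEEKLY /D SUN /ST 00:00 /RL HIGHEST /RU domain\svc_sharepoint /RP * /TR "powershell.exe -File C:\SharePointScheduledTasks\addColleagues.ps1"

The /RP * switch prompts for the password, so it isn't left in the shell history.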

In case you don't like PowerShell, the following code is the equivalent C# version doing the exact same task:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
 
using Microsoft.SharePoint;
using Microsoft.Office.Server;
using Microsoft.Office.Server.UserProfiles;
 
class Program
{
    static void Main(string[] args)
    {
        UpdateUsersProfile("https://my_company_portal");
    }
 
    private static void UpdateUsersProfile(string siteUrl)
    {
        using (SPSite spSite = new SPSite(siteUrl))
        {
            SPServiceContext context = SPServiceContext.GetContext(spSite);
            UserProfileManager profileManager = new UserProfileManager(context);
 
            List<UserProfile> vipMembers = GetVipMembers(profileManager);
 
            foreach (UserProfile profile in profileManager)
            {
                AddColleagues(profile, vipMembers);
            }
        }
    }
 
    private static List<UserProfile> GetVipMembers(UserProfileManager profileManager)
    {
        List<string> vipUserNames = new List<string>() { "domain\\user1", "domain\\user2", "domain\\user3" };
        List<UserProfile> vipMembers = new List<UserProfile>();
 
        foreach (string name in vipUserNames)
        {
            try
            {
                UserProfile vipMember = profileManager.GetUserProfile(name);
                vipMembers.Add(vipMember);
            }
            catch (Exception ex)
            {
                // User doesn't exist; skip it
            }
        }
        return vipMembers;
    }
 
    private static void AddColleagues(UserProfile user, List<UserProfile> colleagues)
    {
        foreach (UserProfile colleague in colleagues)
        {
            if (user.ID != colleague.ID && !user.Colleagues.IsColleague(colleague.ID))
            {
                user.Colleagues.CreateWithoutEmailNotification(colleague, 
                    ColleagueGroupType.General, "General", false, Privacy.Public);
            }
        }
    }
}

Friday, September 06, 2013

Page Buttons Not Responding After Getting PDF in SharePoint

Today I have been working on a SharePoint 2010 WebPart in which the user can click a button to get a PDF report. The open-source iTextSharp library is used to generate the PDF report. The code is quite straightforward:

        void GeneratePDF()
        {
            Document doc = new Document(PageSize.A4);
            MemoryStream pdfStream = new MemoryStream();
            PdfWriter.GetInstance(doc, pdfStream);
            doc.Open();
            // Populate document with business data
            doc.Close();

            Response.Clear();
            Response.ClearHeaders();
            Response.ContentType = "application/pdf";
            Response.AddHeader("Content-Disposition", "attachment;filename=report.pdf");
            Response.BinaryWrite(pdfStream.ToArray());
            Response.Flush();
            Response.End();   
        }

That PDF function works fine, but all other buttons on the same page stop responding (no postback occurs) after the PDF button is clicked. This behavior only happens in a SharePoint environment; everything is okay in a regular ASP.NET page. It looked like some special validation in SharePoint was causing the problem. I debugged into the JavaScript and found the setting of the "_spFormOnSubmitCalled" variable is the culprit.

ASP.NET validation process triggered by the click of a button includes invocation of JavaScript function called WebForm_OnSubmit. SharePoint overrides this function for each page:

<script type="text/javascript">
//<![CDATA[
    if (typeof(Sys) === 'undefined') 
        throw new Error('ASP.NET Ajax client-side framework failed to load.');
    if (typeof(DeferWebFormInitCallback) == 'function') 
        DeferWebFormInitCallback();
    function WebForm_OnSubmit() {
        UpdateFormDigest('webUrl..', 1440000);
        if (typeof(vwpcm) != 'undefined') {
            vwpcm.SetWpcmVal();
        };
        return _spFormOnSubmitWrapper();
    }
//]]>
</script>

The JavaScript function _spFormOnSubmitWrapper is defined in /_layouts/1033/init.js:

function _spFormOnSubmitWrapper() {
    if (_spSuppressFormOnSubmitWrapper)
    {
        return true;
    }
    if (_spFormOnSubmitCalled)
    {
        return false;
    }
    if (typeof(_spFormOnSubmit) == "function")
    {
        var retval = _spFormOnSubmit();
        var testval = false;
        if (typeof(retval) == typeof(testval) && retval == testval)
        {
            return false;
        }
    }
    _spFormOnSubmitCalled=true;
    return true;
}

The "_spFormOnSubmitCalled" field is false by default when the page is loaded. It's set to true when you click a button on the page. This machanism ensures only the first button click will take action and prevents other clicks from posting back to the server. The "_spFormOnSubmitCalled" field is reset to false once the page is reloaded. A postback will usually result in a page reloading, but not in above PDF out case where the server writes the PDF attachment to the client then ends the interaction. So the "_spFormOnSubmitCalled" field remains true which blocks any future postback.

So theoretically the issue is not limited to PDF output: directly writing to and ending the Response object on the server side results in the same problem. There are a few approaches to resolve it:

  • 1. Reset "_spFormOnSubmitCalled" to false after the PDF button is clicked. Note that the reset timing is important, and it must be later then the submission process (after the WebForm_OnSubmit method is called), for example:
       function resetSharePointSubmitField() {
            setTimeout(function () { _spFormOnSubmitCalled = false; }, 1000); // set the field after 1 second
            return true;
        }
  • 2. Override the WebForm_OnSubmit function and make it always return true:
        function resetSharePointSubmitField() {
            window.WebForm_OnSubmit = function() {return true;};
        }

Apply the JavaScript to a button:

    <asp:Button ID="btnGeneratePDF" runat="server" Text="Get PDF" OnClientClick="resetSharePointSubmitField();" />

The other option is to add a client-side script "javascript: _spFormOnSubmitCalled = false;" to all buttons on the page, but that is not scalable and not recommended.

Bonus tip: the regular PDF export via Response.Write() won't work inside a modal dialog (a popup window opened by window.showModalDialog() from JavaScript). To resolve this particular problem, you can replace the PDF export button with a hyperlink and set its target to an empty iframe:

    <a id="popupPDF" target="pdfTarget" runat="server">Get PDF</a>
    <iframe name="pdfTarget" id="pdfTarget" width="0" height="0" style="display:none;"></iframe>

Then simply assign the anchor a query string that tells the server to generate the PDF:

    protected void Page_Load(object sender, EventArgs e)
    {
        // Use "?" when the current URL has no query string yet, "&" otherwise
        string separator = string.IsNullOrEmpty(Request.Url.Query) ? "?" : "&";
        popupPDF.HRef = Request.Url.AbsoluteUri + separator + "pdf=true";
        if (Request.QueryString["pdf"] == "true")
            GeneratePDF();
    }

Monday, June 17, 2013

Proxy Page for New Twitter API 1.1

The old Twitter API v1.0 has just been retired. The new API v1.1 requires OAuth authentication for each Twitter request, and our SharePoint Twitter WebPart stopped working after this API update. I created a SharePoint layout page that acts as a proxy to Twitter using the new API, so the Twitter services are available on our Intranet without the troublesome OAuth authentication. The proxy page also caches results locally for performance and to avoid exceeding the new Twitter rate limits:

<%@ Page Language="C#"  %>

<%@ Import Namespace="System.Web.UI" %>
<%@ Import Namespace="System.Collections.Generic" %>
<%@ Import Namespace="System.Globalization" %>
<%@ Import Namespace="System.Security.Cryptography" %>
<%@ Import Namespace="System.Net.Security" %>
<%@ Import Namespace="System.Net" %>
<%@ Import Namespace="System.IO" %>
<%@ Import Namespace="System.Text" %>

<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <title>A proxy page for Twitter API V1.1</title>
    <meta name="ROBOTS" content="NOINDEX, NOFOLLOW">
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
    <meta http-equiv="Content-language" content="en">

    <script runat="server">
            
        // oauth application keys
        string consumerKey = "xxxxxxxx";
        string consumerSecret = "xxxxxxxx";
        string accessToken = "xxxxxxxx";
        string accessTokenSecret = "xxxxxxxx";

        int cacheTime = 5; // Cache time in minutes
        string baseUrl = "https://api.twitter.com/1.1/";        
                
        void Page_Load(object sender, System.EventArgs e)
        {
            // handle parameters
            bool noCache = Request.QueryString["nocache"] == "true" ? true : false;
            string endpoint= string.IsNullOrEmpty(Request.QueryString["endpoint"]) ? "statuses/user_timeline.json" : Request.QueryString["endpoint"];
            string[] resStrings = new string[] { "endpoint", "nocache", "callback", "_" };
            List<string> reservedParameters = new List<string>(resStrings); 
            
            Dictionary<string, string> parameters = new Dictionary<string, string>();
            StringBuilder keys = new StringBuilder();
            foreach (String key in Request.QueryString.AllKeys)
            {
                if (!reservedParameters.Contains(key) && !parameters.ContainsKey(key)) 
                {
                    parameters.Add(key, Request.QueryString[key]);
                    keys.Append(key + "=" + Request.QueryString[key] + "|");
                }
            }
            string cacheKey = keys.ToString();
            if (string.IsNullOrEmpty(cacheKey)) // simply return if no parameter provided 
            {
                lblInfo.Text = "Invalid parameters";
                return;            
            }     
            
            string tweets = Convert.ToString(Cache[cacheKey]);
            if (noCache || string.IsNullOrEmpty(tweets)) // check if cache exists
            {
                string requestUrl = baseUrl + endpoint;
                try
                {
                    tweets = GetTweets(requestUrl, parameters);
                    // Update cache
                    Cache.Insert(cacheKey.ToString(), tweets, null, DateTime.Now.AddMinutes(cacheTime), System.Web.Caching.Cache.NoSlidingExpiration);
                }
                catch (Exception ex)
                {
                    lblInfo.Text = "Error occur: " + ex.Message;
                    return;
                }
            }
            
            // prepare for writing data to the Response
            Response.Clear();
            Response.ContentType = "application/json; charset=utf-8";
            Response.ContentEncoding = System.Text.Encoding.UTF8;
            if (!string.IsNullOrEmpty(Request.QueryString["callback"])) // wrap data for JSONP
                Response.Write(string.Format("{0}({1})", Request.QueryString["callback"], tweets));
            else
                Response.Write(tweets);
            Response.End();
        }

        // Reference: https://dev.twitter.com/discussions/15206
        string GetTweets(string url, Dictionary<string, string> parameters)
        {
            string responseString = string.Empty;
            StringBuilder queryStrings = new StringBuilder();

            // OAuth setting
            string oauthSignatureMethod = "HMAC-SHA1";
            string oauthVersion = "1.0";
            string oauthNonce = Convert.ToBase64String(new ASCIIEncoding().GetBytes(DateTime.Now.Ticks.ToString()));
            TimeSpan timeSpan = DateTime.UtcNow - new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc);
            string oauthTimestamp = Convert.ToInt64(timeSpan.TotalSeconds).ToString();
            string compositeKey = Uri.EscapeDataString(consumerSecret) + "&" + Uri.EscapeDataString(accessTokenSecret);

            // OAuth signature: https://dev.twitter.com/docs/auth/creating-signature
            string oauthSignature;
            SortedList<string, string> authSigBaseValues = new SortedList<string, string>();
            authSigBaseValues.Add("oauth_consumer_key", consumerKey);
            authSigBaseValues.Add("oauth_nonce", oauthNonce);
            authSigBaseValues.Add("oauth_signature_method", oauthSignatureMethod);
            authSigBaseValues.Add("oauth_timestamp", oauthTimestamp);
            authSigBaseValues.Add("oauth_token", accessToken);
            authSigBaseValues.Add("oauth_version", oauthVersion);
            foreach (string key in parameters.Keys)
            {
                string escapedKey = Uri.EscapeDataString(key);
                string escapedValue = Uri.EscapeDataString(parameters[key]);
                authSigBaseValues.Add(escapedKey, escapedValue);
                queryStrings.Append("&" + escapedKey + "=" + escapedValue);
            }

            // build signature base string
            StringBuilder oauthSigSB = new StringBuilder();
            foreach (KeyValuePair<string, string> item in authSigBaseValues)
            {
                oauthSigSB.Append("&" + item.Key + "=" + item.Value);
            }
            string signatureBaseString = "GET&" + Uri.EscapeDataString(url) + "&" + Uri.EscapeDataString(oauthSigSB.ToString().Remove(0, 1));

            // create OAuth signature 
            using (HMACSHA1 hasher = new HMACSHA1(ASCIIEncoding.ASCII.GetBytes(compositeKey)))
            {
                oauthSignature = Convert.ToBase64String(hasher.ComputeHash(ASCIIEncoding.ASCII.GetBytes(signatureBaseString)));
            }

            // create the request header
            string headerFormat = "OAuth oauth_nonce=\"{0}\", oauth_signature_method=\"{1}\", oauth_timestamp=\"{2}\", " +
                    "oauth_consumer_key=\"{3}\", oauth_token=\"{4}\", oauth_signature=\"{5}\", oauth_version=\"{6}\"";
            string authHeader = string.Format(headerFormat,
                                    Uri.EscapeDataString(oauthNonce),
                                    Uri.EscapeDataString(oauthSignatureMethod),
                                    Uri.EscapeDataString(oauthTimestamp),
                                    Uri.EscapeDataString(consumerKey),
                                    Uri.EscapeDataString(accessToken),
                                    Uri.EscapeDataString(oauthSignature),
                                    Uri.EscapeDataString(oauthVersion)
                            );
            if (queryStrings.Length > 0)
                url = url + "?" + queryStrings.ToString().Remove(0, 1);
            ServicePointManager.ServerCertificateValidationCallback = 
                new RemoteCertificateValidationCallback(delegate { return true; });
            ServicePointManager.Expect100Continue = false;

            // create request
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.Headers.Add("Authorization", authHeader);
            request.Method = "GET";
            request.ContentType = "application/x-www-form-urlencoded";

            try
            {
                WebResponse response = request.GetResponse();
                responseString = new StreamReader(response.GetResponseStream()).ReadToEnd();
            }
            catch (Exception)
            {
                throw; // rethrow without resetting the stack trace
            }

            return responseString;
        }
    </script>

</head>
<body>
    <form id="form1" runat="server">
    <div>
        <p>
            <asp:Label ID="lblInfo" runat="server"></asp:Label>
        </p>
    </div>
    </form>
</body>
</html>

The proxy page can work as a regular ASP.NET page inside IIS. It can also run as a layout page in SharePoint: simply copy the file to the layouts folder under the SharePoint 12/14 hive, and it will just work.

The proxy accepts an endpoint and parameters, where the endpoint is the Twitter service endpoint such as "statuses/user_timeline.json" or "search/tweets.json", and the parameters are the query strings you pass to Twitter, such as "screen_name=mytwittername". The easiest way to consume the proxied Twitter service is with JavaScript, which can be inserted into a SharePoint OOB Content Editor WebPart:

<script type="text/javascript">
<!-- 
    $(document).ready(function() {
        // twitter proxy URL
        var url = 'http://twitterproxy/twitter.aspx?endpoint=statuses/user_timeline.json&screen_name=myname&count=10';
       // http://twitterproxy/twitter.aspx?endpoint=search/tweets.json&q=from:myname&result_type=recent
        $.ajax({url : url, cache: false, crossDomain: true, dataType: 'jsonp'}).done(function(data) {
            $('#tweeters').html("");
            $.each(data, function(i, tweet) {
                if(tweet.text) {
                    var date = parseDate(tweet.created_at);
                    var tweet_html = '<div><div class="tweetText">' + tweet.text + '</div>';
                    tweet_html += '<div class="tweetDate">'+ date.toString().substring(0, 24) +'</div></div>';
                    $('#tweeters').append(tweet_html);
                }
            });
        });
    });
    
    // Fix IE DateTime format issue
    function parseDate(str) {
        var v=str.split(' ');
        return new Date(Date.parse(v[1]+" "+v[2]+", "+v[5]+" "+v[3]+" UTC"));
    }
// -->  
</script>
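
You can also smoke-test the proxy outside the browser. A minimal PowerShell sketch (the proxy URL, endpoint and screen name are the placeholder values from the example above):

# Fetch the raw JSON from the proxy page (works in PowerShell 2.0)
$url = "http://twitterproxy/twitter.aspx?endpoint=statuses/user_timeline.json&screen_name=myname&count=5"
(New-Object System.Net.WebClient).DownloadString($url)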

Thursday, June 06, 2013

Using FusionCharts in SharePoint

FusionCharts provides a separate product for SharePoint called Collabion Charts for SharePoint, which lets you build charting pages easily by adding its charting WebParts. But if you have already purchased a FusionCharts license and don't want to pay extra for Collabion Charts, you still have options to show FusionCharts in SharePoint. First, you can use SharePoint as the hosting environment: put all FusionCharts assets and the charting pages into a document library, and everything should just work. You can also add FusionCharts to a page using the out-of-box Content Editor WebPart (CEWP). First reference the FusionCharts JavaScript in one CEWP (edit HTML Source):
    <script src="/Documents/FusionCharts/FusionCharts.js" type="text/javascript"></script>
Then you can add chart in another CEWP:
    <div><div id="testChartContainer"></div></div>
    <script type="text/javascript">
    <!--  
        FusionCharts.setCurrentRenderer('javascript'); 
        var testChart = new FusionCharts( { id: "chart1", type: "Column2D", width: "400", height: "250"} );  
        testChart.setXMLUrl("/Documents/testChart.xml");      
        testChart.render("testChartContainer");
    // -->      
    </script>



Note: SharePoint 2010 may alter the content you add to the CEWP via the HTML Source editor. Make sure the updated content and script are still valid.

Wednesday, November 28, 2012

Disable Email Notification When Adding Colleague in SharePoint 2010

By default in SharePoint 2010, an email notification is sent out when one user adds another user as a colleague. This may not be a good idea for a big company. Consider the case where thousands of employees add the CEO as a colleague so they can follow the CEO's news feed: that would be a huge amount of spam for the CEO. Of course, anyone can go to his/her mysite and change that email notification option:



IT folks have no problem figuring all that out, but most business people in a company are not technical; they are busy with other work and have no time or desire to explore the tons of options available in SharePoint settings. So you may get a request to disable the email notification for all employees.

The easiest way to do that is to use a PowerShell script to update each user's profile. Create a script called "CreateMysites.ps1":
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}
#[Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server")

$site = Get-SPSite "http://mysites.company.com";
$ServerContext = Get-SPServiceContext $site;
$ProfileManager = new-object Microsoft.Office.Server.UserProfiles.UserProfileManager($ServerContext);
$ProfileEnumerators = $ProfileManager.GetEnumerator();

#Go through each user profile and update its setting
foreach ($profile in $ProfileEnumerators)
{
    try {
        $AccountName = $profile[[Microsoft.Office.Server.UserProfiles.PropertyConstants]::AccountName].Value
        if ($AccountName.ToLower().StartsWith("mydomain\")) 
        {
            #Three digits map to the three options shown in the screen-shot above: 0 is ON and 1 is OFF
            #(quoted as a string so the leading zero is preserved)
            $profile["SPS-EmailOptin"].Value = "010";
            $profile.Commit();
        }
    }
    catch [Exception] {
      write-host $_.Exception.Message;
    }
} 
$site.Dispose();
The script can be executed in the SharePoint 2010 Management Shell directly. It can also be run by a Windows scheduled task; you just need to create a batch file "CreateMysites.bat" to start PowerShell:
PowerShell.exe -command C:\Schedule\CreateMysites.ps1
To set up a schedule to run the script in Windows, go to the SharePoint server => Administrative Tools => Task Scheduler, click the Create Task action in the right-hand panel, give the task a name, set the schedule to daily, weekly or whatever you need in the Triggers tab, and then add a new Action in the Actions tab:
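
Before scheduling the bulk update, it's worth spot-checking a single profile's current value from the Management Shell. A quick sketch reusing the $ProfileManager object created in the script above (the account name is a placeholder):

$testProfile = $ProfileManager.GetUserProfile("mydomain\testuser")
$testProfile["SPS-EmailOptin"].Value   # per the mapping above, "000" means all three notifications are ON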

Monday, September 17, 2012

SharePoint 2010 Email Notification With Workflow

Here's a common scenario in a SharePoint environment: you have a SharePoint custom list to store and trace some tasks or jobs, and you want the system to automatically send out email notifications based on some conditions. Let's say there's a SharePoint List with the following two columns:

1. Tickler Date: a DateTime field that triggers the workflow to send out the email notification
2. Tickler Contacts: a People field (multiple selection) where the notification email will be sent

The requirement is simple:

1. An email notification is sent to the "Tickler Contacts" whenever the "Tickler Date" is reached, provided both fields are defined.
2. The notification should be aware of changes to those two fields, so that updated "Tickler Date" and "Tickler Contacts" values are applied.

There are a few options to accomplish such a task. We can set up a Windows scheduled task to run a console app that scans the whole list daily and sends the notifications, or use a SharePoint Timer Job to do something similar, or implement a SharePoint Workflow. SharePoint Workflow is very powerful and capable of handling many SharePoint tasks easily. With SharePoint Designer, you can even work on the workflow's conditions, steps and actions visually, and apply the Workflow directly to the server without any code deployment (arguably both good and bad). The Workflow approach is discussed in this post.

SharePoint Designer Workflow has a built-in "Send an email" action. We use that as the only action step in our first notification Workflow. Because the recipients come from a multi-user People field, we need to set the Person field to be returned as delimited email addresses:


By default the BCC field is not available on the email form. However, if you highlight the send email action and click the "Advanced Properties" button on the top menu, you will see the BCC field. You can even fine-tune the formatting of the mail content, since the original body HTML can be seen and updated there:


The next step is to figure out how to start the Workflow. A list-associated Workflow can be set to start manually, or to start automatically when a list item is created or updated. The simple Workflow we defined above cannot just be set to start automatically, because we haven't defined the trigger condition yet.

If the trigger condition is based on a date/time field, we can also utilize a retention policy to start a Workflow. Go to the List's settings page and click "Information management policy settings", and you will see the following settings (the top "List Based Retention Schedule" won't be available if the "Library and Folder Based Retention" Site Collection Feature is not activated):


We are interested in the item properties, so click Item to go to the Item policy editing page, check "Enable Retention", and then add a retention stage:


The condition here is that when "Tickler Date" has passed by 0 days, i.e. "Tickler Date" is today, the workflow starts and sends the email notification. All looks good, except that requirement #2 cannot be satisfied: the evaluation of "Tickler Date" is fixed, and the retention policy won't recalculate the condition when "Tickler Date" is changed. There's no workaround for that as far as I know, so the retention policy is not an option in our case.

The solution is to let the Workflow deal with the conditions by itself, and make it start automatically when a List item is created or updated:


The Workflow pretty much explains itself. First it checks whether "Tickler Contacts" and "Tickler Date" are both valid; it simply "quits" (the WF completes) if either of them is empty or "Tickler Date" has already passed. Otherwise it continues to the further logic. A Workflow variable is set to the "Tickler Date" so the workflow later has a reference to compare against to see whether the date has changed. Then two parallel branches run simultaneously. The first branch waits until the "Tickler Date" is reached, sends the email notification, and the workflow completes its work; the second branch is pending, looking for any change to the "Tickler Date" field. When that occurs the workflow stops, and a new workflow instance is created by SharePoint using the updated "Tickler Date", because the workflow is set to auto-start when a list item is updated.
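
One caveat: an auto-start workflow only fires for items created or updated after the association exists, so items already in the list never get a workflow instance. You can kick the workflow off once for existing items from PowerShell; a rough sketch, with placeholder site, list and association names:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "https://my_company_portal/site"
$list = $web.Lists["Ticklers"]   # placeholder list name
$culture = [System.Globalization.CultureInfo]::CurrentCulture
$association = $list.WorkflowAssociations.GetAssociationByName("Tickler Notification", $culture)

# Start the workflow once on every existing item
foreach ($item in $list.Items) {
    $web.Site.WorkflowManager.StartWorkflow($item, $association, $association.AssociationData) | Out-Null
}
$web.Dispose()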

You may wonder what happens if fields other than "Tickler Date" are changed: would a new Workflow instance be created? The answer is no. SharePoint only allows one running instance of a Workflow per version. That's why we cancel the old Workflow instance when "Tickler Date" changes, so a new Workflow instance can be created and started.

You may also wonder why not simply use the general "Modified" field instead of "Tickler Date": cancel the old Workflow instance whenever the List item changes, leaving room for a new WF instance with updated data. You will see that doesn't work if you try it out. The reason is that SharePoint silently modifies the List item for some background processing, such as updating the List item's Workflow status. That changes the List item's "Modified" field, which stops the old Workflow, but SharePoint won't start a new Workflow for such internal processing.

A final note: don't use the system account to test your Workflow. It won't work! Creating or updating a list item with the system account won't trigger the associated Workflow, even though the Workflow is configured to auto-start. Under the hood, List Workflows are implemented using List Item Event Receivers, and the system account is simply ignored there.
Reference: Create a Detailed Custom Task Notification with a SharePoint Designer Workflow: http://sharepoint-videos.com/sp10creating-a-workflow-on-a-list-using-sharepoint-designer-2010

Tuesday, September 04, 2012

An "Unsocial" Bug in Socail NewsGator

NewsGator is a social networking solution for the SharePoint environment. Recently our company adopted NewsGator to make our SharePoint 2010 portal more social and Facebookish. I found an issue related to the NewsGator Community API (Social Sites 2010 Suite V2.6.615) today, and missing communication between different dev teams inside NewsGator may be the culprit.

NewsGator heavily uses the Feeds and Communities concepts to connect people and encourage employee participation. Overall those NewsGator features are great. The problem is that, as with most third-party components, it's very hard to do customization on top of the NewsGator components. The NewsGator API and its documentation are so limited that I had to use a reflector to figure out some issues to do my work.

Here is the scenario: for security and a few other considerations we don't allow end users to create NewsGator communities directly. Instead, a custom WebPart is built to create a new Community programmatically, so you have full control over it. One of the business requirements is to be able to set the NewsGator Community as Public or Private in the custom WebPart.

It sounds straightforward, just setting the Community's properties, right? But I just couldn't find the proper way to do that (the formal NewsGator API documentation is fewer than 20 pages, with just a few simple code snippets showing how to create Feeds). Thanks to the reflector I found there's a SocialContextClient class often used to get/set a SocialGroup (Community) inside backend web service calls. So I tried the following code:
    SocialSites.ApplicationServices.SocialContextClient contexClient = 
      new SocialSites.ApplicationServices.SocialContextClient();
    NewsGator.Social.Data.SocialGroup community = contexClient.GetCommunityByName("CommunityName");
    community.PrivacyLevel = NewsGator.Social.Data.SocialGroupPrivacyLevel.Private;
    community.Discoverable = false;
    contexClient.UpdateSocialGroup(community);
The PrivacyLevel and Discoverable properties are responsible for the public visibility, and their values are stored in a SocialGroup table in the backend database (NewsGator doesn't use SharePoint lists and has its own database). I could see that the code successfully updated their values in the SocialGroup table; however the created Community was still public. It was very confusing. I checked all possibilities including permission, caching, synchronization, etc., but no luck.

It turns out there's another method to get the SocialGroup (Community), using SharePointContextManagerBase:
    using (SPSite site = new SPSite(SiteUrl))
    {
        using (SPWeb web = site.OpenWeb())
        {
            SocialGroup community = 
          new NewsGator.Social.Library.Client.SharePointContextManagerBase().GetCommunity(web);
            SocialGroupPrivacyLevel privacyLevel = community.PrivacyLevel;
            bool discoverable = community.Discoverable;
        }
    }
I found that the community's settings from the above method are not the same as what we get from the first method using SocialContextClient. With the reflector I traced out that SharePointContextManagerBase actually checks the web.AllProperties[NewsGator.Social.Library.SocialGroups.WebKeys.PrivacyLevel] and web.AllProperties[NewsGator.Social.Library.SocialGroups.WebKeys.Discoverable] values to populate the privacy and discoverable values, and it's not related to the NewsGator backend database at all.

Apparently different teams at NewsGator built these two methods for their own purposes. The SharePointContextManagerBase path is used by the UI WebParts, so updating the backend database via the SocialContextClient class won't have any effect. To work around this problem, simply assign the web properties that SharePointContextManagerBase checks, and then everything works as expected:
    private void UpdateNewsGatorCommunity(SPWeb web, bool isPrivateCommunity)
    {
        if (isPrivateCommunity) // Set NewsGator Community as private
        {
            try
            {
                // Direct value assignment to avoid referencing NewsGator dlls
                web.AllProperties["ng-community-privacy"] = "2";
                web.AllProperties["ng-community-discoverable"] = "False";
                web.Update();
            }
            catch (Exception ex)
            {
                LoggingService.LogError("UpdateNewsGatorCommunity error: " + ex.Message, ex.StackTrace);
            }
        }
    }
It's a hack rather than an elegant solution, and we hope NewsGator will fix this issue in the future.

This is a perfect example of a problem caused by one piece of information being defined/stored in multiple places, and the reason for that is most likely missing communication between people and teams in a company.

Wednesday, May 23, 2012

SharePoint Issues Caused by Inconsistent Times

Today we had some problems in our SharePoint environment after a few Windows patches were installed by the operations team:

1. Missing Search Bar on main page.
2. No search results when Find an Associate is used.
3. User Audiences not working (all web parts appearing on main page).
4. Missing Navigation/Links Bar on left.

At first glance it looked like something was wrong with the application service. Looking at the SharePoint log, one specific error caught my attention:

...Exception occured while connecting to WCF endpoint: System.ServiceModel.Security.MessageSecurityException: The security timestamp is invalid because its creation time ('2012-05-23T13:53:00.168Z') is in the future...

It indicated a time-related issue in the farm. When the system admin logged in to the front-end and application servers, he found a 10-minute difference between the two servers! After syncing the time, all SharePoint issues were gone and the portal was back to normal immediately.

So the root cause was time inconsistency in the SharePoint farm, nothing related to the Windows patches. When the front-end server got a reply from the application server, it noticed the response timestamp was far ahead of its local time, so it threw a security exception and terminated further processing.

Why did that time variance happen? One major reason is that our two SharePoint front-end servers are actually virtual machines, while the application server is a physical machine. Virtual machines can lose time across hibernation or reboot if the time service is not set up properly.
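
A quick way to check for drift and force a resync is the built-in Windows Time tool, run from an elevated prompt on each server (a sketch; your farm's time source configuration may differ):

# Show this server's offset against the domain time sources
w32tm /monitor

# Force an immediate resynchronization with the configured time source
w32tm /resync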

Thursday, April 05, 2012

ThrowIfMaxHttpCollectionKeysExceeded Error When Populating WebPart

A custom WebPart deployed to the WebPart Gallery by a Feature won't show up in the available WebPart list. In order to use such a WebPart, we need to manually populate it inside the SharePoint WebPart Gallery: go to the site collection's WebPart gallery, click the "New" button on the toolbar, select the WebPart and click the "Populate Gallery" button:



It used to work fine, but all of a sudden I got an error when populating a WebPart:



This runtime error doesn't provide any hints. There are two options to see the error detail:

1. Go to SharePoint ULS logs in SharePoint_12_Or_14_Hive\LOGS and do the search.
2. Set <customErrors mode="Off" /> in SharePoint_12_Or_14_Hive\TEMPLATE\LAYOUTS\Web.config (not the WebApplication itself).

The error found in the ULS log is:

w3wp.exe (0x21BC) 0x1EB0 SharePoint Foundation Runtime tkau Unexpected System.InvalidOperationException: Operation is not valid due to the current state of the object. at System.Web.HttpValueCollection.ThrowIfMaxHttpCollectionKeysExceeded() at System.Web.HttpValueCollection.FillFromEncodedBytes(Byte[] bytes, Encoding encoding) at System.Web.HttpRequest.FillInFormCollection() d5fef840-8ad4-46a3-ab11-0b37346e7a9f

It turns out the submit form has too many post values. Adding the following appSetting to the WebApplication's web.config (C:\inetpub\wwwroot\wss\VirtualDirectories\{WebApplication}\Web.config) resolves the problem:
 <appSettings>
     <add key="aspnet:MaxHttpCollectionKeys" value="2000" />
 </appSettings>
As the key name suggests, it's an ASP.NET-related setting. Why did the error suddenly happen in SharePoint? We had installed some Microsoft and SharePoint security patches recently. The default maximum of 500 post values was introduced by those patches, specifically the critical MS11-100 patch.

Thursday, March 08, 2012

A Lesson Learned From A SharePoint Patch Installation

A busy morning started with tons of reported issues on our SharePoint 2010 portal: my sites, audiences, and some navigation were not working... Opening SharePoint Central Admin, we first noticed an error "User Profile Application's connection is currently not available.." in the user profile service:


Inside the profile synchronization service we saw "An error has occurred while accessing the SQL Server database or the SharePoint Server Search service. If this is the first time you have seen this message, try again later. If this problem persists, contact your administrator.":


It looked like something was wrong with the database connection, but there had been no recent change to the SQL database or service accounts. From the SharePoint log we noticed many unexpected errors from the User Profile Web Services:

w3wp.exe (0x0B3C) 0x0268 SharePoint Portal Server User Profiles eh0u Unexpected ProfilePropertyService.GetProfileProperties Exception:
System.MissingMethodException: Method not found: 'Microsoft.SharePoint.Administration.SPIdentifierType Microsoft.SharePoint.Administration.SPAce`1.get_BinaryIdType()'. at
Microsoft.Office.Server.Administration.SPAclFormatter.Serialize[TRights](XmlWriter xmlWriter, SPAcl`1 acl) at
Microsoft.Office.Server.Administration.SPAclFormatter.Serialize[TRights](SPAcl`1 acl) at
Microsoft.Office.Server.Administration.UserProfileApplication.get_SerializedAdministratorAcl() at
Microsoft.Office.Server.Administration.UserProfileApplication.GetProperties() at
Microsoft.Office.Server.UserProfiles.ProfilePropertyService.GetProfileProperties() 4507cc64-e9bd-43d3-9191-5e10b9509083

"Method not found" implies that the service calls and web services were not matching and they had different method signature. What on earth was happening? We recalled that a SharePoint security patch KB2597124 was installed last night (described in Microsoft Security Bulletin). Would it be the culprit? Not likely at the first glance since we a few developers had already installed that patch and didn't find any issue on it.

We traced the SharePoint update history and found the security patch had installed successfully. The only difference between the development and production machines was that the production servers didn't have the SharePoint Oct. 2011 cumulative update installed, but the development machines did:
Production environment: SharePoint2010_SP1(14.0.4763.1000) + KB_2597124_Patch
Dev environment: SharePoint2010_SP1(14.0.4763.1000) + Oct_2011_CU(14.0.6112.5000) + KB_2597124_Patch
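
As an aside, a farm's current build number can be read from the SharePoint 2010 Management Shell, which is an easy way to compare environments:

# Prints e.g. 14.0.4763.1000 (SP1) or 14.0.6112.5000 (Oct. 2011 CU)
(Get-SPFarm).BuildVersion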

Could that difference have caused the problem? We read through the patch instructions, and they clearly stated the patch can be applied to a SharePoint 2010 or SharePoint 2010 SP1 server, without any other prerequisite information.

Anyway, we did some tests. We installed the KB2597124 patch on a SharePoint 2010 SP1 test machine (14.0.4763.1000, without any CU installed). Bam, we then saw all those issues on the test machine!

The conclusion was clear: the security patch has a dependency on a previous cumulative update (the Oct. 2011 CU or the latest Dec. 2011 CU). But this dependency information is not included in the patch's documentation, and the security patch doesn't do any prerequisite check.

Those "Method not found" errors now are understandable. Those CU updates modify some user profile web services, and the security patch makes some changes on top of those updated web services. I guess Microsoft found the security vulnerability. They thought it's critical so they teamed up some developers to work on it quickly. What those developers did was grab the latest code including the CU build, fix the issue, test it and make it public. But should Microsoft be a little bit more responsible for that? At least do some testing for different environments before the release right?

The security patch cannot be uninstalled. The only option for us was to apply the CU after the security patch, which is not the right order. We installed the Oct. 2011 CU on that test machine with the security patch installed. The CU install was smooth, but it failed in the SharePoint configuration wizard after the machine rebooted as prompted:

A few errors can be found in the log file:

03/08/2012 16:08:49 14 ERR The exclusive inplace upgrader timer job failed.
03/08/2012 16:08:49 14 ERR Task upgrade has failed with a PostSetupConfigurationTaskException An exception of type Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException was thrown. Additional exception information: Failed to upgrade SharePoint Products.
03/08/2012 16:08:49 14 ERR An exception of type Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException was thrown. Additional exception information: Failed to upgrade SharePoint Products.
Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException: Exception of type 'Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException' was thrown.
at Microsoft.SharePoint.PostSetupConfiguration.UpgradeTask.Run()
at Microsoft.SharePoint.PostSetupConfiguration.TaskThread.ExecuteTask()
03/08/2012 16:08:49 14 ERR Task upgrade has failed
03/08/2012 16:08:49 1 ERR Task upgrade has stopped and failed. Total failed is now 1
03/08/2012 16:08:49 8 ERR Task upgrade SharePoint Products failed, so stopping execution of the engine
03/08/2012 16:08:49 8 ERR Failed to upgrade SharePoint Products.
An exception of type Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException was thrown. Additional exception information: Failed to upgrade SharePoint Products.
Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException: Exception of type 'Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException' was thrown.
at Microsoft.SharePoint.PostSetupConfiguration.UpgradeTask.Run()
at Microsoft.SharePoint.PostSetupConfiguration.TaskThread.ExecuteTask()
03/08/2012 16:08:49 8 ERR One or more configuration tasks has failed or some tasks were not run
03/08/2012 16:08:49 8 ERR Configuration of SharePoint Products failed. Configuration must be performed in order for this product to operate properly. To diagnose the problem, review the extended error information located at C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS\PSCDiagnostics_3_8_2012_15_42_40_120_1586599544.log, fix the problem, and run this configuration wizard again.

We could see there were a few configuration task errors during the configuration wizard, which may be because "Configuration must be performed in order for this product to operate properly" (message from the log error). Fortunately, we found the site was back to normal after the CU installation: the user profile service was up, and mysites and navigation were all working properly. We did not reapply the security patch; we just didn't have enough time to test all that.

Lesson learned: make sure the test machine has the exact same configuration as production, fully test before applying any SharePoint patch, and don't take Microsoft for granted.

Friday, February 17, 2012

Replacing SharePoint 2010 Content Editor WebPart Links

This console app replaces hard-coded URLs inside Content Editor WebParts. In the previous post we showed how to extend the https://productionServer application to https://stagingServer, but some URLs inside the content editor web parts on https://stagingServer still point to https://productionServer. This console app replaces them with the desired staging URLs.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml;
using System.Web.UI.WebControls.WebParts;
using Microsoft.SharePoint;
using Microsoft.SharePoint.WebPartPages;
using Microsoft.SharePoint.Administration;

namespace ContentEditorWPLinkUpdate
{
    /// <summary>
    /// This console application replaces hard-coded links inside ContentEditorWebParts.
    /// e.g.
    /// <a href="https://productionServer/news/abcd">abcd news</a>
    /// will be replaced by:
    /// <a href="/news/abcd">abcd news</a>
    /// Run the executable directly or run it with optional parameters:
    /// Cmd:\>ContentEditorWPLinkUpdate [originalLink] [newLink]
    /// </summary>
    class Program
    {
        readonly static string webApplication = "https://productionServer/";
        static string oldUrl = "https://productionServer/";
        static string newUrl = "/";
        static string hrefLinkToBeReplaced = string.Format("href=\"{0}", oldUrl);
        static string srcLinkToBeReplaced = string.Format("src=\"{0}", oldUrl);

        static void Main(string[] args)
        {
            if (args.Length > 0)
            {
                oldUrl = args[0].Trim().TrimEnd(new char[] { '/' }) + "/";
                hrefLinkToBeReplaced = string.Format("href=\"{0}", oldUrl);
                srcLinkToBeReplaced = string.Format("src=\"{0}", oldUrl);
                if (args.Length > 1)
                    newUrl = args[1].Trim().TrimEnd(new char[] { '/' }) + "/";
            }

            Console.WriteLine("Replacing URL {0} by {1} for content editor WebParts...", oldUrl, newUrl);

            SPWebApplication webApp = SPWebApplication.Lookup(new Uri(webApplication));

            foreach (SPSite siteCollection in webApp.Sites)
            {
                Console.WriteLine("Updating Site collection: " + siteCollection.Url);

                foreach (SPWeb web in siteCollection.AllWebs)
                {
                    UpdateWebRootFolderFiles(web);
                    UpdatePagesFiles(web);
                    web.Close();
                }

                siteCollection.Close();
            }

            Console.WriteLine("Content Editor WebPart links update completed!");
        }

        /// <summary>
        /// Update URL in content editor WebPart
        /// </summary>
        static void UpdateContentEditorWebPart(SPLimitedWebPartManager wpManager, ContentEditorWebPart contentEditor)
        {
            string origContent = contentEditor.Content.InnerText;
            if (origContent.Contains(hrefLinkToBeReplaced) || origContent.Contains(srcLinkToBeReplaced))
            {
                XmlDocument xmlDoc = new XmlDocument();
                XmlElement xmlElement = xmlDoc.CreateElement("HtmlContent");
                xmlElement.InnerText = origContent.Replace(hrefLinkToBeReplaced, "href=\"" + newUrl).Replace(srcLinkToBeReplaced, "src=\"" + newUrl);
                contentEditor.Content = xmlElement;
                wpManager.SaveChanges(contentEditor);
            }
        }

        /// <summary>
        /// Update files in sites' root folder
        /// </summary>
        static void UpdateWebRootFolderFiles(SPWeb web)
        {
            using (web)
            {
                // Loop through all files in root folder
                if (web.RootFolder.Files != null)
                {
                    foreach (SPFile webRootFile in web.RootFolder.Files)
                    {
                        try
                        {
                            if (webRootFile.Url.ToLower().EndsWith(".aspx"))
                            {
                                using (SPLimitedWebPartManager wpManager = webRootFile.GetLimitedWebPartManager(PersonalizationScope.Shared))
                                {
                                    foreach (System.Web.UI.WebControls.WebParts.WebPart webPart in wpManager.WebParts)
                                    {
                                        if (webPart != null && webPart.GetType().ToString().Contains("ContentEditorWebPart"))
                                        {
                                            ContentEditorWebPart contentEditor = (ContentEditorWebPart)webPart;
                                            UpdateContentEditorWebPart(wpManager, contentEditor);
                                        }
                                    }
                                }
                            }
                        }
                        catch (Exception ex)
                        {
                            Console.WriteLine(string.Format("Error occurs when updating {0} : {1}", webRootFile.ServerRelativeUrl, ex.Message));
                        }
                    }
                }
            }
        }

        /// <summary>
        /// Update Publishing pages
        /// </summary>
        static void UpdatePagesFiles(SPWeb web)
        {
            SPList list = web.Lists.TryGetList("Pages");
            if (list == null)
                return;

            SPFile page = null;
            foreach (SPListItem listItem in list.Items)
            {
                if (!listItem.Url.ToLower().EndsWith(".aspx"))
                    continue;
                try
                {
                    page = web.GetFile(web.Url + "/" + listItem.Url);
                    bool needToUpdate = false;
                    using (SPLimitedWebPartManager wpManager = page.GetLimitedWebPartManager(PersonalizationScope.Shared))
                    {
                        foreach (System.Web.UI.WebControls.WebParts.WebPart webPart in wpManager.WebParts)
                        {
                            if (webPart.GetType().ToString().Contains("ContentEditorWebPart"))
                            {
                                ContentEditorWebPart contentEditor = (ContentEditorWebPart)webPart;
                                string content = contentEditor.Content.InnerText;
                                if (content.Contains(hrefLinkToBeReplaced) || content.Contains(srcLinkToBeReplaced))
                                {
                                    needToUpdate = true;
                                }
                            }
                        }
                    }
                    if (needToUpdate)
                    {
                        if (page.CheckOutType != SPFile.SPCheckOutType.None)
                            page.UndoCheckOut();
                        page.CheckOut();

                        string webPartTitles = string.Empty;
                        using (SPLimitedWebPartManager wpManager = page.GetLimitedWebPartManager(PersonalizationScope.Shared))
                        {
                            foreach (System.Web.UI.WebControls.WebParts.WebPart webPart in wpManager.WebParts)
                            {
                                if (webPart != null && webPart.GetType().ToString().Contains("ContentEditorWebPart"))
                                {
                                    ContentEditorWebPart contentEditor = (ContentEditorWebPart)webPart;
                                    UpdateContentEditorWebPart(wpManager, contentEditor);
                                }
                            }
                        }

                        page.CheckIn("Replace hard-coded URL for content editor web part:" + webPartTitles);
                        page.Publish("Replace hard-coded URL for content editor web part:" + webPartTitles);
                    }
                }
                catch (Exception ex)
                {
                    Console.WriteLine(string.Format("Error occurs when updating {0} : {1}", page.ServerRelativeUrl, ex.Message));
                }
            }
        }
    }
}

Wednesday, February 15, 2012

SharePoint 2010 Solution Package Redeployment After Site Extended

Originally we had identical SharePoint 2010 setups in the production and staging environments, and both used the same URLs. In order to visit the staging environment, we needed to change a machine's hosts file to point to the staging servers. However, some business people who needed to test the staging server simply didn't have permission to change their hosts file on their own. So we decided to separate the production and staging URLs: one is production.company.com, the other is staging.company.com.

When the new DNS entries for the staging servers were ready, we started setting up SharePoint Alternate Access Mapping (AAM) for them. One requirement was that all access must be secure, so we needed to extend the original web application. The following are the configuration steps:

1. Open Central Admin, go to Application Management, select the web application and click the Extend button on the ribbon

2. In the popup window, enter the following:
  • a. Port: 443
  • b. Host Header: staging.company.com
  • c. Zone: Intranet
  • d. Other as default

3. Go to Application Management -> Configure alternate access mappings -> Edit Public Zone URLs, select the web application and change http://staging.company.com:443 to https://staging.company.com

4. For each front-end server, open IIS manager and locate the newly extended site. Write down the site ID (something like 34561234), then open a command prompt as administrator:

C:\inetpub\AdminScripts>cscript adsutil.vbs set /w3svc/34561234/SecureBindings ":443:staging.company.com"

For more about SSL certificate setup refer to this and this.

5. Go back to IIS manager, select the extended site, and click Edit Bindings. Delete the http binding without the SSL certificate, keep the https one, and then start the IIS site (it's stopped by default)

6. Optionally copy the original web.config to the extended IIS site if your solution packages are not taking care of all web.config changes.

7. Optionally set the hosts file so that the new DNS entries point to the server itself, if you have a multi-front-end-server farm and NTLM authentication is used. Service calls inside a WebPart or custom code could possibly route to another front-end server, which could lead to the double-hop authentication issue (refer to this). The hosts file setup ensures those service calls always go to the local machine.

The story didn't stop there. We noticed quite a few custom WebParts and applications were not working properly after the Web Application was extended. Eventually we found that SharePoint redeploys farm solution packages that have content (such as a dll) with a deployment target of WebApplication. The redeployment sequence is random, which can lead to problems: in our case, the dll included in one solution package overrode the newer dll in another solution package. So we redeployed all solution packages in the proper order, and the issues were gone.
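
For reference, redeploying a farm solution from the 2010 Management Shell looks roughly like this (package names and the URL are placeholders):

# Push an updated package to every web application it's already deployed to
Update-SPSolution -Identity "MyWebParts.wsp" -LiteralPath "C:\Packages\MyWebParts.wsp" -GACDeployment

# Or deploy a package to the (extended) web application explicitly, in the order you need
Install-SPSolution -Identity "MyWebParts.wsp" -WebApplication "https://staging.company.com" -GACDeployment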

Wednesday, January 18, 2012

Display Published Date in SharePoint Search Result

In SharePoint search results, the last modified date of an article (page) is displayed after the page link by default:



How do you change this default behavior? Let's say we want to display the date from a page's column (field) named "Published Date". You can do it without any coding! The following are the configuration steps:

1. Verify the property's internal name. Go to the library settings page, click the "Published Date" column, and look at the "Field" query string inside the URL; that's the column's internal name. Here it's "Published_x0020_On" (URL-encoded as "Published%5Fx0020%5FOn"):



2. Make that "PublishDate" property enable for search result web part to display:
2.1. Open SharePoint Central Admin -> Application Management -> Manage service applications -> Search Service Application
2.2. Click "Metadate Properties" under "Queries and Results" category on the left panel
2.3. Click "New Managed Property" on the tool bar
2.4. Type "PublishDate" as property name, add "ows_Published_x0020_On" to the mapping, check "Allow this property to be used in scopes" and "Add managed property to custom results set retrieved on each query"



3. Re-crawl the whole content source: click "Content Sources" under the Crawling category, select the target content source and click "Start Full Crawl". Wait for the crawl to complete.

4. Update the Search Core Results WebPart on the search results page to display the newly added property. Because not all content has such a column (field), we conditionally display this field if it exists and otherwise show the original last modified date.



4.1. Edit the "Search Core Resul" WebPart on the search page, under "Display Properties" category, uncheck "Use Location Visualization".
4.2. Copy the value of "Fetched Proerpties" and paste it to a notepad, add <column name="PublishDate"> in front of <column name="Write"> then paste back the whole string (one line) as "Fetched Proerpties" value, something like:
<root xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><Columns><Column Name="WorkId"/>...
<Column Name="PublishDate"/><Column Name="Write"/>...
4.3. Click "XML Editor", search for:
<xsl:call-template name="DisplayString">
<xsl:with-param name="str" select="write"></xsl:with-param>
</xsl:call-template>
Replace it with:
<xsl:choose>
<xsl:when test="string-length(publishdate) &gt; 0">
<xsl:call-template name="DisplayString">
<xsl:with-param name="str" select="publishdate"></xsl:with-param>
</xsl:call-template>
</xsl:when>
<xsl:otherwise>
<xsl:call-template name="DisplayString">
<xsl:with-param name="str" select="write"></xsl:with-param>
</xsl:call-template>
</xsl:otherwise>
</xsl:choose>
Now the "Published Date" is shown for those pages with this field defined, and last modified date displays for those pages without this field.

Note that "publishdate" in the XSLT condition is all lower case. I defined it as "PublishDate" in Search property, and I assumed it remains the same in search return. But that's not the case. Search application always returns a xml with all lower case property name. This can be verified by showing the original search result. In order to see the oringal search result, simply use the following xml for the Search Core Result WebPart:
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
<xsl:template match="/">
<xmp><xsl:copy-of select="*"/></xmp>
</xsl:template>
</xsl:stylesheet>
A screenshot of the raw search result:

Friday, December 23, 2011

SharePoint 2010 Migration

After a few months of preparation and testing, we finally upgraded our production portal from MOSS 2007 to SharePoint 2010 about two weeks ago. No major issues have been reported or found so far, and the overall feedback from end users is positive.

We used the database-attach approach in our migration, i.e. install a new SharePoint 2010 farm and mount the old MOSS 2007 databases. The migration process is straightforward: run stsadm -o preupgradecheck on the old farm and fix the reported issues, run Test-SPContentDatabase and fix those issues, and finally run Mount-SPContentDatabase. Fixing the issues wasn't hard, just a matter of time.
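For reference, a minimal sketch of the test-and-mount steps on the 2010 farm; the database name, server name and URL are placeholders for our environment:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
## Report upgrade blockers (missing features, orphaned objects, etc.) without changing anything
Test-SPContentDatabase -Name "WSS_Content_Portal" -WebApplication "https://my_company_portal"
## Attach and upgrade the database once the report is clean
Mount-SPContentDatabase -Name "WSS_Content_Portal" -DatabaseServer "SQLSERVER" -WebApplication "https://my_company_portal"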

We had spent quite a lot of effort on content analysis and cleanup of our 60+ GB of content databases. In the end, all our custom WebParts, dlls, user controls and some other SharePoint resources were repackaged into one solution called "SP2010Migration" that builds a single solution package for everything.



The SP2010Migration solution package was supposed to be deployed only once and that's all; it shouldn't be used again unless we build a new farm from scratch. There's no Feature in the solution package because that could bring trouble in later deployments: this way we can still package any WebPart or component into a Feature in the future without worrying about Feature conflicts.

One interesting thing is that all user controls loaded by SmartParts still worked fine after migration. But we had some issues with the DataFormWebPart. For example, relative links were broken after migration, and we had to write a console app to replace every "{@FileRef}" with "/{@FileRef}" inside each DFWP's XSLT across the whole farm.
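Our fix was a console app, but the same idea can be sketched in PowerShell for a single page. The site and page URLs below are placeholders, and the real job looped over every page in the farm:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$web = Get-SPWeb "https://my_company_portal"
$file = $web.GetFile("Pages/default.aspx")
$manager = $file.GetLimitedWebPartManager([System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
foreach ($webPart in $manager.WebParts) {
    if ($webPart -is [Microsoft.SharePoint.WebPartPages.DataFormWebPart] -and $webPart.Xsl) {
        ## Naive string replace; run it once only or it will double the prefix
        $webPart.Xsl = $webPart.Xsl.Replace("{@FileRef}", "/{@FileRef}")
        $manager.SaveChanges($webPart)
    }
}
$web.Dispose()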

Another DFWP issue was more confusing, where all we saw was an error on the DFWP page:

Unable to display this Web Part. To troubleshoot the problem, open this Web page in a Microsoft SharePoint Foundation-compatible HTML editor such as Microsoft SharePoint Designer. If the problem persists, contact your Web server administrator. Correlation ID:…

By Correlation ID we could easily find out the error detail from ULS log:

Error while executing web part: System.StackOverflowException: Operation caused a stack overflow. At Microsoft.Xslt.NativeMethod.CheckForSufficientStack() at SyncToNavigator(XPathNavigator , XPathNavigator ) at …

I tested the same settings locally and everything seemed fine. How could an out-of-the-box SharePoint WebPart hit a "stack overflow" error in production when it worked fine before migration? It turned out that a time-out was occurring internally during the DFWP's XSL transform in the production environment. The time-out threshold is only 1 second, which means any transform taking longer than 1 second triggers the error. We have a big list on our production server, and the DFWP displays hundreds of rows in a big table, which caused the time-out.

That's something new in SharePoint 2010, and also something annoying from Microsoft. It's great to introduce new features, but it's also important to keep old things working, right? Why not turn the new time-out off by default and give end users an option to set it?

The worst thing is that there was no way to change that 1-second time-out setting! Microsoft provided "three solutions" for this issue:

1. Tune the XSL to shorten the transform time.
2. Don't use the DFWP; use another WebPart instead.
3. Write code that inherits from DFWP and build your own.

Following that guidance, we finally got the DFWP back after tweaking its settings, e.g. fewer columns and a smaller page size. That's of course not an ideal way to solve the problem; we hope Microsoft can provide a better solution for this issue.

[2012-3 Update]: The time-out value of a DFWP now is configurable in Farm level with latest SharePoint CU. Refer to this.

Wednesday, December 07, 2011

A SharePoint Double Hop Issue

A SharePoint DataForm Web Part was sometimes not working properly after migrating from SharePoint 2007 to a SharePoint 2010 environment. The original SharePoint 2007 farm had only one front-end server, while the new SharePoint 2010 farm includes two front-end servers and one application server. NTLM authentication is used in both the SharePoint 2007 and 2010 environments.

The DataForm web part works fine in SharePoint Designer, and it invokes the SharePoint Profile Service to retrieve some user profile data.

The ULS log shows (401) Unauthorized error:


w3wp.exe (0x1150) Error while executing web part: System.Net.WebException: The remote server returned an error: (401) Unauthorized. at System.Net.HttpWebRequest.GetResponse() at ....


Apparently that service call was routed to the other front-end server and then hit the access error. We verified that the SharePoint Web Services on both front-end servers do have anonymous access enabled. So why did the access error still happen?

Since the user has already authenticated to the site, the service call inside the DataForm webpart automatically impersonates the original user instead of going out as an anonymous user, and that impersonated call fails on the other front-end server because of the NTLM setup in our environment. This is a typical NTLM double-hop issue.

Why doesn't the service call end at the local machine? Well, it does sometimes, and that's why it works sometimes: the behavior is caused by the round-robin DNS setup. To resolve the problem, simply add entries to each front-end server's hosts file pointing the domain name(s) at the local server. Such service calls will then always stay on the local machine and the double-hop issue is gone.
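For example, a one-line sketch that pins the portal host name to the local machine; run it in an elevated shell on each front-end server, and note that "my_company_portal" is a placeholder for your load-balanced host name:

## Append a hosts entry so calls to the portal name stay on this server
Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "127.0.0.1`tmy_company_portal"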

Wednesday, November 30, 2011

Using Powershell to Update SharePoint 2010 User Profile

In a previous post, a small piece of code demoed a simple way to update the user profile picture via the object model. The same task can also be done easily with Powershell:
$userAccount = "SP2010\test"
$newPictureURL = "http://SP2010Site/Photos/test.jpg"
$site = Get-SPSite("http://SP2010Site")
$context = Get-SPServiceContext $site
$profileManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
$userProfile = $profileManager.GetUserProfile($userAccount)
$userProfile["PictureURL"].Value = $newPictureURL
$userProfile.Commit()
Actually there are some improvements in SharePoint 2010 user photo management. Instead of the single picture used everywhere in SharePoint 2007, SharePoint 2010 maintains three different profile pictures for different contexts. Three images are created when a user uploads a picture, stored in a library folder under MySite's root site: http://mysite/User Photos/Profile Pictures. You can find more detail on SharePoint 2010 photo management in this MSDN blog.

What about a SharePoint 2010 environment upgraded from SharePoint 2007? All the user pictures still point to the old location. However, you can use the following Powershell command to migrate all those pictures:
Update-SPProfilePhotoStore -MySiteHostLocation http://SPMySite/
What it does is rebuild all user pictures from the previous version. For example, a user with the AD account SP2010\test has the picture location http://sp2010site/Profile Pictures/test.jpg. Three images are created after the above Powershell command runs:
1. http://mysite/User Photos/Profile Pictures/SP2010_test_LThumb.jpg
-- size: 148x148 pixels
-- example of usage: MySite title image
2. http://mysite/User Photos/Profile Pictures/SP2010_test_MThumb.jpg
-- size: 96x96 pixels
-- example of usage: people search result image
3. http://mysite/User Photos/Profile Pictures/SP2010_test_SThumb.jpg
-- size: 36x36 pixels
-- example of usage: MySite colleague photo

One common scenario is updating test accounts' email addresses so testers/developers can receive email notifications in the test environment. The email address SharePoint uses to send notifications does not come directly from the user profile; it comes from the user's record in the User Information List of the top site collection (which the user profile service can override). To change a user's email address, you have to modify the item in the user information list for the given site:
$web = Get-SPWeb http://sp2010site
$userInfoList = $web.Lists | where { $_.Title -eq "User Information List" }
$userAccount = $userInfoList.Items | where { $_["Account"] -eq "sp2010\test" }
$userAccount["Work e-mail"] = "whatevername@whatever.com"
$userAccount.Update()
Be aware that this modification could be overridden by the user profile service's scheduled synchronization if you have the user profile service enabled.

This is not related to user profiles, but it's also a common issue in a migrated SharePoint environment. A SharePoint 2010 farm migrated from SharePoint 2007 without a visual upgrade keeps the old UI, but you will notice that a new site created after migration (root site or subsite) always uses the new SharePoint 2010 UI. If you just want to keep the old UI and have no short-term plan for a UI upgrade, how do you revert the newly created site to the previous version? Just a few lines of script:
$web = Get-SPWeb http://sp2010site/newsite
$web.UIVersion = 3
$web.MasterUrl = "/newsite/_catalogs/masterpage/homepagev3.master"
$web.AlternateCssUrl = "/newsite/style library/customlayout.css"
$web.Update()

For more hands-on Powershell scripts for SharePoint environments, refer to this CodePlex project compiled by a few SharePoint gurus.