Monthly Archives: July 2010

RCWP Part 1 – SPRetreat and the Related Content Web Part

I’m sat on the train after a great day of SPRetreat (followed by SharePint of course!), superbly organised by Andrew Woodward (21Apps) and Ben Robb (CScape). It was a really good day of innovative ideas, problem solving, chewing the fat (and the occasional dirty joke… you know who you are!).

The core challenge thrown down for the day involved trying to provide cross-site (collection) “related information”, effectively a “cross-pollination” function, using SharePoint 2010. There were some great ideas and a lot of top effort involving the Managed Metadata Service, custom Search API work and some cracking Scrum / Agile design processes.

We had 5 sessions of 1 hour each, and my efforts for the day mostly revolved around delivering a “Related Content Web Part”, which could use Search to show other content from any data source which is in some way related to the information on the current page.

In this post I’m going to walk through how my efforts of the day culminated in the “Related Content Web Part”, which ended up being a generic “Dynamic Field Driven Search” web part. It displays a set of search results based on the value of a field stored in the publishing page (i.e. content type) that contains the web part, and it derives from the Search CoreResultsWebPart.

In the final sessions we thought about how to improve this, and ended up building a custom contextual ribbon interface to surface SP Modal Dialogs, allowing easy updating of core web part properties.

Core Functionality

The functionality is really split into 3 major sections, one per part of this series:

(Full source code for the solution will be published in Part 3)

RCWP Part 1 – Extending the Core Results Web Part

This is one of the nicest new “features” of SharePoint 2010. They have stopped sealing all of the Web Parts (and Web Part Connections) that are used for the results pages in SharePoint Search solutions. This means it is now much easier for you to extend and add value to these web parts without having to throw out all of the OOB functionality.

The reason for using the CoreResultsWebPart was simple:

  • Using search is fast, efficient, cross farm and highly configurable.
  • The core results web part uses XSLT for rendering, so it is easy to design the output.
  • Leveraging an OOB web part means we get loads of added functionality for free! (like specifying which scope we want to use).

In this solution, we extended the CoreResultsWebPart to create our own Web Part. Simply create a new Visual Studio 2010 “Web Part” item and set it to inherit from “CoreResultsWebPart” (you will need to add a reference to Microsoft.Office.Server.Search.dll).

[ToolboxItemAttribute(false)]
public class RelatedContentWebPart : CoreResultsWebPart
{
}

In terms of functionality we need to do 1 thing:

Override the Query programmatically, based on the metadata of the current page.

This involved a few steps. First off, we need to identify which field we want to be “targeting”. For this we created a simple string field, exposed as a Web Part Property.

private string fieldName = "Title";

/// <summary>
/// The field on the current page we want to use for filtering
/// </summary>
[WebBrowsable(true)]
[Personalizable(PersonalizationScope.Shared)]
[WebDisplayName("Field Name")]
[SPWebCategoryName("Filter SPR")]
[WebPartStorage(Storage.Shared)]
[Description("Which field on the current page do you want to use for filtering?")]
public string FieldName
{
    get { return fieldName; }
    set { fieldName = value; }
}

Then we needed to override the query itself. This is simply done by setting the “FixedQuery” property of the CoreResultsWebPart. The trick is that you need to set this in the “OnInit” event; if you try to place it in OnLoad, CreateChildControls or any later method then it won’t have any effect!

protected override void OnInit(EventArgs e)
{
    this.FixedQuery = "keyword string value";
    base.OnInit(e);
}

Finally, we need to make sure we are pulling out the field value of the current list item, based on our custom field. For this we used SPContext combined with some defensive programming to make sure we don’t get any NullReferenceException errors. So change the “OnInit” method to the following:

protected override void OnInit(EventArgs e)
{
    // defensive: the context or list item may be null
    SPListItem item = (SPContext.Current != null)
        ? SPContext.Current.ListItem : null;

    if (item != null && !string.IsNullOrEmpty(fieldName)
        && item.Fields.ContainsFieldWithStaticName(fieldName)
        && item[fieldName] != null)
    {
        this.FixedQuery = item[fieldName].ToString();
    }
    base.OnInit(e);
}

After that … Build / Deploy and the web part was working!

The code could obviously be refactored a little bit, but on the whole it’s all working 🙂
Make sure you check back for:

  • RCWP Part 2 – Web Part with Ribbon Contextual-Tab
  • RCWP Part 3 – Edit Web Part using a Ribbon modal dialog
  • (Full source code for the solution will be published in Part 3)

    SP 2010 Websites and JavaScript “on demand” bug

    We are currently working on building the new Content and Code website on SharePoint 2010. This is a “brochure-ware” anonymous publishing site, and as such it is fairly lightweight (with most of the normal “heavy” JS files for things like the SP Ribbon, dialogs framework and the like not required).


    You can imagine my surprise when our landing page was showing up as over 650KB in size! On further inspection, the vast majority of this was OOB JavaScript files (including the Ribbon, dialog and core JS files!).


    I thought we’d made a mistake so I checked an OOB Publishing Portal and even the public “sharepoint.microsoft.com” website (both had the same problem) so it appears this is a more fundamental problem! Importantly, ALL of these tests are as an anonymous user!


    Lazy Load vs On Demand

    The first thing to mention is that these files are on “lazy load” (i.e. they load AFTER the UI has been rendered). So end users don’t really notice .. the page is displayed fairly quickly, and the remaining JS files download in the background.


    But let’s be clear, this is NOT “on demand” JS. Files are clearly being loaded that should NEVER be required by anonymous users (like the ribbon??)

    “On Demand” script loading was one of the major infrastructure improvements to promote fast and efficient web-based systems, and for custom development it works a charm (my own visual web part with on-demand JS worked perfectly .. the custom JS file didn’t get downloaded until I called a specific JS function).
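    For completeness, hooking your own script into the on-demand framework is straightforward (the script name below is just an illustrative placeholder):

    <SharePoint:ScriptLink runat="server" Name="MyCustom.js"
        OnDemand="true" Localizable="false" />

    The file is then only downloaded when something calls it through the Script-On-Demand API, e.g. SP.SOD.executeFunc("MyCustom.js", "MyNamespace.init", myCallback);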

    Caching


    The second thing to mention is that it appears these files are only downloaded once. Repeated page requests (or even “Ctrl-F5” full reloads) do not bring back these files unless you do a full clear of your browser cache (or try a different browser), but it does mean that the “first hit” page size is rather large to say the least.


    So how many files are we talking about?


    I decided to test this on a public facing SharePoint 2010 Website, and the easiest one that sprang to mind was https://sharepoint.microsoft.com (a newly launched Microsoft site powered by SharePoint 2010 specifically for the launch).


    I was using Fiddler for the session information (as I could use it for multiple browsers, and knew it was independent of the actual page-source).

    I found that, on the first page hit (subsequent pages do not load them) the following JavaScript references were being downloaded (size in Bytes shown in brackets):

    • core.js (237,096)
    • sp.core.js (12,349)
    • cui.js (351,361)
    • sp.ui.dialog.js (34,243)
    • sp.runtime.js (68,784)
    • sp.js (389,372)
    • inplview.js (38,836)

    You can see the actual page statistics below, where the 9 JavaScript references represent 1,362,477 Bytes of data! (that’s over 1MB of JavaScript!)


      Is there a solution?


    Well, I haven’t looked too deeply into a solution yet (still peeling back layers looking for what is causing the “on demand” to load on first hit).

    One solution is not to use the <SharePoint:ScriptLink> control, and manually refer to whichever JS files you need in your Master Page (using Edit Mode Panels and Security Trimmed Controls to make sure the appropriate JS is available for editing experience and the like).


    This however is a complicated approach and requires quite a deep understanding of the JS files, what they do and when they are required (not for the faint hearted).
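    For the brave, here is a rough sketch of that approach. The control names are real, but working out which JS files you actually need is exactly the hard part, so treat the file list below as illustrative only:

    <!-- lightweight scripts for everyone, loaded on demand -->
    <SharePoint:ScriptLink runat="server" Name="init.js"
        OnDemand="true" Localizable="false" />

    <!-- heavy editing scripts only for users who can customise pages -->
    <SharePoint:SPSecurityTrimmedControl runat="server"
        PermissionsString="AddAndCustomizePages">
        <SharePoint:ScriptLink runat="server" Name="core.js"
            OnDemand="false" Localizable="false" />
    </SharePoint:SPSecurityTrimmedControl>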


    I would certainly be very keen to hear about your run-ins with this particular issue. Do you experience this issue on your environments? Have you managed to get around it yet?


    Meanwhile I’ll continue my on-going love-hate relationship with SharePoint and try to work out how to stop my website having a 1MB page payload.

    Content and Code wins Microsoft Partner of the Year Awards

    It’s that time of year again, and the Microsoft WPC (Worldwide Partner Conference) is almost upon us (Washington D.C., USA, July 11th-15th).

    But before the conference kicks off Microsoft has announced the award finalists and winners, and Content and Code has won two categories:

    • Microsoft Worldwide Partner of the Year, Information Worker Solutions, Enterprise Content Management (link)
    • Microsoft Country Partner of the Year, UK (link)

    I am incredibly proud of Content and Code for winning these awards, not least because the award was attained through the RNIB project which I worked on (along with a fantastic team at both Content and Code and the RNIB!). This is a great achievement, and I’m both proud and privileged to have worked on that project, and for Content and Code themselves!

    To top this off, Content and Code won Microsoft Worldwide Partner of the Year, Information Worker Solutions, Portals and Collaboration for 2009, so this now makes 2 years running that Content and Code have won a Worldwide Partner of the Year recognition!

    The official announcement can be found on the Worldwide Partner Conference site.

    SharePoint 2010 Exams Announced

    A few days ago Microsoft announced several new SharePoint 2010 exams that will be coming soon to a testing centre near you…

    IT Pro Track

    Passing both exams in this track will earn you the new certification:

    MCITP: SharePoint Administrator 2010
    (Microsoft Certified IT Professional)

    Developer Track

    Passing both exams in this track will earn you the new certification:

    MCPD: SharePoint 2010
    (Microsoft Certified Professional Developer)

    Unfortunately there are no upgrade paths from SharePoint 2007 (which makes sense, as the structure has changed).

    Of course, the exclusive Microsoft Certified Master programme will continue to run, now including SharePoint 2010.

    How to use LINQ for SharePoint in Anonymous mode

    Disclaimer – This has not been extensively tested, so please use the sample below at your own risk!
    I am a really big fan of using LINQ to SharePoint. It cuts out all of the nasty CAML and you can do very complex queries with only a few lines of code. The one big let-down was not being able to use LINQ to SharePoint in Anonymous mode… or so I thought!
    I actually stumbled across a blog post from Joe Unfiltered on getting LINQ to SharePoint working anonymously, so all the credit goes to him for finding this little gem.
    Basically, the problem is with the SPServerDataConnection object. If you crack this baby open with Reflector then you’ll see it relies on SPContext.Current for its operations. As the objects it calls are off-limits for anonymous users it forces an authentication prompt!
    There is, however, a simple way around this (a technique very similar to one I’ve used before when running with elevated privileges back to the Central Admin SPSite).
    You basically do the following:

    1. Take backup copy of the HttpContext.Current object
    2. Set HttpContext.Current = null;
    3. Execute your LINQ to SharePoint query
    4. Restore HttpContext.Current to its original (backup) value

    When the SPServerDataConnection object is created, as the SPContext.Current is null it will go ahead and instantiate a new object using the URL you passed through in your DataContext object 🙂
    Important – Don’t forget to put the HttpContext.Current back again after you have finished your LINQ query. This is VERY important, otherwise things will very quickly start to break!
    Sample code is below. Do Enjoy! (and thanks to Joe!)


    // store a copy of our current Context's Web URL
    string strWebUrl = SPContext.Current.Web.Url;

    // check if we are running anonymously (i.e. nullUser = true)
    bool nullUser = (SPContext.Current != null &&
        SPContext.Current.Web.CurrentUser == null);

    // take a backup copy of the HttpContext.Current object
    // (declared outside the try block so the finally block can see it)
    HttpContext backupCtx = HttpContext.Current;

    try
    {
        if (nullUser)
        {
            // if running anonymously, set the current HttpContext to null
            HttpContext.Current = null;
        }

        // instantiate our data context (using the URL)
        MyCustomDataContext thisSite = new MyCustomDataContext(strWebUrl);

        // disable tracking to improve read-only performance
        thisSite.ObjectTrackingEnabled = false;

        // create a list of items from my custom list
        EntityList<Item> listItems = thisSite.GetList<Item>("MyCustomList");

        // query the list, only selecting items which meet the filter
        var currentItems = from listItem in listItems
                           where listItem.Title != ""
                           select listItem;

        // TODO: process your items
    }
    finally
    {
        if (nullUser)
        {
            // don't forget to put your HttpContext back again
            HttpContext.Current = backupCtx;
        }
    }

    Update – Following this post (and the subsequent discussion on Twitter) there are a lot of questions around performance (creating new SPSite / SPWeb objects for each query) and the validity of nulling the HttpContext.Current in the first place.

    My original statement stands .. use this at your own risk, I haven’t fully tested it! 🙂 If you do your performance benchmarking and you are happy with it then no problem!

    SharePoint 2010 ALM – Part 2 – Upgrading your Solutions, In-Place or Side-by-Side?

    So in Part 1 we talked about some of the upgrade mechanisms available in SharePoint 2010, including Assembly level and Feature level versioning.

    In Part 2 we are going to look at 2 distinct versioning strategies:

    • In Place Upgrade (replace old with new)
    • Side-by-Side Release (allow both versions to run at the same time)

    In-Place Upgrade

    This is going to be the most common scenario for bug fixing, hotfixes and minor improvements to the solution. So you might be adding some help files, tweaking a user interface, or perhaps fixing some JavaScript or CSS bugs in your code.

    Typically you would not use this approach for major versions of products, as a major change to interface or functionality is likely to confuse existing users, and might cause existing configurations to break! There are exceptions, however: your client might not want to manually swap out 500 web parts on their portal and might want a “big bang” .. provided it has been thoroughly tested!

    So what did we discuss in Part 1 that can be used in SharePoint 2010 to achieve an In-Place Upgrade?

    $SharePoint.Project.AssemblyFullName$

    This can allow you to easily update all of your ASCX / ASPX / CAML references with the new assembly.

    Next time your Page / Web Control / Feature receiver tries to execute then it will be running the new code, seamlessly.

    Assembly Binding Redirects

    This allows you to pick up any code that is attempting to execute in the old version, and forces it to run through the new assembly methods instead.

    This is a blanket, assembly-wide approach, typically deployed at the Web Application level (although you can go server-wide using the machine.config!)
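    For reference, the redirect that ends up in the Web Application’s web.config looks something like this (the assembly name and public key token below are placeholders):

    <runtime>
      <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
        <dependentAssembly>
          <assemblyIdentity name="MyCustomAssembly"
              publicKeyToken="0123456789abcdef" culture="neutral" />
          <bindingRedirect oldVersion="1.0.0.0" newVersion="2.0.0.0" />
        </dependentAssembly>
      </assemblyBinding>
    </runtime>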

    Warning – This obviously only works from code that runs through the web.config and IIS / ASP.Net. Any other code (such as Timer Jobs and Workflows) will be completely unaware of these redirects and will happily continue running the old code!

    Workflows especially should not use binding redirects this way. If you drop in a redirect for a workflow that is currently running then you will break it next time it tries to serialize itself to the database!

    Feature Versions

    The feature framework allows you to do incremental upgrade steps to modify existing features, upgrading and “replacing” the functionality for existing items (such as Content Types, site columns, custom actions … anything that you would provision through a Feature.xml).

    You might have to be careful with custom code (and the CustomUpgradeAction element) but all in all, Feature versions are designed to do an “in place upgrade” of your features.

    Side-by-Side Release

    This is likely to be the most common scenario for Major Versions, functionality changes, new templates. It is important to recognise that certain elements in the SharePoint schema MUST be upgraded in this way (such as Site Definitions, List Definitions and Workflows).  For this scenario “upgrade” is probably the wrong word to use, because what we really have to do is create a copy (and in some scenarios hide the old one!).

    So how would you go about creating a side-by-side upgrade? Well, the first thing to consider is that you need to leave your existing assembly in place.

    Solution Package (WSP) Considerations

    If you “retract” the solution package then typically it will remove your original DLLs. This will cause existing functionality to break (this is bad).

    One workaround is to create a new WSP. SharePoint will treat this as a separate solution, allowing you to install it while leaving the existing one in place.

    Assembly (DLL) and Package (WSP) Considerations

    The assembly needs careful consideration, depending on where your assembly is going to live;

    If you have a full trust package which deploys to the GAC, then you really don’t have too much of a problem. You can create your new assembly with a different version number and deploy that DLL to the GAC. The GAC has been designed so that the same assembly with multiple version numbers can happily sit side-by-side.

    If you have a minimal trust package which deploys to the web application BIN folder then you have more of a problem. Releasing the same assembly name is going to overwrite the file (deleting the original DLL, which will break existing functionality … also bad). Other than moving it to the GAC (which breaks the whole point of deploying for minimal trust in the first place!) there is only one practical workaround here: rename your assembly (so you have 2 different DLL file names).

    So now that you have dealt with your assembly, what about other assets?

    Workflows (WWF)

    Workflows are a nasty one. Once they are running, the class gets serialized into the database, so you HAVE to leave existing code on the system until all instances of that workflow have stopped running! (you can set a workflow to “no more instances” from the remove screen). The problem is that in a typical environment you are probably going to have NO IDEA how many workflow instances are running (and even if you did, you probably won’t be able, or willing, to stop them!).

    The only real option is to provide a duplicate workflow. Copy the workflow in Visual Studio, make your changes, and provision it into SharePoint as a DIFFERENT workflow.

    Again, you would hide the original (so people can’t add them to new lists) and you would probably want to perform some kind of exercise to remove any existing workflows (once all the running instances have stopped) and setup the new workflow as a new instance on the list / content type.

    As you can imagine, this is NOT going to be an easy task and it’s highly likely that this will be a manual governance effort, and not something that can be easily automated.

    (note – even if you are leaving the existing workflow class in place you still can’t change the assembly version … the serialization will fail and existing workflow instances will break!)

    Site Definitions (onet.xml)

    Microsoft very clearly states that modifying a Site Definition while it is in use is not supported.

    So how DO you “upgrade” them? Well, you have 2 options:

    Feature Stapling

    The easy solution is to use a feature stapler to add new functionality to an existing site.

    This allows you to modify NEW sites that are created, although you won’t be able to impact on everything (features activate before lists and pages are created, so you won’t be able to modify those!)

    This is discussed in detail on MSDN: https://msdn.microsoft.com/en-us/library/bb861862(office.12).aspx

    Copy and Hide the Original

    This is pretty straightforward. You copy your Site Template folder and create a new version.

    You then modify your WEBTEMP*.xml file to add in your new template (using the same name / description as the original).

    You then set the original template in the WEBTEMP*.xml as Hidden=TRUE.

    Deploying this, your users still “see” exactly the same when they create new sites, but when they do create the site it is now provisioning using your new template!
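    As an illustrative sketch (the template names, IDs and titles below are all made up), the WEBTEMP*.xml would end up looking something like this:

    <Templates xmlns:ows="Microsoft SharePoint">
      <!-- original template, now hidden from the site creation page -->
      <Template Name="MyTemplate" ID="10001">
        <Configuration ID="0" Title="Team Portal" Hidden="TRUE"
            Description="Our standard team site" DisplayCategory="Custom" />
      </Template>
      <!-- copied v2 template, re-using the same Title and Description -->
      <Template Name="MyTemplateV2" ID="10002">
        <Configuration ID="0" Title="Team Portal" Hidden="FALSE"
            Description="Our standard team site" DisplayCategory="Custom" />
      </Template>
    </Templates>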

    List Definitions (schema.xml)

    This is a very similar story to Site Definitions. You can change “some” things but not everything (and some of the things you change may break existing lists!).

    The most reliable way is to copy your Feature (giving it a new GUID) and hide the original ListTemplate. This will stop users from using the old template to create new lists, but if you have existing sites that are provisioning them automatically then you’ll have to look at the Site Definition upgrade (see above!).

    That should give you a pretty good grounding (or at the very least a start!) in how to upgrade your projects, either with an in-place upgrade or a side-by-side release.

    A lot of it is going to be head-scratching and sitting down to think .. don’t rush into it, take your time, and make sure you truly understand what you are trying to achieve, and what the impact is going to be on the end users!

    SharePoint 2010 ALM – Part 1 – Versioning Features and Assemblies

    This post was prompted by needing to explain the versioning and upgrade strategy in SharePoint 2010 to some partners we are working with at Content and Code. It ended up a very lengthy conversation so I decided to jot it all down and split it up into 2 parts.

    • Part 1 – Feature and Assembly Versioning
    • Part 2 – Upgrading your projects for In-Place Upgrade or Side-by-Side release.

    Regarding upgrading SharePoint 2010 projects, there are 2 specific areas of new functionality that will be key for SharePoint 2010 Application Lifecycle Management:

    • Assembly Versions
    • Feature Versions

    Assembly Versions

    It is typically recommended that the Assembly Version is incremented when you are upgrading / fixing / changing any of your managed code. This will however have an implication for web parts, web controls and pages across the system, but SharePoint 2010 has provisioned for that.

    $SharePoint.Project.AssemblyFullName$

    This is a new placeholder value that you will find all over SharePoint 2010. This includes both ASPX / ASCX  files (such as Page Layouts or Visual Web Parts, as an <% @Assembly %> tag) as well as Feature Receivers (in the Feature XML). Basically, when you package your WSP SharePoint will automatically swap out this value for the fully-qualified Assembly Reference for the current project state. This means it will swap out the full Assembly name, version number, culture and Public Key Token.
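    For example, the assembly directive in a Visual Web Part’s ASCX file looks like this in your project:

    <%@ Assembly Name="$SharePoint.Project.AssemblyFullName$" %>

    which becomes, after packaging, something like the following (the project name, version and token here are just a sample of the output format):

    <%@ Assembly Name="MyProject, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0123456789abcdef" %>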

    Assembly BindingRedirect element (solution manifest.xml)

    Another new feature in SharePoint 2010 is the ability to specify an AssemblyBindingRedirect from the WSP.

    This is part of the Manifest.xml schema (and has intellisense support in Visual Studio 2010) to allow you to automatically add new assembly binding information when a new WSP is rolled out.

    Example: binding redirect to redirect version “1.0.0.0” to the current assembly version in the package:

    <Assemblies>
      <Assembly Location="MyCustomAssembly.dll" DeploymentTarget="GlobalAssemblyCache">
        <BindingRedirects>
          <BindingRedirect OldVersion="1.0.0.0"/>
        </BindingRedirects>
      </Assembly>
    </Assemblies>

    With these two functions combined, we should have no problem upgrading an existing assembly to a new version.

    You should be aware though that the Binding Redirects will only apply to the Web Application’s web.config, and therefore will NOT work for anything operating outside of the web context (such as Timer Jobs and Workflows!). This is discussed in more detail in Part 2.

    Feature Versions

    There are a number of new version-related parts to the Feature framework. Each feature.xml can now be given a specific Version number property (if you don’t specify then it will be given a value of 0.0.0.0). This property was listed in the WSS 3.0 SDK, but it was reserved for internal use at the time (now it finally gets some legs!)
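    Setting the version is just an attribute on the Feature element, along these lines (the Title and Scope below are placeholders):

    <Feature xmlns="http://schemas.microsoft.com/sharepoint/"
             Title="My Feature"
             Scope="Web"
             Version="2.0.0.0">
      ...
    </Feature>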

    This all stems from a new set of XML elements that you can add to the Feature.xml, starting with <UpgradeActions> element.

    <VersionRange>

    The first element is the Version Range .. which allows you to specify the BeginVersion and EndVersion. The actions specified in the upgrade will only be processed if the current version of the feature is between that range!

    Typically you would wrap this element around the actions that you wish to be executed when the upgrade is performed.

    <AddContentTypeField>

    Pretty much as it says on the tin, this allows you to add a field to an existing Content Type.

    <ApplyElementManifests>

    This allows you to apply a specific element manifest when the upgrade process kicks off.

    This is good because if you are adding a new element manifest (say to provision a new site column?) then you can process that specific manifest in isolation as part of your upgrade process without going off and executing all of the other manifests in the feature.

    <MapFile>

    According to the SDK this allows you to re-map a file in the content database to a different “ghosted” location. I suppose this is supposed to be a quick, memory-efficient way of gracefully modifying files, although I’m not sure what value this really has? (could you not just overwrite the file on the hard disk?)

    <CustomUpgradeAction>

    This element ties in with your Feature Receivers. There is a new method in the SPFeatureReceiver class. This method is called: FeatureUpgrading() and it allows you to execute custom code as part of your upgrade process.

    (the only problem is that .. if you deploy your feature to a brand new farm and execute it “first time” from the new version .. this method does not get executed! You should really consider using the Feature Activated event to make sure the behaviour is consistent!)
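    A minimal receiver sketch is shown below (the upgrade action name and the logic inside are purely illustrative):

    public class MyFeatureReceiver : SPFeatureReceiver
    {
        // called during Feature upgrade for each <CustomUpgradeAction>
        public override void FeatureUpgrading(
            SPFeatureReceiverProperties properties,
            string upgradeActionName,
            IDictionary<string, string> parameters)
        {
            if (upgradeActionName == "MyUpgradeAction")
            {
                // TODO: your custom upgrade logic here
            }
        }
    }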

    Example:

    Below is an example snippet.

    Here, if we are upgrading from Version 1 to Version 2 then a new field will be added to a content type, and those changes pushed down to all child content types and lists.

    If we are upgrading from Version 2 to Version 3 then a new element manifest will be processed.

    <UpgradeActions>
      <VersionRange BeginVersion="1.0.0.0" EndVersion="2.0.0.0">
        <AddContentTypeField
            ContentTypeId="0x010100C568DB52D9D0A14D9B2FDCC96666E9F2007948130EC3DB064584E219954237AF39"
            FieldId="{C6885FF1-4F70-425F-AB32-41767AB1B8CC}"
            PushDown="TRUE"/>
      </VersionRange>
      <VersionRange BeginVersion="2.0.0.0" EndVersion="3.0.0.0">
        <ApplyElementManifests>
          <ElementManifest Location="UpgradeManifest.xml"/>
        </ApplyElementManifests>
      </VersionRange>
    </UpgradeActions>

    With the feature upgrade framework, this should allow you to do almost anything when managing how your features are “upgraded” in a graceful way.


    In Part 2 we will discuss how we can leverage these for two different upgrade scenarios:

    • In-Place Upgrade (replacing existing functionality)
    • Side-by-Side Installation (allowing two versions to co-exist)

    SP2010 RTM Bug found in CQWP [PageQueryString] feature

    Sigh .. another few days of RTM development and another bug found …

    This one regards the Content by Query Web Part (CQWP) and the new PageQueryString filter functionality. This is a cracking new feature which allows you to dynamically set the filter based on a query string value in the request URL… unfortunately there is a bug.

    I found this when I was trying to set an aggregation of some News pages, but filter them using a query string as a poor-man’s web part connection 🙂

    My Requirement: Show all custom “News” pages within the pages library of my News site, and filter the date using a query string value.

    Normally this would involve custom development, but not any more in SharePoint 2010!

    1. I targeted my CQWP at the Pages Library (no point querying the entire site, as it would be a performance hit).
    2. I also targeted the CQWP at my custom “News Page” content type, as I didn’t want the “home” page to be returned in the aggregation.
    3. Finally, I added a Filter of “Article Date” is Greater Than [PageQueryString: StartDate], so that I can start to filter my news by date using the query string.

    I hit the “apply” button with sweaty palms .. excited to see this new functionality in action ..

    There is a problem with the query that this Web Part is issuing. Check the configuration of this Web Part and try again.

    FAIL

    What Went Wrong?
    This was quite a shock .. I checked and double checked that my filters were set up correctly .. I spent an entire day checking and double checking all of my settings and values to make sure I hadn’t made a mistake somewhere. If I removed that filter then it worked (and returned all my news items). Bizarrely, if I included a valid query string like ?Title=Home in my URL then it WORKED?? But remove the query string and it failed …

    So What EXACTLY doesn’t work ??
    I spent a good few hours investigating this in more detail, and came across the following findings. If you have the following ALL set in your CQWP then it will fail:

    • Targeting a Specific List, and
    • Targeting a specific Content Type, and
    • Any [PageQueryString] filter in your query, and
    • The [PageQueryString] variable is NOT present in the URL

    If you change any one of these settings then it will start working!

    How To Replicate This Bug
    This is quite simple really, just follow these steps:

    1. Create a new “Publishing Site” site collection
    2. On the home page, add a new Content By Query Web Part
    3. Set the following query properties
      • Show items from the following list: /Pages
      • Show items of this content type group: Publishing Content Types
      • Show items of this content type: Page
      • Show items when: Title > Contains > [PageQueryString: title]

    4. Hit Apply. The web part will report: There is a problem with the query that this Web Part is issuing. Check the configuration of this Web Part and try again. If you add ?Title=Home to the query string then you should notice that the web part appears to work, but fails if the query string property is missing!

    What is the Workaround?

    If you then do ANY of the following, it will work:

    • Point it at a Site, or Site Collection, instead of a specific list, or
    • Do not filter by Content Type, or
    • Do not use a [PageQueryString] filter

    Now … for my example (bringing up pages that use a specific page layout from a Pages library) I chose to change the query to target the site. I don’t have a lot of other content in my News site, so it won’t be much of a problem.

    I can imagine this cropping up again and again in the future though, especially with large Intranet sites with thousands (or even millions?) of documents … you don’t want to be doing site-wide queries there, the performance impact would be significant!!

    Update:
    Glyn Clough has come up with a workaround. It seems that adding an additional filter that is always true (such as Title != “”) makes the query work. Annoying, but at least there is a workaround.
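Glyn Clough’s always-true filter can be sketched as a second chained filter in the exported .webpart file. Again a hedged sketch: the property names and the operator value mirror what I’d expect from an exported ContentByQueryWebPart, so verify them against your own web part before relying on this.

```xml
<!-- Sketch of the workaround (verify property/operator names yourself):
     chain an always-true "Title is not empty" filter onto the query. -->
<property name="Filter1ChainingOperator" type="string">And</property>
<property name="FilterField2" type="string">Title</property>
<property name="FilterType2" type="string">Text</property>
<property name="FilterOperator2" type="string">Neq</property>
<property name="FilterValue2" type="string"></property>
```

The extra filter never excludes anything, but its presence appears to be enough to stop the query blowing up when the [PageQueryString] value is missing.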

    VS 2010 RTM and SP 2010 RTM – Deployment Conflict Resolution will break modules!

    What a way to lose 4 hours of your life.

    I was in a glorious mood this morning having upgraded my main development environment to RTM (both SharePoint 2010 and Visual Studio 2010 (Premium)).

    The main code base we’d been working on (a brand new SP2010 web site, due to go live next month) compiled, the package could be created, and when I manually deployed the WSP all was good … that is, until I tried the “Deploy” button in Visual Studio…
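For reference, the manual deployment that worked was just the standard SharePoint 2010 Management Shell route. A minimal sketch, assuming a hypothetical project path, solution name and web application URL:

```powershell
# Manual WSP deployment via the SharePoint 2010 Management Shell.
# Path, solution name and URL below are placeholders for illustration.
Add-SPSolution -LiteralPath "C:\Projects\MySharePointProject\bin\Debug\MySharePointProject.wsp"
Install-SPSolution -Identity "MySharePointProject.wsp" `
    -WebApplication "http://intranet" -GACDeployment
```

So the WSP itself was fine; only the Visual Studio “Deploy” pipeline was falling over.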

    Error Occurred in deployment step ‘Add Solution’: Value cannot be null. Parameter name: s

    FAIL

    So what on earth could be causing this error? Well, I stumbled across a registry edit that enabled diagnostics logging for SharePoint projects in VS 2010 (thanks to Glyn Clough for letting me know about it).

    Enable Diagnostics Logging for SharePoint Projects in Visual Studio 2010
    [HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\10.0\SharePointTools]
    “EnableDiagnostics”=dword:00000001

    This gave me a stack trace in the Output window for the build alongside the error message, as follows:

    (I have highlighted the error message, as well as the line throwing it .. the StringReader constructor complaining that it’s been passed a null value!)

    —— Build started: Project: MySharePointProject, Configuration: Debug Any CPU ——

    MySharePointProject -> C:\Projects\MySharePointProject\bin\Debug\MySharePointProject.dll
    Successfully created package at: C:\Projects\MySharePointProject\bin\Debug\MySharePointProject.wsp

    —— Deploy started: Project: MySharePointProject, Configuration: Debug Any CPU ——
    Active Deployment Configuration: Default
    Run Pre-Deployment Command:
    Skipping deployment step because a pre-deployment command is not specified.
    Recycle IIS Application Pool:
    Skipping application pool recycle because no matching package on the server was found.
    Retract Solution:
    Skipping package retraction because no matching package on the server was found.
    Add Solution:
    Error occurred in deployment step ‘Add Solution’: Value cannot be null.
    Parameter name: s

    Exception Message: Value cannot be null.
    Parameter name: s
    Exception Type Name: System.ArgumentNullException

    Exception Stack Trace: at System.IO.StringReader..ctor(String s)
    at System.Xml.Linq.XDocument.Parse(String text, LoadOptions options)
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.XmlDocument.GetXDocument()
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.XmlDocument.get_Document()
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.ModuleElementManifest.GetModuleElements(Boolean ignoreSetupPathModules)
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.ModuleElementManifest.GetFileElements(Boolean ignoreSetupPathModules)
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.ModuleCollisionFinder.b__2(ISharePointProjectItemFile elementManifest)
    at System.Linq.Enumerable.d__31`3.MoveNext()
    at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
    at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.SingleAspectCollisionFinder`1.FindConflicts()
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.DeploymentConflictFinder.FindAndAddConflictsTo(IDeploymentConflictCollection targetConflictCollection, Boolean promptBeforeResolve)
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.Module.DetectConflicts(DeploymentStepStartedEventArgs e, Boolean promptBeforeResolve)
    at Microsoft.VisualStudio.SharePoint.ProjectExtensions.VSPackage.Module.DeploymentStepStarted(Object sender, DeploymentStepStartedEventArgs e)
    at Microsoft.VisualStudio.SharePoint.SharePointProjectItemTypeEvents.RaiseDeploymentEvent[T](EventHandler`1 eventHandler, T e, ISharePointProjectItem projectItem, ISharePointProjectItemDeploymentContext context)
    at Microsoft.VisualStudio.SharePoint.SharePointProjectItemTypeEvents.OnDeploymentStepStarted(ISharePointProjectItem projectItem, IDeploymentStepInfo stepInfo, ISharePointProjectItemDeploymentContext context, IDeploymentConflictCollection conflicts)
    at Microsoft.VisualStudio.SharePoint.SharePointProjectItemType.OnDeploymentStepStarted(ISharePointProjectItem projectItem, IDeploymentStepInfo stepInfo, ISharePointProjectItemDeploymentContext context, IDeploymentConflictCollection conflicts)
    at Microsoft.VisualStudio.SharePoint.Deployment.DeploymentUtils.NotifyStepStarted(IDeploymentStepInfo stepInfo, IDeploymentContext context)
    at Microsoft.VisualStudio.SharePoint.Deployment.ConfigurationExecutor.Execute()
    at Microsoft.VisualStudio.SharePoint.Deployment.WspDeploymenHandler.Deploy()
    ========== Build: 1 succeeded or up-to-date, 0 failed, 0 skipped ==========
    ========== Deploy: 0 succeeded, 1 failed, 0 skipped ==========

    So .. what does this tell us?

    Firstly, all of the exceptions are being thrown by internal Microsoft assemblies. There is no custom code, no custom solutions and no third-party bolt-ons. The clue, which didn’t jump out at me straight away, was buried in the middle of the stack trace … a load of references to “conflicts”:

    • VSPackage.Module.DetectConflicts
    • VSPackage.DeploymentConflictFinder.FindAndAddConflictsTo
    • VSPackage.SingleAspectCollisionFinder`1.FindConflicts
    • VSPackage.ModuleCollisionFinder

    This sparked inspiration… there is an option in Visual Studio 2010 to set Deployment Conflict Resolution on a module that you have added. Deployment Conflict Resolution automatically checks whether the files you are deploying already exist on the server. If they are outdated then Visual Studio can forcibly delete them and upload the newer versions (quite a nice feature .. when it works).

    By default this value is set to “Automatic” .. so I tried turning it off by setting it to “None”.
    SUCCESS
    The build now works with Deployment Conflict Resolution disabled!
    It’s a shame that I can’t get my solutions to deploy without that turned off. It would be nice to have conflict resolution enabled on my projects, but at the moment this is the only way I can get them to work.
    Why has this happened? My only real guess is that Visual Studio 2010 RTM came out several weeks before SharePoint 2010 RTM, so perhaps the conflict resolution code got out of sync with the latest SharePoint release somewhere? (That is a very wild guess / stab in the dark ..)
    Also, please bear in mind that this is with SharePoint 2010 RTM and Visual Studio 2010 Premium (RTM). I have seen this with both brand new OOB projects and existing projects, so be careful when migrating your code base from an RC-based dev environment to RTM!
    The original MSDN post where I reported this (and quite quickly solved it myself) can be found here:
    If there is a hotfix or patch released I’ll make sure both places get updated!
    Thanks for listening … another 4 hours of my life gone forever, but hey .. what doesn’t kill you only makes you stronger right??