Tag Archives: SharePoint

SP2010 RTM Bug found in CQWP [PageQueryString] feature

Sigh .. another few days of RTM development and another bug found …

This one regards the Content by Query Web Part (CQWP) and the new PageQueryString filter functionality. This is a cracking new feature which allows you to dynamically set the filter based on a query string value in the request URL… unfortunately there is a bug.

I found this when I was trying to set an aggregation of some News pages, but filter them using a query string as a poor-man’s web part connection 🙂

My Requirement: Show all custom “News” pages within the pages library of my News site, and filter the date using a query string value.

Normally this would involve custom development, but not any more in SharePoint 2010!

  1. I targeted my CQWP at the Pages library (no point querying the entire site, as it would be a performance hit).
  2. I also targeted the CQWP at my custom “News Page” content type, as I didn’t want the “home” page to be returned in the aggregation.
  3. Finally, I added a filter of “Article Date” is Greater Than [PageQueryString: StartDate], so that I can start to filter my news by date using the query string.

I hit the “apply” button with sweaty palms .. excited to see this new functionality in action ..

There is a problem with the query that this Web Part is issuing. Check the configuration of this Web Part and try again.

FAIL

What Went Wrong?
This was quite a shock .. I checked and double-checked that my filters were set up correctly. I spent an entire day going over all of my settings and values to make sure I hadn’t made a mistake somewhere. If I removed that filter then it worked (and returned all my news items). Bizarrely, if I included a valid query string like ?Title=Home in my URL then it WORKED?? But remove the query string completely and it failed …

So What EXACTLY doesn’t work ??
I spent a good few hours investigating this in more detail, and came across the following findings. If you have ALL of the following set in your CQWP then it will fail:

  • Targeting a specific list, and
  • Targeting a specific content type, and
  • Any [PageQueryString] filter in your query, and
  • The [PageQueryString] variable is NOT present in the URL

If you change any one of these settings then it will start working!

How To Replicate This Bug
This is quite simple really, just follow these steps:

  1. Create a new “Publishing Site” site collection
  2. On the home page, add a new Content By Query Web Part
  3. Set the following query properties
    • Show items from the following list: /Pages
    • Show items of this content type group: Publishing Content Types
    • Show items of this content type: Page
    • Show items when: Title > Contains > [PageQueryString: title]

  4. Hit Apply. The web part will report: There is a problem with the query that this Web Part is issuing. Check the configuration of this Web Part and try again. If you add ?Title=Home to the query string then you should notice that the web part appears to work, but it fails again as soon as the query string parameter is missing!

What is the Workaround?

If you then do ANY of the following, it will work:

  • Point it at a Site, or Site Collection, instead of a specific list, or
  • Do not filter by Content Type, or
  • Do not use a [PageQueryString] filter

Now … for my example (bringing up pages that use a specific page layout from a Pages library) I chose to change the query to target the site. I don’t have a lot of other content in my News site, so it won’t be much of a problem.

I can imagine this cropping up again and again in the future though, especially with large Intranet sites with thousands (or even millions?) of documents … you don’t want to be doing site-wide queries there, the performance impact would be significant!!

Update:
Glyn Clough has come up with a workaround. It seems that adding an additional filter that is always true (such as Title != “”) makes it work. Annoying, but at least there is a workaround.

Un-Publish a list item programmatically

I wrestled with this one for a while. Although you can go to Version History, where there is an option to “unpublish” a version, doing this programmatically was very much a pain.

I finally realised that you could do this quite simply without going anywhere near the versioning end of the SharePoint API.

// overwrite the "published" version with a "draft" version
item.ModerationInformation.Status = SPModerationStatusType.Draft;
item.UpdateOverwriteVersion();

This did work in my instance, where I knew the “current” version of the item was Published, although I’m not sure if it also works when you have draft versions in edit (i.e. version 1.5).
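For completeness, here is a slightly fuller sketch of the same trick with the item lookup included. The site URL and page path are hypothetical, and I’ve added a guard so it only touches items that are actually approved on a moderated list:

```csharp
// un-publish the current major version of a page by pushing it back to Draft
// (assumes the parent list has content approval / moderation enabled)
using (SPSite site = new SPSite("http://server/sites/news"))                  // hypothetical URL
using (SPWeb web = site.OpenWeb())
{
    SPListItem item = web.GetListItem("/sites/news/Pages/Article1.aspx");     // hypothetical page

    if (item.ParentList.EnableModeration &&
        item.ModerationInformation.Status == SPModerationStatusType.Approved)
    {
        // overwrite the "published" version with a "draft" version
        item.ModerationInformation.Status = SPModerationStatusType.Draft;
        item.UpdateOverwriteVersion();
    }
}
```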

SharePoint 2010 release date (and also my wedding anniversary)

Exciting times and exciting news… the SharePoint 2010 (and Office 2010) release date has finally been confirmed to be 12th May 2010.

The SharePoint team blog posted this up on Friday:

Today, we officially announced that May 12th, 2010, is the launch date for SharePoint 2010 & Office 2010. In addition, we announced our intent to RTM (Release to Manufacturing) this April 2010.

This was further confirmed by the Office 2010 Engineering blog who made this announcement:

For businesses, we will launch the 2010 set of products, including Office 2010, SharePoint 2010, Visio 2010, and Project 2010 worldwide on May 12. To find out more about the Worldwide Business Launch, visit https://sharepoint.microsoft.com/businessproductivity/proof/pages/2010-launch-events.aspx.



For consumers, Office 2010 will be available online and on retail shelves this June. Until then, you can get the Office 2010 beta at www.office.com/beta.

The only problem I have is that May 12th is also my wedding anniversary …

What do you think? Organise a launch party and cancel the dinner and flowers?? 🙂 (might be more than my life is worth)

Epic Fail – Saving to a document library overwrites AllItems.aspx

This was a major “doh” moment .. learnt the hard way.

I don’t know if this is because I’m using Word 2010 beta, but I was working on a document that I wanted to save into a SharePoint library.

So I copy-pasted the URL into the “Save As” dialog (including the “…/Forms/AllItems.aspx” bit on the end).

I expected it to show me a view of my library.

I got a Word Document saved as “AllItems.aspx”.

Epic Fail! 🙂

IIS7 broke my Content Deployment! (404 – Not Found error)

An important one to bear in mind, this, especially as SharePoint 2010 is likely to have the same limitation (IIS7!)

Content Deployment works by creating exports of the Site Collection data, and splitting it up into CAB files with a default size of 10MB. These are then shipped off to the target Central Admin Application (using a Layouts page called DeploymentUpload.aspx). Once all the CAB files have been received they are imported into the database of the target site collection.

All very simple, but what do you do when your Content Deployment job starts throwing errors like “404 – Not Found”?? (note – Content Deployment errors end up in the Windows Application Logs).

Well, the first place to look would be your IIS logs for your Central Administration application (which for IIS7 are located in the C:\inetpub folder). Look for a reference to “DeploymentUpload.aspx” with a 404 error reference.. which I have an example of below:

2010-02-02 10:16:29 ::1 POST /_admin/Content+Deployment/DeploymentUpload.aspx filename=%22ExportedFiles18.cab%22&remoteJobId=%22b8a556bc-8eef-4f6f-97b6-6bac54ae8d99%22 40197 – ::1 – 404 13 0 78

Now, I have highlighted the error fragment which states that this is a 404.13 error (Content Length Too Large)! The reason for this is that one of the CAB files is too big, and IIS7 has a default file upload limit of about 28MB (30,000,000 bytes)!

Now, the quick-witted among you will remember I said that the CAB files are automatically split up into 10MB chunks .. but if MOSS comes across a single file that is too big it will simply expand the CAB size until it can fit that file in!! In the case of a project I’m working on this led to a 68MB CAB file!

The only workaround is to configure the IIS7 Virtual Directory for Central Administration to allow file uploads big enough for my CAB file to get through (in my case, I set it to 80,000,000 bytes, approx 75MB).

To do this, open up a command prompt, navigate to C:\Windows\System32\Inetsrv\ and execute the following command:

appcmd set config "SharePoint Central Administration v3" -section:system.webServer/security/requestFiltering -requestLimits.maxAllowedContentLength:80000000

Restarted the Content Deployment job and all good .. working again 🙂 So something to bear in mind … why IIS7 can break your Content Deployment Jobs!!
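For reference, the appcmd command above is just setting a standard IIS7 request-filtering attribute, so the same limit can be applied by editing the Central Administration application’s web.config directly (the byte value is whatever your largest CAB file needs):

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- allow uploads of up to 80,000,000 bytes -->
      <requestLimits maxAllowedContentLength="80000000" />
    </requestFiltering>
  </security>
</system.webServer>
```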

Fixed: SPWeb.Navigation.QuickLaunch == null

This one plagued me for quite some time.. you create a new site, and some pages. The navigation menus are all there but when you look in code the SPNavigationNodeCollections are null.

This is the same for both the standard WSS navigation collection (SPWeb.Navigation.QuickLaunch) and the publishing equivalent (PublishingWeb.CurrentNavigationNodes).

However … you navigate to the “Modify Navigation” screen, make any change, click OK and magically all the nodes appear in code!

Why does this happen?
The main thing is the role of the “SiteMap” providers in SharePoint. These are what really drive your navigation rendering, presenting an XML map of the “node” structure of your pages and using controls like the ASP.Net menu to render them.

The problem is, these don’t always match up with the navigation settings data in the content database. I think this follows similar grounds to “ghosting”, in that (for performance reasons) the SPNavigationNodeCollection doesn’t get filled with data until you “customise” something.

This is normally achieved by going to the “Modify Navigation Settings” page and clicking the “ok” button, at which point SharePoint conveniently goes and populates the nodes collection, and it becomes available to you in code.

What can I do about it?
Well, thankfully there is a fix for this.

You can manually create the nodes yourself, but you DO have to add some special node properties so that SharePoint recognises them as being “linked” to the existing pages and sites, otherwise you end up with duplicates all over the place!

The trick is that for any node you add for a page, you add a custom property of NodeType set to the value of Page. For sub-sites you set this property to Area.

This way, SharePoint sees your custom nodes as being “Page” and “Site” nodes instead of new custom links!

Simple eh???

The code is listed below:

using System.Collections.Generic;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Navigation;
using Microsoft.SharePoint.Publishing;

// use a Publishing Web object so we can access pages
PublishingWeb pWeb = PublishingWeb.GetPublishingWeb(web);

// use this to store the urls that already exist in the current nav
List<string> currentUrls = new List<string>();

foreach (SPNavigationNode node in web.Navigation.QuickLaunch)
{
    // collect all of the existing navigation nodes
    // so that we don't add them twice!
    currentUrls.Add(node.Url.ToLower());
}

foreach (PublishingPage page in pWeb.GetPublishingPages())
{
    // check to make sure we don't add the page twice
    if (!currentUrls.Contains(page.Uri.AbsolutePath.ToLower())
        && page.IncludeInCurrentNavigation)
    {
        // create the new node
        SPNavigationNode newNode = new SPNavigationNode(page.Title, page.Url);
        newNode = web.Navigation.QuickLaunch.AddAsFirst(newNode);

        // IMPORTANT: set the NodeType to "Page"
        newNode.Properties["NodeType"] = "Page";

        // save changes
        newNode.Update();
    }
}

foreach (SPWeb tempWeb in web.Webs)
{
    // make sure we don't add the sub-site twice
    if (!currentUrls.Contains(tempWeb.ServerRelativeUrl.ToLower()))
    {
        // create the new node
        SPNavigationNode newNode = new SPNavigationNode(tempWeb.Title, tempWeb.ServerRelativeUrl);
        newNode = web.Navigation.QuickLaunch.AddAsLast(newNode);

        // IMPORTANT: set the NodeType to "Area"
        // (a throwback to SPS 2003, which used "Portal Areas" instead of sub-sites)
        newNode.Properties["NodeType"] = "Area";
        newNode.Update();
    }
}

// save changes to the Publishing Web
// (note: the SPWeb objects returned by web.Webs should also be disposed when you're done)
pWeb.Update();

Understanding SharePoint Application Security and Elevating Privileges

This post was prompted because of a particularly challenging bit of security that I needed to traverse. I needed some way of presenting the status of a Content Deployment Job (configured in Central Administration) in the Web Application that it relates to.

Seems pretty straight forward?
Well, it’s not, and this article will hopefully explain why.

RunWithElevatedPrivileges and Application Pool Accounts
So the first thing I looked at was using the good old SPSecurity.RunWithElevatedPrivileges method. This is a well-known (and on occasion heavily used) practice for getting around security in SharePoint. But does everyone understand exactly what it does?

In a nutshell, this method simply changes the currently impersonated user from the currently logged-in user to an account called “SharePoint\System” (a.k.a. “System Account”).

This account doesn’t actually exist, and anyone inspecting the WindowsIdentity or SPUser object in any great detail will spot that this account doesn’t actually have a valid SID (Security Identifier). This is because it represents a placeholder.. a flag in SharePoint that tells it to impersonate the Application Pool Account instead of the currently logged in user.

The Application Pool Account has full SharePoint permissions to the Web Application (effectively making it a Site Collection Administrator in every single Site Collection).
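As a quick refresher on how the method is typically used: any SPSite/SPWeb objects created before the delegate retain the security context they were opened with, so you need to re-open them inside the delegate for the elevation to take effect. A minimal sketch:

```csharp
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // re-open the site INSIDE the delegate so that it is created
    // under the elevated ("SharePoint\System") context
    using (SPSite elevatedSite = new SPSite(SPContext.Current.Site.ID))
    using (SPWeb elevatedWeb = elevatedSite.OpenWeb())
    {
        // code here executes as the Application Pool account
    }
});
```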

So what does this actually mean?

SQL Server Permissions
Believe it or not, SQL Server permissions in SharePoint are extremely simple.

Taking the 3 core databases for each SharePoint Farm:

1. Farm Configuration Database
This contains the core configuration information (servers, URLs, accounts) for the entire SharePoint Farm.

The Setup Account has DBOwner permissions.

All application pool accounts are added to a Database Role called WSS_Content_Application_Pools, which has severely locked-down read privileges.

2. Central Administration Content Database
This is effectively the content database for the Central Administration site. This contains the SPSite / SPWeb / SPList objects that store all of the content related settings (including Content Deployment Jobs).

Again, the Setup Account (which incidentally will be running the Central Administration Application Pool!) has DBOwner permissions.

All application pool accounts are added to a Database Role called WSS_Content_Application_Pools, which has severely locked-down read privileges.

3. Web Application Content Database
This is the database (or multiple databases) that contains the Site Collection content for the Web Application.

Here the Application Pool Account (for that specific Web Application) is granted DBOwner permissions. No other accounts are specified!

That is pretty much it. From a security (and “least privilege”) perspective it’s a very robust setup. If your application pool is compromised then the application pool account only has SQL permissions to its own content database.

According to best practice, every Web Application should have its own application pool account, which again makes sense according to the model above, limiting the surface area for any attack (as one web application being compromised would not have any impact on the other application pools).

This should also make it obvious why you should never make an Application Pool Account a Local or Farm Administrator! You are essentially breaking the security model if you do this (and massively widening the exposed area of your system if that account is ever exposed!).

NTLM authentication and “Double Hop”
The first thing that should scream at you here is that none of the SharePoint user accounts have ANY permissions in SQL. Every single SQL query is executed within a SharePoint Web Application using the Application Pool account!

The reason for this is clear once you understand the limitations of NTLM authentication.

Basically, when you log in to a SharePoint web site, you authenticate with the Web Server (IIS). There is no way for IIS to pass your credentials through to SQL Server because NTLM only supports “single hop” authentication (i.e. from one machine – the browser – to another machine – the web server). For “double-hop” you need a more robust authentication method such as Kerberos (i.e. from the browser, hop to the web server, then hop a second time to the database server).

Note – This is why you need Kerberos to use pass-through authentication with 3rd party systems (such as CRM or other LOB systems).

That’s all great .. but what do I care?
Well, this all boils down to where the object you are trying to access lives, and what the SQL permissions are on that object.

Let’s take the example of accessing a Content Deployment Job.

The first problem you will hit is that your account needs to be a Farm Administrator. We already know that making the Application Pool an admin account is bad for security.

So as an alternative you could use ASP.Net Impersonation to get around the SharePoint API, but as we discussed above, this doesn’t solve the NTLM “single-hop” problem (your query is still going to execute in SQL using the Application Pool account, regardless of which account you are impersonating!)

Using .Net Reflector (tsk!) tells us that the Content Deployment Job information is stored in an SPList in the Central Administration Content Database. Using RunWithElevatedPrivileges simply executes using the Application Pool account (which we know from the SQL Permissions above, has very limited permissions).

So lets assume you tried to use Impersonation … what happens?

Well, you get a nasty “Exception from HRESULT” error message.
Delving into the SharePoint Diagnostics Logs tells you something like “<account> does not have EXECUTE permissions on ‘proc_EnumLists’ in <database>”.

Basically, running that code tries to execute a Stored Procedure in the Central Admin content database which the Application Pool Account doesn’t have access to! Your code managed to fool the SharePoint API into thinking you have permissions, but good old SQL Server stops you short (just as it should … good server!)

So what can I do?
Well, the first thing to note is that you won’t always run into this problem.
Many of the Farm-level options (including accessing SSP and User Profile properties) can be worked around in other ways, but when something like the above happens, your options are limited to 3 potential solutions:

  1. Ignore all of the best practice. Make your application pool account an administrator, and spend your days hiding from the network security admins and hoping it doesn’t all go wrong.
  2. Create a dedicated Web Service, which executes as an admin account. Use this to farm out your “privileged” code, and make sure you lock it down tight as a drum so you can’t get to it from outside of the SharePoint farm!
  3. Don’t do it .. and tell your users that it was a stupid idea in the first place!

Now I admit, Options 1 and 3 probably won’t go down too well, and Option 2 is the best option but still has its issues (running a Web Service as an admin account is still a security risk, if a smaller one than running the entire public-facing Application Pool as an admin account!)

Summary
We ended up opting for Option 2, admittedly locking it down so that the URL was never published and it would only accept connections from other servers in the farm (so that end users could never access it).

Hopefully you now have a better grasp of SharePoint Application Security, what that super-method “SPSecurity.RunWithElevatedPrivileges” is actually doing and why it doesn’t always work!

Comments and feedback welcome! 🙂

SharePoint 2010 and Office 2010 Beta released!

Yep, Microsoft have got slightly ahead of expectations and the official public beta release of SharePoint 2010 and Office 2010 is now out (although you need a TechNet or MSDN subscription at the moment!).

You can find the download information, as well as more details, on the relevant websites.

For those of you who have access to the Technical Preview of SharePoint 2010 you can expect to see quite a few changes and improvements in the beta version. For those of you who haven’t seen either, you’re in for a treat!

How to add a Lookup Field to your List dynamically and programmatically

This is an old old problem, and for Content Types it has long been realised that this can be solved by using a Feature Receiver and creating the lookup through code (to set the lookup list GUID .. something that you cannot do through CAML).

The biggest problem with Lookup fields is with List Definitions. You can still add the Lookup Field to your schema.xml, but it won’t do anything without knowing which list it is supposed to look up to, so you typically end up with a field that doesn’t work.

Now you can get around this IF you know the URL or the ID of the list you want to look up to, but most list definitions that you release can be created anywhere, so this is very rarely possible. Now .. in my particular example, we wanted the Lookup to point at itself! In this case, there really isn’t any URL or method through the schema.xml that we can use, and managed code is the only route … the main problem there is that there is no “ListAdded” event that you can trap when your list gets created … but then I had a spark of inspiration!

The solution was surprisingly simple and came in a bit of a eureka moment… SPListEventReceiver “OnFieldAdding” event.

You can bind in event receivers using a variety of methods (not covered here) but for my example I bound it into the schema.xml as part of my list definition (so this code only ever executes when the list is created).

The “OnFieldAdding” event then executes every time a new field is added to the list, and this includes fields provisioned from the schema.xml! All I then needed to do was identify the field (which I could do easily because I knew the Field ID) and then I could use managed code to manipulate the SPField object and fill in all the blanks that I couldn’t do from the schema.xml!
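A rough sketch of what such a receiver might look like follows. Note this is an assumption-laden illustration, not the author’s exact code: I’ve used the FieldAdded override of SPListEventReceiver, and the field’s internal name and lookup settings are all hypothetical placeholders for whatever your schema.xml provisions:

```csharp
public class LookupFixupReceiver : SPListEventReceiver
{
    public override void FieldAdded(SPListEventProperties properties)
    {
        // only touch the lookup field we provisioned from schema.xml
        // ("MyLookup" is a hypothetical internal name)
        if (properties.FieldName != "MyLookup") return;

        SPFieldLookup lookup =
            properties.List.Fields.GetFieldByInternalName("MyLookup") as SPFieldLookup;

        if (lookup != null && string.IsNullOrEmpty(lookup.LookupList))
        {
            // point the lookup back at the list it lives in (a "self" lookup),
            // filling in the blanks that schema.xml couldn't
            lookup.LookupList = properties.List.ID.ToString();
            lookup.LookupWebId = properties.List.ParentWeb.ID;
            lookup.LookupField = "Title";
            lookup.Update();
        }
    }
}
```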

I really like this method, as it has opened the door to a way of executing code on a list when it is created. Now I admit it’s not exactly bullet-proof, because it potentially executes the code a LOT of times (although ideally you would remove the event handler once you’ve finished doing what you needed to), but it was a really nice “other option” .. and certainly one I hadn’t considered before!
