SharePoint Search {User.Property} Query Variables and Scalability

This was something I stumbled into while working on a large global intranet (a >100k user platform) being built on SharePoint 2013. It is a WCM publishing site using "Search Driven" content, leveraging Managed Metadata tagging combined with {User.Property} tokens to deliver "personalised" content. Now .. if there were two buzzwords used to market SharePoint 2013 they would be "search driven content" and "personalised results", so I was surprised at what I found.

The Problem

We basically found that page load times were over 20 seconds and our SharePoint web servers were maxed out at 100% CPU usage. Load testing showed that performance was very good under low load, but once we started ramping the load up, CPU usage climbed extremely quickly and the site rapidly became almost unusable.

It is worth bearing in mind that this is a completely "cloud friendly" solution: zero server-side components, using almost exclusively "out of the box" web parts (mostly Search Result Web Parts; they would have been Content by Search, but this was a Standard SKU install). We also use output caching and BLOB caching, as well as minified and cached assets, to slim the site down as much as possible.

Also worth noting that we have 10 (ten) WFE servers, each with 4 CPU cores and 32GB RAM (not including a whole battery of search query servers, index servers, and other general “back-end” servers). So we weren’t exactly light on oomph in the hardware department.

We eventually found it was the search result web parts (we have several on the home page) which were flattening the web servers. This could be easily proved by removing those web parts from the page and re-running our Load Tests (at which point CPU load dropped to ~60% and page load times dropped to 0.2 sec per page even above our “maximum capacity” tests).

What was particularly weird is that the web servers were the ones maxing out their CPU. The Search Query Component servers (dedicated hardware) were not too heavily stressed at all!

Query Variables anyone?

So the next thing we considered is that we make quite liberal use of “Query Variables” and in particular the {User.Property} ones. This allows you to use a “variable” in your Search Query which is swapped out “on the fly” for the values in that user’s SharePoint User Profile.

In our example we had "Location" and "Function" in both content and the User Profile database, all mapped to the same MMS term sets. The crux of it is that you can "tag" a news article with a specific location (region, country, city, building) and a specific function (e.g. a business unit, department or team), and when users hit the home page they only see content "targeted" at them.

To me this is what defines a "personalised" intranet .. and it is the holy grail of most comms teams.

However, when we took these personalisation values out (i.e. replacing {User.Location} with actual Term ID GUID values) performance got markedly better! We also saw a significant uplift in CPU usage on our Query Servers (so they were approaching 100% too).

So it would appear that SOMETHING in the use of Query Variables was causing a lot of additional CPU load on the Web Servers!

It does what??

So, now we get technical. I used the JetBrains "dotPeek" tool to decompile some of the SharePoint Server DLLs to find out what on earth happens when a Query Variable is passed in.

I was surprised at what I found!

I ended up delving down into the Microsoft.Office.Server.Search.Query.SearchExecutor class as this was where most of the “search” based activity went on, in particular in the PreExecuteQuery() method. This in turn referred to the Microsoft.SharePoint.Publishing.SearchTokenExpansion class and its GetTokenValue() method.

It then hits a fairly large switch statement with any {User.Property} tokens being passed over to a static GetUserProperty() method, which in turn calls GetUserPropertyInner(). This is where the fun begins!

The first thing it does is call UserProfileManager.GetUserProfile() to load up the current user's SharePoint profile. There doesn't appear to be any caching here, so this happens PER TOKEN instance. If you have 5 {} declarations in a single query, this happens 5 times!

The next thing that happens is that it uses profile.GetProfileValueCollection() to load the property values from the UPA database, and (if it has the IsTaxonomic flag set) calls GetTaxonomyTerms() to retrieve the term values. These are full-blown “Term” objects which get created from calls to either TaxonomySession.GetTerms() or TermStore.GetTerms(). Either way, this results in a service/database roundtrip to the Managed Metadata Service.

Finally it ends up at GetTermProperty() which is just a simple bit of logic to build out the Keyword Query Syntax for Taxonomy fields (the “#0” thing) for each Term in your value collection.

So the call stack goes something like this:

=> SearchTokenExpansion::GetTokenValue()
=> GetUserProperty()
=> GetUserPropertyInner()
=> UserProfileManager::GetUserProfile()
=> UserProfile::Properties.GetPropertyByName().CoreProperty.IsTaxonomic
If it is (which ours always are) then …
=> UserProfile::GetProfileValueCollection()::GetTaxonomyTerms()
=> TermStore::GetTerms()
Then for each term in the collection
=> SearchTokenExpansion::GetTermProperty()
This just builds out the “#0” + term.Id.ToString() query value

So what does this really mean?

Well, let's put a simple example here.

Let's say you want to include a simple "personalised" search query to bring back targeted News content.

{|NewsFunction:{User.Function}} AND {|NewsLocation:{User.Location}}

This looks for two Search Managed Properties (NewsFunction and NewsLocation) and queries those two fields using the User Profile properties “Function” and “Location” respectively. Note – This supports multiple values (and will concatenate the query with “NewsFunction: OR NewsFunction:” as required)
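To make that concrete, here is a rough sketch (in plain JavaScript, with illustrative function names; this is not the decompiled code, and the parenthesised grouping is my assumption) of what the token expansion effectively produces for a taxonomy-bound profile property:

```javascript
// Build the Keyword Query value for a single taxonomy term
// (the "#0" prefix tells the query engine this is a Term ID match)
function getTermProperty(termId) {
  return "#0" + termId;
}

// Expand a {|ManagedProperty:{User.Property}} token for a multi-value
// profile property: one "Property:value" clause per term, joined with OR
function expandUserPropertyToken(managedProperty, termIds) {
  var clauses = termIds.map(function (id) {
    return managedProperty + ":" + getTermProperty(id);
  });
  return clauses.length > 1 ? "(" + clauses.join(" OR ") + ")" : clauses[0];
}

// Example: a user tagged with two "Function" terms (GUIDs are made up)
var query = expandUserPropertyToken("NewsFunction", [
  "c0e1d9a2-1111-2222-3333-444455556666",
  "d1f2e0b3-7777-8888-9999-000011112222"
]);
// => "(NewsFunction:#0c0e1d9a2-... OR NewsFunction:#0d1f2e0b3-...)"
```

The expensive part isn't this string building, of course; it is the profile and term store lookups needed to discover those Term IDs in the first place.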

On the Web Server this results in:

  • 2x “GetUserProfile” calls to retrieve the user’s profile
  • 2x “GetPropertyByName” calls to retrieve the attributes of the UPA property
  • 2x “GetTerms” queries to retrieve the term values bound to that profile

And this is happening PER PAGE REFRESH, PER USER.

So … now it suddenly became clear.

With 100k users hitting the home page it was bottlenecking the Web Servers because every home page hit resulted in double the amount of server-side lookups to the User Profile Service and Managed Metadata Service (on top of all of the other standard processing).
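To put rough numbers on this, a quick back-of-envelope helper (all figures below are illustrative, not our measured load):

```javascript
// Extra back-end lookups generated per second by token expansion:
// every page view expands every token, and every token costs several
// round-trips (profile load, property metadata, term store query)
function lookupsPerSecond(pageViewsPerSecond, tokensPerPage, lookupsPerToken) {
  return pageViewsPerSecond * tokensPerPage * lookupsPerToken;
}

// e.g. 200 home page hits/sec, 2 tokens per query, 3 lookups per token
var extraCalls = lookupsPerSecond(200, 2, 3); // 1200 extra calls/sec, all on the WFEs
```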

So how to get round this?

The solution we are gunning for is to throw away the Search Web Parts and build our own using REST calls to the Search API and KnockoutJS for the data binding.

This allows us to use client-side caching of the query (including any “expanded” query variables, and caching of their profile data) and we can even cache the entire search query result if needed so “repeat visits” to the page don’t result in additional server load.
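As a sketch of the sort of client-side caching involved (the names and wiring here are illustrative, not our production code; "fetchResults" stands in for an AJAX call to the real /_api/search/query endpoint):

```javascript
// A tiny TTL cache that can sit in front of the Search REST API,
// so repeat visits within the TTL window skip the server round-trip
function createQueryCache(ttlMs) {
  var store = {};
  return {
    get: function (key) {
      var entry = store[key];
      if (!entry || Date.now() - entry.at > ttlMs) { return null; }
      return entry.value;
    },
    set: function (key, value) {
      store[key] = { value: value, at: Date.now() };
    }
  };
}

function getSearchResults(cache, queryText, fetchResults) {
  var cached = cache.get(queryText);
  if (cached) { return cached; }           // served from cache, no server hit
  var results = fetchResults(queryText);   // only hit the server on a cache miss
  cache.set(queryText, results);
  return results;
}
```

In the browser you would back this with sessionStorage (or similar) so the cache survives page navigations within the session.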

This was a fairly high profile investigation, including Microsoft coming in for a bit of a chat about some of the problems we were facing. After some investigation they did confirm another option (which didn't work for us, but is useful to know), which is this:

  • Query Variables in the Search Web Part are processed by the Web Server before being passed to the Query Component
  • The same query variables in a Result Source or Query Rule will be processed on the Query Server directly!

So if you have a requirement which you can compartmentalise into a Query Rule or Result Source, you might want to look at that approach instead to reduce the WFE processing load.

Cheers! And good luck!

JSLink and Display Templates Part 7 – Code Samples

When I was speaking at the SharePoint Evolutions conference earlier this year I ran into Jeremy Thake (@JThake) and Vesa Juvonen (@vesajuvonen) and they saw the JSLink samples I was running through for the session I had.

Well .. one thing led to another as they say .. and by the time the conference finished I had refactored all of my code and uploaded it to the OfficeDev PnP (Patterns and Practices) GitHub repository.

So you can find all of my sample code in the “Branding.JSLink” section (OfficeDev PnP > Samples > Branding.JSLink).

This includes full documentation, full source code, a compiled deployable WSP package, as well as loads and loads of awesome stuff from the rest of the OfficeDev PnP contributors.

The sample code includes:

  • Re-Render Lookups as bulleted lists and checkboxes
  • Cascading Drop-Downs for Lookup Fields
  • Cascading Drop-Downs for Managed Metadata Fields
  • Google Maps integration (allowing both pin-point and shape selection)
  • Sample colour picker

And here are some tasty screenshots to get you in the mood!

Cascading Lookup Fields, with Checkboxes (multi-select lookups)

Cascading Drop Downs dynamically loaded from a Taxonomy (Managed Metadata Term Set)

Google Maps Thumbnails in List Views

Extensive editing interface for Google Maps fields

Simple Colour formatting

Announcement View as an Accordion


Hope you enjoyed the series (sorry it took so long!)

JSLink and Display Templates Part 6 – Creating View Templates and Deployment Options

Well first of all .. OH MY GOD I AM SORRY .. this has taken an absolute age to get out of the door. There really isn’t any excuse (although I’m going to try and use the excuse of the birth of my second child along with crazy busy real-world-life getting in the way).

But .. I am back and should be blogging a little bit more frequently from now on! So .. the JSLink stuff .. where was I? (believe it or not this series has been going on for almost 2 YEARS!).. View Templates! Right!

In Part 5 we covered the ability to override the rendering of List Views, and from a developer perspective this was awesome, but it isn't that useful from a content editor's perspective. The field-level overrides are easy enough to push through (because they can be applied to every single instance of a field at either the Site Column or Content Type scope) but the views tend to get in the way a little bit.

What we really need is the ability for someone who creates a new view to be able to pick one of our custom view templates, and that is exactly what we are going to do here. This is perhaps one of the least known features of the JSLink / Client-Side-Rendering approach for SharePoint 2013 and even people I have spoken to who have been doing JSLink development for a while now didn’t know about this.

In order to get this to work you will need to make sure each of your "views" is encapsulated in a separate JavaScript file (one view per file), and you will be uploading them into the Master Page Gallery (if any of you have read my Content Search Web Part series, or done any development with Search Results display templates, then all of this should be intimately familiar!).

Now, you can put these files anywhere in the Master Page Gallery; my personal preference is to create a new folder called "List Views" in the "Master Page Gallery > Display Templates" folder. The secret sauce however is the choice of Content Type:

  • Content Type: JavaScript Display Template
  • Name: <name of file>
  • Title: <how it will appear when selecting the template>
  • Target Control Type: View
  • Standalone: Standalone
  • Target Scope: <Relative URL where you want it to be used>
  • Target List Template ID: <ID of list where view is available> (Optional)

To keep in line with my example in Part 5 I have added “MJH Announcement View” with a Target Scope of “*” (i.e. all sites) and a List Template ID of 104 (Announcement Lists)


Once this has been saved then if you browse to any announcement list and create a view then my new View type is available from the template selection screen!
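For reference, here is a bare-bones skeleton of what one of these view template files might contain (SPClientTemplates.TemplateManager.RegisterTemplateOverrides is the real SharePoint 2013 client-side rendering entry point; the rendering logic itself is purely illustrative):

```javascript
// Build a minimal view override: render each list item as a simple <div>
function buildViewOverride() {
  return {
    Templates: {
      View: function (ctx) {
        var html = [];
        for (var i = 0; i < ctx.ListData.Row.length; i++) {
          html.push("<div class='mjh-item'>" + ctx.ListData.Row[i].Title + "</div>");
        }
        return html.join("");
      }
    },
    // only apply to Announcements lists (list template ID 104)
    ListTemplateType: 104
  };
}

// Register only when running inside SharePoint's CSR pipeline
if (typeof SPClientTemplates !== "undefined") {
  SPClientTemplates.TemplateManager.RegisterTemplateOverrides(buildViewOverride());
}
```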


Now finally a word on Deployment Options and this is relatively straightforward.

Where should I put my files … ?

The first thing is that your files need to reside in SharePoint. I had a conversation with @MarkStokes about this just the other week and he was trying to load his JS files from Azure Storage (so that you could upgrade multiple O365 tenancies from a single file). This didn’t work, as the absolute URLs in the JSLink properties weren’t being picked up and the files weren’t loaded.

The solution was adding a lightweight JavaScript “script loader” .. basically just a short JS file which then dynamically loaded in the reference JS from Azure.
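A minimal version of such a loader might look like this (the Azure URL in the comment is a placeholder, and the document object is parameterised purely so the helper can be exercised outside a browser):

```javascript
// Inject a <script> tag pointing at the externally-hosted JS file.
// The file containing THIS function is what the JSLink property points at.
function loadRemoteScript(url, doc) {
  doc = doc || document;
  var tag = doc.createElement("script");
  tag.type = "text/javascript";
  tag.src = url;
  doc.head.appendChild(tag);
  return tag;
}

// In the real loader file you would call something like:
// loadRemoteScript("https://mystorage.blob.core.windows.net/assets/myOverrides.js");
```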

In terms of where THAT file lives, the Master Page Gallery or Style Library are obvious choices as they include automatic permissions for all users to have limited read access. The JSLink properties then allow you to reference dynamic ~sitecollection URLs so you can get a reference URL to them relatively easily.

How do I get them there … ?

Again, this is really standard SharePoint stuff. If you are already using a No-Code-Sandbox-Solution then simply using a Module to push the files in makes sense. If you are instead using a Remote-Code provisioning approach (either using a Provisioning App or PowerShell approach) then you will be using CSOM to push the files in.

The only thing to bear in mind is what other assets rely on those JS files. If your JS file provides rendering overrides for a Site Column then deploy the file in the same feature / provisioning logic that you are using to provision your Site Column. If you want the end users to be able to turn it on and off for a given site then a separate Web Scoped feature makes plenty of sense.

If you want to apply it carte blanche to all sites then a Custom Action to inject the JavaScript file using the ScriptLink element will allow you to push this onto every page without touching the master page, but be aware that this will also include plenty of back-end system pages (file dialogs, site settings, and so forth) so make sure you thoroughly test your code, and make it defensive enough that it doesn’t throw unexpected errors when the SharePoint libraries you expect to be there aren’t present.

If you are just mocking this up as an example then you can of course just upload the file manually, and for REALLY quick demos just copy-paste the JavaScript into a Script Editor Web Part!

November Events : Speaking at SPCON14 and SPSUK

Well, looks like November is going to be a busy month for me.

For starters I’m booked into the Combined Knowledge “App Venture” course with the legendary Ted Pattison (all about SP2013 Apps and I’m hoping to pick up some MVC, MVVM, Knockout and AngularJS goodness).

I’m also honoured to be invited back to speak at both SharePoint Connect 2014 (#SPCON14) and SharePoint Saturday UK (#SPSUK), where I’ll be talking about using JSLink and Display Templates to provide advanced list rendering such as Cascading Drop Downs, Custom UIs, validation checking and integrating Google Maps functionality.

SharePoint Connect Conference 2014 (18th and 19th November)

SharePoint Connect 2014 - I'll be there! - Martin Hatch

This is a fantastic 2 day conference being held over the 18th and 19th of November at the Meervaart Theatre in Amsterdam. It has been running for several years now and this will be the second time that I'll be speaking at this event, last year was a blast! It was even voted in an independent poll as the 3rd best SharePoint conference in the world!

The sessions are split up amongst the usual IT Pro, Dev and End User tracks, as well as the increasingly common Office 365 and Business Tracks. There are also sponsor sessions where you can find out about vendor products and tools related to the SharePoint world.

If you are interested in going but haven’t bought any tickets yet then you can still get a 10% Discount using the code SA238 when you sign up at:

SharePoint Saturday UK (29th November)

This is always on my calendar as soon as it is announced. It is a great event (as many of the "SharePoint Saturday" events are) and it is a genuine pleasure to both attend and speak at this conference, and best of all .. it's FREE!

It is held every year in the UK Midlands, a refreshing change from what are normally London oriented events, and this year (as it was last year) it will be held at the Hinckley Island Hotel (in Hinckley, of all places!).

Again there is a great spread of sessions among IT Pro, Dev, Information Worker and also Business and Social tracks (so I’m hoping to see some Yammer integration at some point). Despite being a free event there are some world class speakers in attendance and a varied set of sessions, so please register and hopefully I’ll see you there!

Customising the Content Search Web Part – Part 4 – Packaging & Deployment in Visual Studio

This is the final post in a series I have been writing on the Content by Search Web Part (aka CSWP).

  1. What you get in the box
  2. Custom Display Templates with JavaScript
  3. Going Old Skool with XSLT
  4. Packaging & Deployment in Visual Studio (this post)

So in this final post we will be looking at how your Content Search Web Parts can be deployed as WSP packages in Visual Studio.

The first thing you will need to decide is:

Do you deploy the HTML Designer File?


Or just the final JS file?

This was really triggered with a discussion I had with Chris O’Brien (@ChrisO_Brien) when we got talking about Content Search Web Parts and what the best approach was for deploying them.

In my opinion it really comes down to what environment you are deploying to, and whether the admin / power users will need to have access to a (much easier to modify) designer file.

Deploying the JS File

This is definitely the easiest approach as it doesn’t really involve anything too complicated. You would still start off with your HTML Designer file and deploy it to your DEV box, but you would then extract the compiled JS file and drop THAT file into Visual Studio.

You can then deploy it using a standard Module element.

<?xml version="1.0" encoding="utf-8"?>

<Elements xmlns="http://schemas.microsoft.com/sharepoint/">

  <Module Name="MJHDisplayTemplates" Url="_catalogs/masterpage/Display Templates/Content Web Parts" Path="MJH Display Templates" RootWebOnly="TRUE">

    <File Url="Item_MJHSummary.js" Type="GhostableInLibrary" />

  </Module>

</Elements>


Deploying the HTML Designer File

This is a little bit more tricky. The main problem is that the JS file is compiled “on the fly” by an SPItemEventReceiver in the Master Page and Page Layouts Gallery. Of course, event receivers do not get fired when a file is dropped in from a module from a feature, so you basically need to go and give SharePoint a prod to make it “do its thing”.

My approach is to use a Feature Receiver to “touch” the file (which prompts the event receiver to fire) so that your JS file is then compiled.

In order to make this more dynamic we will inject the Feature ID as a property of the SPFile which is actually provisioned by the module. Thankfully this is a relatively easy thing to achieve.

<?xml version="1.0" encoding="utf-8"?>

<Elements xmlns="http://schemas.microsoft.com/sharepoint/">

  <Module Name="MJHDisplayTemplates" Url="_catalogs/masterpage/Display Templates/Content Web Parts" Path="MJH Display Templates" RootWebOnly="TRUE">

    <File Url="Item_MJHSummary.html" Type="GhostableInLibrary">

      <Property Name="FeatureId" Value="$SharePoint.Feature.Id$" Type="string"/>

    </File>

  </Module>

</Elements>



The trick then is to have a Feature Receiver which looks for all of the files which have that property and modifies each file in some way (I just pull the file bytes and push them back again, basically uploading a duplicate copy of the file; just calling SPListItem.Update() or SPFile.Update() didn't seem to work!).

string featureId = properties.Feature.Definition.Id.ToString();
SPSite site = properties.Feature.Parent as SPSite;
SPWeb rootWeb = site.RootWeb;

SPFolder folder = rootWeb.GetFolder("_catalogs/masterpage/Display Templates/Content Web Parts");
SPList parentList = folder.ParentWeb.Lists[folder.ParentListId];

SPFileCollection files = folder.Files;
var templateFiles = from SPFile f in files
                    where String.Equals(f.Properties["FeatureId"] as string, featureId, StringComparison.InvariantCultureIgnoreCase)
                    select f;

// collect the file IDs first so we don't modify the collection while enumerating it
List<Guid> guidFilesToModify = new List<Guid>();
foreach (SPFile file in templateFiles)
{
    guidFilesToModify.Add(file.UniqueId);
}

foreach (Guid fileId in guidFilesToModify)
{
    // instantiate a new object to avoid modifying the collection during enumeration
    SPFile file = parentList.ParentWeb.GetFile(fileId);

    // get the file contents
    byte[] fileBytes = file.OpenBinary();

    // re-add the same file again, forcing the event receiver to fire
    folder.Files.Add(file.Name, fileBytes, true);
}
So in the above code sample (which sits inside a "Feature Activated" method) we retrieve the Feature ID for the feature which is activating. We then proceed to the folder where we provisioned our files and run a simple query to pull out those files which have the feature ID in their properties (which we set in our Module above).

We then pull the binary data of the file as a Byte Array, and then push exactly the same file back into the folder (which triggers the event receiver to fire).

And that should be all you need!

Customising the Content Search Web Part – Part 3 – Going old Skool with XSLT

This is the third post in a series I will be writing on the Content by Search Web Part (aka CSWP).

  1. What you get in the box
  2. Custom Display Templates with JavaScript
  3. Going Old Skool with XSLT (this post)
  4. Packaging & Deployment in Visual Studio

Now I am admittedly going to cop out here. I was originally intending to write this up in detail but to be honest it has already been done (very well) before.

So .. I would invite you to read the most excellent blog post from Waldek Mastykarz (@waldekm).

Using server-side rendering with Content Search Web Part in SharePoint 2013

He not only shows you how to hook up your JS Display Template with a server-side XSL file, but also shows you how to deploy the files and its primary usage (as a Search Crawler output).

If you want to use this method for all of your web requests then you can set the AlwaysRenderOnServer property of the CSWP to “true” and it will always use your XSL template file.


Some real photos from 6 Months of using a 41MP Nokia Lumia 1020

I have long been an advocate of both the Windows Phone operating system and Nokia Lumia phones, and one thing that I always look for in a new smartphone is a decent camera! "The best camera is the one you have with you" is a saying that rings strongly with me, and I frequently find myself out at kids' parties, in the park or just on family days out without my bulky DSLR (which is awesome, but due to its size I just can't slip it in my pocket).

The Lumia 1020 therefore grabbed my attention (and media headlines) when it was announced with a 41MP camera! Now, this isn't some marketing gimmick or crazy "let's cram more MP in" kind of publicity stunt. It is coupled with some fancy Nokia tech (from their "PureView" camera team) which uses a combination of pixel oversampling and post-processing to produce fantastic images (the final images typically being around 34MP, plus a 5MP version for sharing on social networking sites).

I won’t go into too much detail about the technical details here but you can certainly read more about it here, here and here.

So .. the purpose of this article is to share some of the photos I have taken myself in the 6 months or so of owning and using a Lumia 1020 in everyday life. I've specifically picked ones which don't contain people (other than myself) and are pretty harmless (no photos of my kids or family just yet ;)).


SharePoint 2013 Reference Sheet – Services on Server

This post will provide a description of each of the SharePoint Services in the “Services on Server” section of Central Administration, describing what it is for and anything you need to look out for.

This was borne out of the frustration of checking client environments and consistently finding services running which weren't being used (and were never going to be!).

Note that some of these also have a corresponding Service Application that you will need to create in order to use them.

  • Access Database Service 2010 – Enables SharePoint 2010 Access Services functionality.
  • Access Services – Enables SharePoint 2013 Access Services functionality. Required for "Access Apps".
  • App Management Service – Manages SharePoint App licenses and permissions. Required for Apps to work.
  • Application Discovery and Load Balancer Service – Determines which server to send Service Application requests to; this is how SharePoint automatically balances load. A fundamental SharePoint service, defaulting to "Round Robin". Can be extended with custom load balancing code if you are brave enough!
  • Business Data Connectivity Service – Enables BCS, which provides External Content Types and External Lists. Required if you want to sync external LOB systems with User Profiles.
  • Central Administration – Hosts the Central Admin Web Application. The default URL is set to the server name of the first server SharePoint is installed on. If you want to run this on multiple servers you should consider Alternate Access Mappings with a DNS entry.
  • Claims to Windows Token Service (C2WTS) – Used to convert SharePoint Claims back into Windows Tokens for Kerberos delegation. Required for Kerberos when used with BI tools. Requires some manual configuration steps.
  • Distributed Cache – Heavily used throughout SharePoint. Requires ICMP ports open between SharePoint servers, and has numerous gotchas.
  • Document Conversions Launcher Service – Enables an extension point to configure conversion from one document format to another.
  • Document Conversions Load Balancer Service – Balances document conversion requests across servers running the Launcher Service.
  • Excel Calculation Services – Enables the Excel Services BI functionality. Significant RAM overhead (32GB recommended). If you want to use "PowerView" in Excel then you also need to configure the SQL PowerPivot add-on. Kerberos will require C2WTS.
  • Lotus Notes Connector – Allows you to connect Search to a Lotus Notes database to enable indexing and crawling of Lotus content.
  • Machine Translation Service – Provides an API for developers to submit content to be translated into another language.
  • Managed Metadata Web Service – Enables "Managed Metadata" taxonomies and term sets. Required for the default navigation settings in SP2013 (navigation is stored in the Term Store) and required for User Profiles.
  • Microsoft SharePoint Foundation Incoming E-Mail – Enables inbound emails to be stored in Document Libraries. Requires significant configuration, including AD and DNS settings.
  • Microsoft SharePoint Foundation Outgoing E-Mail – Enables outbound emails from the server. Needs to be configured in Central Admin.
  • Microsoft SharePoint Foundation Sandboxed Code Service – Allows Sandbox Solutions to be used and executed. Required for SharePoint Hosted Apps, Design Manager and "Save as Template" functions (SharePoint Hosted Apps use the Sandboxed Code Service to provision features and content in the "App Web").
  • Microsoft SharePoint Foundation Subscription Settings Service – Manages subscriptions between Sites and Apps. Required for Apps to work.
  • Microsoft SharePoint Foundation Web Application – Hosts the content Web Applications. If this is enabled, all of the Web Applications (except Central Admin) will be deployed to the server. Also determines which servers a Web Application targeted WSP is deployed to.
  • Microsoft SharePoint Foundation Workflow Timer Service – Runs any SharePoint 2010 style workflows.
  • PerformancePoint Service – Runs the PerformancePoint BI component. Significant RAM overhead (32GB recommended). Kerberos will require C2WTS.
  • PowerPoint Conversion Service – Provides an API for developers to convert PowerPoint presentations to various formats (e.g. PPTX / PDF / JPG / PNG).
  • Request Management – Allows custom routing rules for incoming requests, e.g. to route traffic for a specific site to a specific server. If you turn this on without defining any routing configurations then everything breaks!
  • Search Administration Web Service – Runs on servers which host search components. This service is automatically started on all servers that run search topology components.
  • Search Host Controller Service – Runs on servers which host search components. This service is automatically started on all servers that run search topology components.
  • Search Query and Site Settings Service – Runs on servers which host the query processing component. This service is automatically started on all servers that run search topology components.
  • Secure Store Service – Allows storage of credentials and other secure information. The database is encrypted using a "Master Key" configured when the service is set up. Required for the BI Unattended Service Account configuration.
  • SharePoint Server Search – Crawls content for the search index. This service is automatically started on all servers that run search topology components. Note – it cannot be stopped or started from the Services on Server page.
  • User Profile Service – Manages user profiles, the creation of My Sites and the SharePoint social features (and associated permissions and properties). Required for "High Trust" Provider Hosted Apps to work.
  • User Profile Synchronization Service – Used to synchronise data from AD (and other data sources) into the Profile Database. Not necessarily required for User Profiles! If you are just using the "Directory Import" option then this service is not required.
  • Visio Graphics Service – Allows Visio diagrams to be deployed to SharePoint so they can be displayed in the browser.
  • Word Automation Services – Provides an API for developers to convert Word documents to various formats (e.g. DOCX / PDF).
  • Work Management Service – Allows task aggregation, in particular bringing together tasks from Exchange, Project Server and SharePoint. Requires Search and My Sites; tasks are stored in a hidden list in the user's My Site.

So that gives you a run through of the SharePoint 2013 Services, and hopefully an indication of whether you should have them running or not!

Any additional suggestions, comments or errors you’ve spotted please let me know and I’ll try and keep this updated!

Note – this list only includes the vanilla SharePoint 2013 services and does not include services added through other installs such as SQL Server (PowerView, Reporting Services, PowerPivot).

For more detail, including some great technical detail you can also check this TechNet article out:

Plan service deployment in SharePoint 2013

Windows 8, Hyper-V, BitLocker and “Cannot connect to virtual machine configuration storage”

So I am now working at a new professional services company in South East England (Ballard Chalmers) who use Hyper-V throughout their DEV / TEST environments. I have previously been a VMWare Workstation person myself (and I still think the simplicity and ease of the user interface is unmatched) but for the foreseeable time I will be running Windows 8.1 Pro on my laptop as a Hyper-V host.

Before we get started it is worth describing my setup:

  • Windows 8.1 Pro
  • 3rd Gen Intel i7-3820QM CPU
  • 32GB DDR3 RAM
  • Two physical disk drives
    • C:\ SYSTEM – 512GB SSD (for Operating System, Files and Applications)
    • D:\ DATA – 512GB SSD (for Hyper-V Images and MSDN ISOs) (running in an “Ultra-Bay” where the Optical Drive used to be)

Now like most modern laptops I have a TPM (Trusted Platform Module) on my machine so I also have BitLocker encryption running on both my C: and D: drives (for those who are interested I barely notice any performance drop at all .. and I can still get 550 MB/s sustained read even with BitLocker enabled).

Saved-Critical – Cannot connect to virtual machine configuration storage

Now I noticed from time to time that my Virtual Machines were showing error messages when my computer started up. I noticed it here and there until Thomas Vochten (@ThomasVochten) also mentioned he was getting it every time he started his machine up.

Hyper-V Error

Note – You can get this error for all sorts of reasons, particularly if you have recently changed the Drive Letters, re-partitioned your hard disks or moved a VM. In this case I was getting the error without doing anything other than turning my laptop on!


64GB of RAM in a Laptop, and why I want it …

Well, the rumour mills have been well and truly circulating recently about the potential for high capacity DRAM chips which could allow laptops to have up to 64GB of memory. I was recently directed to this article from the ArsTechnica forums.

This article basically describes a new method of DRAM stacking (as opposed to the standard method of NAND stacking) which allows the production of 16GB SODIMMs. My current laptop has four SODIMM slots (like pretty much every other high-end laptop on the market), so with the current maximum of 8GB per SODIMM my laptop supports 32GB of RAM. If I could use 16GB SODIMMs then I could theoretically swap those chips out for a straight 4x 16GB (i.e. 64GB of RAM).

The best news is that these chips could be on the market this year!

“Mass production is set to begin in March and April, with initial pricing per 16GB module in the $320-$350 range for both DIMM and SO-DIMM, ECC being on the higher end of that range.” (source: Anandtech article linked above)

