10 signs you are an Office 365 Consultant!

Over the last few weeks my MVP colleague Luise Freese and I joined up to create a sway about “10 signs you are an Office 365 Consultant.” Hope you enjoy the read!

You can find the sway here


Activate Hyper-V Feature on Windows 10 using PowerShell DSC

Recently I got a new laptop, and I am planning to configure the laptop’s OS in the best way possible.

As I’m a big fan of PowerShell DSC, I want to use it for this configuration. I have used PowerShell DSC in many customer engagements with Windows Server OS. Sadly, the switch to Windows 10 was not as smooth as expected.

PowerShell Execution Policy

The very first and major blocking error was the PowerShell Execution Policy. By default, the PowerShell Execution Policy on Windows 10 is set to RemoteSigned. Not every DSC module is remotely signed. A possible solution is not pretty: setting the execution policy to a lower level of security.
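To illustrate the workaround (assuming that lowering the policy is acceptable in your environment), the policy can be inspected and relaxed like this:

```powershell
# List the effective execution policies per scope
Get-ExecutionPolicy -List

# Relax the policy machine-wide (a security trade-off, not a recommendation)
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope LocalMachine -Force
```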

BTW: This execution policy also blocks the domain join of a Windows 10 VM in Azure (see: Error: “Resource”.psm1 is not digitally signed)

WindowsFeature Resource

As part of my configuration I wanted to set up Hyper-V. Windows features can be configured using the WindowsFeature resource. There is just one limitation that causes an error at runtime: the WindowsFeature resource requires functions that are only available on Windows Server OS. While the configuration runs, the following error message appears:

“Installing roles and features using PowerShell Desired State Configuration is supported only on Server SKU’s. It is not supported on Client SKU.”

To get around this issue I created the following Script Resource:
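The original snippet is no longer embedded here; a sketch of such a Script resource, using the client-SKU cmdlets Get-WindowsOptionalFeature and Enable-WindowsOptionalFeature, could look like this:

```powershell
Configuration HyperVOnClient
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost'
    {
        # Script resource as a workaround for the WindowsFeature limitation on client SKUs
        Script HyperV
        {
            GetScript  = {
                @{ Result = (Get-WindowsOptionalFeature -Online -FeatureName 'Microsoft-Hyper-V-All').State }
            }
            TestScript = {
                (Get-WindowsOptionalFeature -Online -FeatureName 'Microsoft-Hyper-V-All').State -eq 'Enabled'
            }
            SetScript  = {
                Enable-WindowsOptionalFeature -Online -FeatureName 'Microsoft-Hyper-V-All' -NoRestart
            }
        }
    }
}
```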


If you have never heard of Chocolatey, head over to their website: The package manager for Windows. Chocolatey is a package manager for Windows and allows you to install software from its repository very easily. They even provide a DSC resource to automate this process further.

Implementing the cChocoInstaller comes with a small hurdle: make sure the InstallDir exists before using the cChocoInstaller resource. I use a File resource as a dependency of Chocolatey to avoid this glitch.
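A minimal sketch of this pattern (resource names as used in the cChoco module; the install directory is an example value):

```powershell
Configuration ChocolateyBootstrap
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName cChoco

    Node 'localhost'
    {
        # Create the install directory first ...
        File ChocoInstallDir
        {
            Ensure          = 'Present'
            Type            = 'Directory'
            DestinationPath = 'C:\choco'
        }

        # ... and only then run the installer
        cChocoInstaller InstallChoco
        {
            InstallDir = 'C:\choco'
            DependsOn  = '[File]ChocoInstallDir'
        }
    }
}
```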



How to filter for SharePoint Framework #SPFx related tweets in TweetDeck

Over the last couple of days there were some tweets related to the beloved #SPFx keyword. Stefan Bauer shared something related to SharePoint Framework and got a new follower 😛

That’s one side of the coin when using an abbreviation for SharePoint Framework that is also used by special effects and make-up folks.

The other side of the coin is: how do I filter in TweetDeck for tweets that are related to SharePoint and not to make-up or something else?

Please find below my current filter for TweetDeck. Please do not hesitate to suggest changes to the filter for tweets related to SharePoint Framework.

The current filter can be found in this GitHub repository: https://github.com/andikrueger/TweetDeckSPFxFilter

The first version of the filter is stored in this gist:

Using Desired State Configuration (DSC) for SaaS – Office365DSC thoughts

PowerShell Desired State Configuration (DSC) enhances the setup experience of new environments like no other technique before. There are around 270 different DSC resources available that provide methods to configure Windows Server components and software such as:

  • SharePoint Server
  • SQL Server
  • Active Directory
  • IIS

I’m using DSC very frequently. It gives me great advantages over my previous PowerShell scripts. I can use DSC in combination with Azure DSC to configure my systems and track their status. Configuration drift can only happen in parts that are not covered by my configuration.

The part that is missing: PowerShell DSC is a per-node technology. Every server that I want to configure must have a Local Configuration Manager (LCM) that is responsible for applying my configuration, either as the local system account or as a configurable account.

Fast forward: currently, more and more people are switching to Software as a Service (SaaS) offerings like Office 365. Office 365 offers many configuration options, but there is no LCM available that could handle the configuration.

Let’s speak about a possible Office 365 DSC resource.

This resource should be responsible for administering Office 365 in a DSC way. I see the following options on how to apply a configuration to Office 365:

  1. Create a configuration and apply it on node “localhost”. This means we use the current computer and its LCM to apply a configuration to a SaaS.
    This looks like an “ok” solution. The configuration depends on the local system, yet nothing in the configuration gets applied to the local system. This feels odd.
  2. Azure Runbooks offer a way to run a PowerShell script, reuse PowerShell modules, and schedule the script.
    Compared to a local PowerShell and LCM, Azure Runbooks could be the way to go. A SaaS configuring another SaaS feels kind of perfect – but it won’t work, as many cmdlets depend on .NET components that can’t be loaded into Azure Runbooks, and this approach is not really DSC.

Maybe there are other options available, that I can’t think of. If you have a suggestion, feel free to leave a comment.

Currently, most resources focus on products and functions that live on a Windows Server system – the configuration is specific to a node.

Thoughts about an Office365DSC Resource

Many customers are now starting to use Office 365. In Germany, customers are very aware of their data and sometimes spend a good amount of time defining a governance for Office 365.

With an option to configure Office 365 with DSC, they could gain a lot of comfort and a better overview of what is configured and how. Working with test tenants would be very easy, as you could replicate your production settings to your test tenant easily – even though there is no Office 365 DSC resource available yet.

What are the current options to script your Office 365 administration?

Why not just script everything – why use DSC instead?

DSC is about configuration management. If I want to update any setting in the Office 365 admin centre, a DSC configuration seems to be the best option.

A scripting guy would load the PowerShell module, use the CLI, or open the admin portal to change a setting with a function call like:

IWantToSetThis-Function -Something This

With DSC, something else would happen:

Something {
    State = "This"
}
DSC would try to get the current setting for “Something”, compare the parameter “State” to “This”, and only if they differ, call the function above.
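The Get/Test/Set flow described above, as a minimal sketch of a DSC resource module (the Get-SomethingSetting and Set-SomethingSetting calls are hypothetical placeholders for whatever service cmdlets apply):

```powershell
function Get-TargetResource
{
    param ([Parameter(Mandatory)] [string] $State)

    # Query the current value from the service (hypothetical call)
    return @{ State = Get-SomethingSetting }
}

function Test-TargetResource
{
    param ([Parameter(Mandatory)] [string] $State)

    # Compare the desired state with the current state
    return (Get-TargetResource -State $State).State -eq $State
}

function Set-TargetResource
{
    param ([Parameter(Mandatory)] [string] $State)

    # Only called when Test-TargetResource returned $false
    Set-SomethingSetting -Value $State
}
```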

Is DSC only a better approach to scripting?

I’m a big fan of the DSC approach. In the end it’s just PowerShell, but in a better structured manner. Having a predefined set of Get-, Set- and Test- functions (and, with Office365DSC, an Export function) makes it easy to reuse the functionality provided.

Where can I find Office365DSC?

You can find the repository of Office365DSC on GitHub: Microsoft/Office365DSC

The struggle with configuration data in DSC configurations – SQL Alias and reusability

Today I was very happy to find a neat solution to handle configuration data for a DSC configuration. I was facing the following challenge:

In a SharePoint DSC Configuration I want to reuse several SQL Aliases that are created during run time of the DSC configuration based on the configuration data.

In recent DSC setups my configuration looked like this:

and I created the SQL Alias with the following lines of code:
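The embedded gists are no longer available; as a hedged reconstruction (using the SqlAlias resource from SqlServerDsc and example server names), the array-based configuration data and the alias creation could have looked roughly like this:

```powershell
$ConfigurationData = @{
    AllNodes    = @( @{ NodeName = 'localhost' } )
    NonNodeData = @{
        # An array of alias definitions (example values)
        SQLAliases = @(
            @{ Name = 'SPCONTENT'; Server = 'SQL01\SHAREPOINT' }
            @{ Name = 'SPSEARCH';  Server = 'SQL02\SHAREPOINT' }
        )
    }
}

Configuration SqlAliases
{
    Import-DscResource -ModuleName SqlServerDsc

    Node $AllNodes.NodeName
    {
        # One SqlAlias resource per entry in the configuration data
        foreach ($alias in $ConfigurationData.NonNodeData.SQLAliases)
        {
            SqlAlias $alias.Name
            {
                Ensure     = 'Present'
                Name       = $alias.Name
                Protocol   = 'TCP'
                ServerName = $alias.Server
                TcpPort    = 1433
            }
        }
    }
}
```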

So far there was no struggle at all. Creating a SQL alias with DSC is very straightforward, even if you need to create more than one.

The struggle got real the moment I had to reuse the alias name in SharePoint. How do I properly access the alias? Do I iterate over all aliases again and filter, or do I hard-code the alias name, or…? None of these felt right.

My solution is pretty simple: why not change the array to a hash table? A hash table allows you to access the data more easily. 🙂

Final challenge: how do I iterate over a hash table? A hash table object has two properties: keys and values:
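A short sketch of the hash table variant and the iteration over its keys (names and values are examples):

```powershell
$SQLAliases = @{
    SPCONTENT = 'SQL01\SHAREPOINT'
    SPSEARCH  = 'SQL02\SHAREPOINT'
}

# Keys holds the alias names; the value is looked up per key
foreach ($aliasName in $SQLAliases.Keys)
{
    Write-Verbose -Message "$aliasName -> $($SQLAliases[$aliasName])" -Verbose
}
```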

So what changed in my SharePointDsc configuration part? Now I can address my SQL alias properly without any trouble:
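For example (resource choice and values are illustrative, assuming the alias hash table lives in the configuration data), a SharePointDsc resource can now reference the alias directly by key:

```powershell
SPContentDatabase ContentDb
{
    Ensure         = 'Present'
    Name           = 'SP_Content_01'
    WebAppUrl      = 'http://sharepoint.contoso.com'
    # Address the alias by key instead of iterating or hard-coding
    DatabaseServer = $ConfigurationData.NonNodeData.SQLAliases['SPCONTENT']
}
```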

SharePoint 2016 MinRoles – Behind the scenes and where the C2WTS should be provisioned

Have you ever wondered how SharePoint 2016 MinRoles are working under the hood? Everything is about the ShouldProvision method of the SPServiceInstance class.

This issue is still under investigation – currently I think it occurs when SharePoint is installed, followed by a language pack, and then the farm is created. Checking the compliance status of the MinRole at this particular time will result in a non-compliant state.

With SharePoint 2016, Microsoft introduced a new feature called “MinRoles”. MinRoles offer a new way to create SharePoint topologies. The following MinRoles are available:

  • Front-End
    A role made for all loads in the context of serving SharePoint
  • Application
    This role is optimized for all services that need to run in a SharePoint farm – without the Search loads
  • Distributed Cache
    Hosts the Distributed Cache service
  • Search
    All services associated with the search load of SharePoint
  • Custom
    This role is needed if you plan on using Business Intelligence loads, as they are not part of the other MinRoles
  • Single-Server farm

In October 2016, Microsoft released Feature Pack 1 (FP1) for SharePoint 2016. FP1 offers two new MinRoles (MinRolesV2):

  • Front-end with Distributed Cache
  • Application with Search

These two roles combine the prior roles so that customers can create highly available (HA) SharePoint farms with fewer servers. Prior to FP1 you needed at least two servers per role: 2 WFE, 2 Application, 2 Search, 2 Distributed Cache – 8 servers in total. After installing FP1 you can switch to the combined MinRoles and create a SharePoint HA farm with 4 servers instead. This offering focuses on SharePoint customers with HA requirements but not enough workload to justify hosting 8 SharePoint servers.

With each MinRole, a defined set of services can run on a server. SharePoint 2016 enforces the state of these services. For a complete list of all services that belong to a role, visit the TechNet documentation.

Behind the scenes

New “Services in Farm” experience in Central Administration

The following screenshot shows the “Services in Farm” page of SharePoint 2016 Central Administration:

Services on Server on SharePoint 2016

As soon as any MinRole is defined, the Services on Server page shows the selected server’s role and, for each service, its status and compliance state. SharePoint offers you some options:

  • you can stop a service
  • you can fix a non-compliant state of a service with one click

Where does this MinRole compliance information come from?

In a recent customer engagement I stumbled across something that bothered me. The customer’s SharePoint farm uses MinRolesV2, and I checked the Microsoft documentation for which services are allowed on each server. After a while I had a service in a non-compliant state and did not know why. I reached out to the community, but did not end up with a definite answer:

Today I spent some time with the SharePoint source code and found the answer:

SharePoint service instances are represented by the SPServiceInstance class. Every SharePoint service instance (e.g. SPWindowsTokenServiceInstance) derives from this class and overrides the following method:

 // Microsoft.SharePoint.Administration.SPServiceInstance
 public virtual bool ShouldProvision(SPServerRole serverRole)

There is an additional internal method, ShouldProvisionInternal, which performs additional tests for the following roles:

  • SPServerRole.WebFrontEndWithDistributedCache
  • SPServerRole.ApplicationWithSearch

When your server has one of the roles above, the ShouldProvision method is called with both single roles and the combined role. If any of these tests returns true, the role is compliant.

Not knowing whether the documentation or the code is wrong, I investigated the SPWindowsTokenServiceInstance code.

 // Microsoft.SharePoint.Administration.Claims.SPWindowsTokenServiceInstance
 public override bool ShouldProvision(SPServerRole serverRole)
 {
     return SPServerRole.SingleServerFarm == serverRole
         || SPServerRole.Application == serverRole
         || SPServerRole.WebFrontEnd == serverRole
         || (SPServerRole.DistributedCache == serverRole && !SPServerRoleManager.IsMinRoleV2Enabled())
         || (SPServerRole.Search == serverRole && !SPServerRoleManager.IsMinRoleV2Enabled())
         || SPServerRole.Custom == serverRole;
 }

The implementation above does not include tests for the MinRolesV2 – but the non-V2 roles are covered through the ShouldProvisionInternal method anyway. Following this, there is no error in the code or the documentation. This applies to the Microsoft.SharePoint.dll in version 16.0.4561.1000.

So what is wrong with my farm (January 2018 CU)?

 // Microsoft.SharePoint.Administration.Claims.SPWindowsTokenServiceInstance
 public override bool ShouldProvision(SPServerRole serverRole)
 {
     if (SPServerRole.SingleServerFarm != serverRole
         && SPServerRole.Application != serverRole
         && SPServerRole.WebFrontEnd != serverRole
         && (SPServerRole.DistributedCache != serverRole || SPServerRoleManager.IsMinRoleV2Enabled())
         && (SPServerRole.Search != serverRole || SPServerRoleManager.IsMinRoleV2Enabled()))
     {
         return SPServerRole.Custom == serverRole;
     }
     return true;
 }

The code above is taken from a Microsoft.SharePoint.dll in file version 16.0.4639.1002.

This means the current documentation does not reflect the code properly. If I read the code correctly, the only role allowed to host the C2WTS with MinRoleV2 is the Custom role.

Conclusion: there is a drift in the documentation. If I’m not mistaken, I do not need the C2WTS in a non-BI-enabled farm. A BI-enabled farm needs a “Custom” MinRole server to run Reporting Services and other loads. This answers why the C2WTS is no longer allowed in any other role. Maybe someone should update the documentation…

Update 2018-03-23:

I double-checked the code of the ShouldProvisionInternal and ShouldProvision methods and strongly believe that something is wrong in how the MinRole compliance is determined. Find below a screenshot of what I think the return values should be, but somehow are not!

Expected return values for the ShouldProvisionInternal method

Following this code, the MinRoleV2 ApplicationWithSearch should be able to run the Claims to Windows Token Service.

I really hope someone can help me figure this out. Is there any part I am not reading correctly?

SharePoint 2016 – Central Administration – ScriptResource.axd 404 error

Recently my colleague and I came across a SharePoint 2016 farm that had some issues with the suite bar not loading. In the network trace we found a 404 error for the ScriptResource.axd request. Using the event viewer, we came across the following error message:

Event code: 3012
Event message: An error occurred processing a web or script resource request. The requested resource 'ZSystem.Web.Extensions,,,31bf3856ad364e35|MicrosoftAjax.js|' does not exist or there was a problem loading it. 
Event time: 21.03.2018 09:57:57
Event time (UTC): 21.03.2018 08:57:57
Event ID: 59637ef8b92b43b59cadfd4b3ce61619 Event sequence: 7 Event occurrence: 1 Event detail code: 0 
Application information: 
 Application domain: /LM/W3SVC/275657850/ROOT-1-131660962662183250 
 Trust level: Full 
 Application Virtual Path: / 
 Application Path: C:\inetpub\wwwroot\wss\VirtualDirectories\4432\ 
 Machine name: SERVERNAME 
Process information: 
 Process ID: 6916 
 Process name: w3wp.exe 
 Account name: ***\***** 
Exception information: 
 Exception type: ZLibException 
 Exception message: The underlying compression routine could not be loaded correctly.
 at System.IO.Compression.DeflaterZLib.DeflateInit(CompressionLevel compressionLevel, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)
 at System.IO.Compression.DeflaterZLib..ctor(CompressionLevel compressionLevel)
 at System.IO.Compression.DeflateStream.CreateDeflater(Nullable`1 compressionLevel)
 at System.IO.Compression.DeflateStream..ctor(Stream stream, CompressionMode mode, Boolean leaveOpen)
 at System.IO.Compression.GZipStream..ctor(Stream stream, CompressionMode mode)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequestInternal(HttpResponseBase response, String decryptedString, VirtualFileReader fileReader)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequest(HttpContextBase context, VirtualFileReader fileReader, Action`2 logAction, Boolean validatePath)

The type initializer for 'NativeZLibDLLStub' threw an exception.
 at System.IO.Compression.ZLibNative.ZLibStreamHandle.DeflateInit2_(CompressionLevel level, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)
 at System.IO.Compression.DeflaterZLib.DeflateInit(CompressionLevel compressionLevel, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)

Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))
 at System.Runtime.InteropServices.Marshal.ThrowExceptionForHRInternal(Int32 errorCode, IntPtr errorInfo)
 at System.Runtime.InteropServices.Marshal.ThrowExceptionForHR(Int32 errorCode, IntPtr errorInfo)
 at System.IO.Compression.ZLibNative.ZLibStreamHandle.NativeZLibDLLStub.LoadZLibDLL()
 at System.IO.Compression.ZLibNative.ZLibStreamHandle.NativeZLibDLLStub..cctor()

Request information: 
 Request URL: http://SERVERNAME:10000/ScriptResource.axd?d=8QMyovunFMoY381OpYQyz9XcLkCjJ_XuifoGDX5Q8vatF2gNXkCNXfW0-6Dz_mvZVHapsY1FH-l9zy2l3q3V1Z5dqWndvspmAcN7L1hB-UomBeamTvhHGIe5ZcW_f-DqtS3ymyGl5Bk-ybE1j6mAb8ZMhM8n4e_WwQEe8IkCMRsq_DHxVXNP720ZZXWT1ATa0&t=ffffffffad4b7194 
 Request path: /ScriptResource.axd 
 User host address: ***SERVERNAME 
 Is authenticated: False 
 Authentication Type: 
 Thread account name: NT AUTHORITY\IUSR 
Thread information: 
 Thread ID: 20 
 Thread account name: NT AUTHORITY\IUSR 
 Is impersonating: False 
 Stack trace: at System.IO.Compression.DeflaterZLib.DeflateInit(CompressionLevel compressionLevel, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)
 at System.IO.Compression.DeflaterZLib..ctor(CompressionLevel compressionLevel)
 at System.IO.Compression.DeflateStream.CreateDeflater(Nullable`1 compressionLevel)
 at System.IO.Compression.DeflateStream..ctor(Stream stream, CompressionMode mode, Boolean leaveOpen)
 at System.IO.Compression.GZipStream..ctor(Stream stream, CompressionMode mode)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequestInternal(HttpResponseBase response, String decryptedString, VirtualFileReader fileReader)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequest(HttpContextBase context, VirtualFileReader fileReader, Action`2 logAction, Boolean validatePath)

After several hours of a nerve-wracking search for a solution, we came across this blog post by Mike Lee. He saw the same issue and found a solution. In the end we changed some additional GPO settings (“Replace a process level token”) for our application pool accounts.

At first we were searching for FIPS settings, as many results on Google point in that direction. We:

  • Disabled FIPS in the registry
  • Changed the GPO that was applied to this server
  • Tested the GZip compression using a console application

These settings did not help. Thanks to Mike Lee we were able to resolve this issue.