TeamsDnA user group

On July 3rd the new Microsoft Teams DnA (Development and Adoption) user group started in Munich. You can find the meetup page over here: Microsoft Teams DnA Development

The Teams DnA usergroup is organized by

What are the Teams DnA topics?

Over the last few meetings, we focused on:

  • presentations on “Microsoft Teams as a development platform”
  • the expectations of the participants of this user group

Our goal is to highlight Microsoft Teams as a platform for line-of-business applications for users, administrators and developers. The different possibilities will be explained with examples. During the following meetups, there will be a short presentation on a focus topic.

In addition to these focus presentations, this user group has another goal: to build an application for Teams that is developed by the community. This approach provides insights into how to develop an app for Teams and how to leverage several technologies (Azure, Office 365) and methodologies (e.g. DevOps) at the same time.

What’s the TeamsDnA menu-butler application?

During the first meetup in July the user group did a bit of brainstorming: what could a possible Microsoft Teams application look like? The brainstorming was done with the following premise: what is an application that every company needs and whose use case every user is familiar with?

Menu-Butler was born.

Menu-Butler is your personal lunch assistant. In almost every intranet project, one of the most common requirements is easy retrieval of the daily and weekly menu for the user’s respective location.

Menu-Butler is a personal assistant that integrates seamlessly into Office 365 and Microsoft Teams and helps to solve this requirement.

You can find more information over here: teamsdna/menu-butler

How can I join the Teams DnA usergroup?

After two onsite meetings in Munich, we switched to a Microsoft Teams meeting, as there is growing interest in this user group from people outside of Munich.

The easiest way is to join the meetup group. We will post the Teams link in each meeting description.

 

10 signs you are an Office 365 Consultant!

Over the last few weeks my MVP colleague Luise Freese and I joined up to create a Sway about “10 signs you are an Office 365 Consultant.” Hope you enjoy the read!

You can find the Sway here.

Activate Hyper-V Feature on Windows 10 using PowerShell DSC

Recently I got a new laptop, and I am planning to configure the laptop’s OS in the best way possible.

As I’m a big fan of PowerShell DSC, I want to use it for this configuration. I have used PowerShell DSC in many customer engagements on Windows Server. Sadly, the switch to Windows 10 was not as smooth as expected.

PowerShell Execution Policy

The very first and major blocking error was the PowerShell execution policy. By default, the execution policy on Windows 10 is set to RemoteSigned, and not every DSC module is signed. A possible workaround is not pretty: setting the execution policy to a lower level of security.
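
A minimal sketch of that workaround; “Bypass” and the machine-wide scope are just examples, so pick whatever matches your security requirements:

# Not pretty: lower the execution policy so unsigned DSC modules can be loaded.
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope LocalMachine -Force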

BTW: This execution policy also blocks the domain join of a Windows 10 VM in Azure (see: Error: “Resource”.psm1 is not digitally signed).

WindowsFeature Resource

As part of my configuration I wanted to set up Hyper-V. Windows features can be configured using the WindowsFeature resource. There is just one limitation that causes an error at runtime: the WindowsFeature resource requires functions that are only available on Windows Server. When the configuration runs, you get the following error message:

“Installing roles and features using PowerShell Desired State Configuration is supported only on Server SKU’s. It is not supported on Client SKU.”

To get around this issue I created the following Script Resource:

Script Hyper-V
{
    GetScript = {
        Write-Verbose "Get current status for Microsoft-Hyper-V Feature"
        $hyperVFeatureState = Get-WindowsOptionalFeature -FeatureName Microsoft-Hyper-V -Online -ErrorAction SilentlyContinue
        return @{
            Result = $hyperVFeatureState
        }
    }
    SetScript = {
        Write-Verbose "Activating Microsoft-Hyper-V Feature"
        Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All
    }
    TestScript = {
        Write-Verbose "Test the current status for Microsoft-Hyper-V Feature"
        $hyperVFeatureState = & ([ScriptBlock]::Create($GetScript))
        Write-Verbose "Current status is: $($hyperVFeatureState.Result.State)"
        return $hyperVFeatureState.Result.State -eq 'Enabled'
    }
}

Chocolatey

If you have never heard of Chocolatey, head over to their website: The package manager for Windows. Chocolatey is a package manager for Windows that allows you to install software from its repository very easily. They even provide a DSC resource to automate this process even further.

Implementing the cChocoInstaller resource comes with a small hurdle: make sure the InstallDir exists before the cChocoInstaller resource runs. I use a File resource as a dependency of the Chocolatey installation to avoid this glitch; a sketch follows below.
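
A minimal sketch of that pattern, assuming the cChoco module is available and C:\Chocolatey is the desired install directory (both are just examples):

Configuration ChocolateyBase
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration
    Import-DscResource -ModuleName cChoco

    Node "localhost"
    {
        # Create the install directory first...
        File ChocoInstallDir
        {
            DestinationPath = "C:\Chocolatey"
            Type            = "Directory"
            Ensure          = "Present"
        }

        # ...and let the installer depend on it.
        cChocoInstaller InstallChoco
        {
            InstallDir = "C:\Chocolatey"
            DependsOn  = "[File]ChocoInstallDir"
        }
    }
}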


How to filter for SharePoint Framework #SPFx related tweets in TweetDeck

Over the last couple of days there were some tweets related to the beloved #SPFx keyword. Stefan Bauer shared something related to SharePoint Framework and got a new follower 😛

That’s one side of the coin of using an abbreviation for SharePoint Framework that is also used by special effects and make-up folks.

The other side of the coin is: how do I filter in TweetDeck for tweets that are related to SharePoint and not to makeup or something else?

Please find below my current filter for TweetDeck. Please do not hesitate to suggest changes to the filter for tweets related to SharePoint Framework.

The current filter can be found in this GitHub repository: https://github.com/andikrueger/TweetDeckSPFxFilter

The first version of the filter is stored in this gist:

#SPFx -#art -#artist -#artistreborn -#behindthescenes -#blood -#Elementary -#hollywood -#LED -#makeup -#makeupeffects -#makeupfx -#mua -#mufx -#pyro -#pyrotechnics -#sfx -#SharpFX -#specialeffects -#specialeffectsmakeup -#specialfx -#specialfxmakeup -#specialmakeupeffects -#specialmakeupfx -#spfmakeup -#spfxmua -#tv -#wounds -from:webpart_o_matic

Using Desired State Configuration (DSC) for SaaS – Office365DSC thoughts

PowerShell Desired State Configuration (DSC) enhances the setup experience for new environments like no other technique before. There are around 270 different DSC resources available that provide methods to configure Windows Server components and software like the following (a minimal example of the declarative style follows the list):

  • SharePoint Server
  • SQL Server
  • Active Directory
  • IIS
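
A minimal sketch of the declarative style, using the built-in WindowsFeature resource on a fictitious server node:

Configuration WebServer
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node "SERVER01"
    {
        # Desired state: the IIS role must be present on this node.
        WindowsFeature IIS
        {
            Name   = "Web-Server"
            Ensure = "Present"
        }
    }
}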

I’m using DSC very frequently. It gives me great advantages over my previous PowerShell scripts. I can use DSC in combination with Azure DSC to configure my systems and track their status. Configuration drift can only happen in parts that are not covered by my configuration.

The part that is missing: PowerShell DSC is a per-node technology. Every server that I want to configure must have a Local Configuration Manager (LCM) that is responsible for applying my configuration, either as the local system account or as a configurable account.

Fast forward: currently more and more people are switching to Software as a Service (SaaS) offerings like Office 365. Office 365 offers many configuration options, but there is no LCM available that would handle the configuration.

Let’s speak about a possible Office 365 DSC resource.

This resource should be responsible for administering Office 365 in a DSC way. I see the following options for applying a configuration to Office 365:

  1. Create a configuration and apply it on node “localhost”. This means we use the current computer and its LCM to apply a configuration to a SaaS (a hypothetical sketch follows below this list).
    This looks like an “ok” solution, but the configuration depends on the local system while nothing in it actually gets applied to the local system. This feels odd.
  2. Azure Runbooks offer a way to run a PowerShell script, reuse PowerShell modules and schedule the script.
    Compared to a local PowerShell session and LCM, Azure Runbooks could be the way to go. A SaaS configuring another SaaS feels kind of perfect, but it won't work: many cmdlets depend on .NET components that can't be loaded into Azure Runbooks, and this approach is not really DSC anymore.
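
Purely to illustrate option 1: the resource name and its properties below are invented for this sketch; the point is that the node is “localhost” even though nothing local gets configured.

# Hypothetical sketch only: "O365TenantSharing" and its properties are made up for illustration.
Configuration TenantBaseline
{
    Node "localhost"
    {
        O365TenantSharing ExternalSharing
        {
            TenantUrl         = "https://contoso-admin.sharepoint.com"
            SharingCapability = "ExistingExternalUserSharingOnly"
        }
    }
}

# The local LCM would compile and apply this, yet every Get/Test/Set call targets the tenant.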

Maybe there are other options available that I can’t think of. If you have a suggestion, feel free to leave a comment.

Currently most resources focus on products and functions that live on a Windows Server system; the configuration is specific to a node.

Thoughts about an Office365DSC Resource

Many customers are now starting to use Office 365. In Germany, customers are very aware of their data and sometimes spend a good amount of time defining a governance for Office 365.

With an option to configure Office 365 with DSC, they could gain a lot of comfort and an overview of what is configured and how. Working with test tenants would be very easy, as you could replicate your production settings to a test tenant easily; however, there is no Office 365 DSC resource available yet.

What are the current options to script your Office 365 administration?

Why not script everything, and why use DSC instead?

DSC is about configuration management. If I want to update any setting in the Office 365 admin centre, a DSC configuration seems to be the best option.

The scripting guy would load the PowerShell module or the CLI, or open the admin portal, to change a setting with a function call like:

IWantToSetThis-Function -Something This

With DSC, something else would happen:

Something {
    State = "This"
}

DSC would try to get the current setting for “Something”, compare the parameter “State” to “This” and only if they differ, call the function above.

Is DSC only a better approach to script?

I’m a big fan of the DSC approach. In the end it’s just PowerShell, but in a better structured manner. Having a predefined set of Get-, Set- and Test- functions (and, with Office365DSC, an Export- function) allows you to reuse the functionality provided.
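
For readers new to that pattern, a minimal sketch of such a resource skeleton; the setting name “PortalTitle” is invented for illustration, but a MOF-based DSC resource follows exactly this Get/Test/Set contract:

# Skeleton of a MOF-based DSC resource; "PortalTitle" is a made-up example setting.
function Get-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.Collections.Hashtable])]
    param
    (
        [Parameter(Mandatory = $true)]
        [System.String]
        $PortalTitle
    )

    # Read the current value from the service here and return it.
    $currentTitle = "Intranet"   # placeholder for a real call against the tenant

    return @{
        PortalTitle = $currentTitle
    }
}

function Test-TargetResource
{
    [CmdletBinding()]
    [OutputType([System.Boolean])]
    param
    (
        [Parameter(Mandatory = $true)]
        [System.String]
        $PortalTitle
    )

    # Compare current and desired state; $true means "already compliant, skip Set".
    $current = Get-TargetResource -PortalTitle $PortalTitle
    return ($current.PortalTitle -eq $PortalTitle)
}

function Set-TargetResource
{
    [CmdletBinding()]
    param
    (
        [Parameter(Mandatory = $true)]
        [System.String]
        $PortalTitle
    )

    # Only called when Test-TargetResource returned $false: apply the desired value here.
}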

Where can I find Office365DSC?

You can find the repository of Office365DSC on GitHub: Microsoft/Office365DSC

The struggle with configuration data in DSC configurations – SQL Alias and reusability

Today I was very happy to find a neat solution to handle configuration data for a DSC configuration. I was facing the following challenge:

In a SharePoint DSC configuration I want to reuse several SQL aliases that are created at runtime of the DSC configuration, based on the configuration data.

In recent DSC setups my configuration looked like this:

@{
    AllNodes = @(
        @{
            NodeName = "*"
            PSDscAllowPlainTextPassword = $true
            PSDscAllowDomainUser = $true
            ServerRole = "invalid"
        },
        @{
            NodeName = "SharePointServer"
            ServerRole = "SingleServerFarm"
            CentralAdmin = $true
            IsMasterNode = $true
            ServiceInstances = @(
            )
        }
    )
    NonNodeData = @{
        SqlServerAlias = @(
            @{
                ServerName = "SqlServer\Content"
                Name = "SharePoint-Content"
                Protocol = "TCP"
                UseDynamicTcpPort = $true
                TcpPort = $false
            },
            @{
                ServerName = "SqlServer1\Config"
                Name = "SharePoint-Config"
                Protocol = "TCP"
                UseDynamicTcpPort = $true
                TcpPort = $false
            }
        )
    }
}

and I created the SQL Alias with the following lines of code:

$ConfigurationData.NonNodeData.SqlServerAlias | ForEach-Object -Process {
    SqlAlias $_.Name
    {
        Name = $_.Name
        ServerName = $_.ServerName
        Protocol = $_.Protocol
        UseDynamicTcpPort = $_.UseDynamicTcpPort
        TcpPort = $_.TcpPort
        Ensure = 'Present'
        PsDscRunAsCredential = $SpSetupAccount
    }
}

So far there was no struggle at all. Creating a SQL alias with DSC is very straightforward, even if there is a need to create more than one.

The struggle got real the moment I had to reuse the alias name in SharePoint. How do I properly access the alias? Do I iterate over all aliases again and filter, or do I hard-code the alias name, or…? None of these options felt right.

My solution is pretty simple: why not change the array to another hash table? A hash table allows you to access the data more easily. 🙂

@{
    AllNodes = @(
        @{
            NodeName = "*"
            PSDscAllowPlainTextPassword = $true
            PSDscAllowDomainUser = $true
            ServerRole = "invalid"
        },
        @{
            NodeName = "SharePointServer"
            ServerRole = "SingleServerFarm"
            CentralAdmin = $true
            IsMasterNode = $true
            ServiceInstances = @(
            )
        }
    )
    NonNodeData = @{
        SqlServerAlias = @{
            Content = @{
                ServerName = "SqlServer\Content"
                Name = "SharePoint-Content"
                Protocol = "TCP"
                UseDynamicTcpPort = $true
                TcpPort = $false
            }
            Config = @{
                ServerName = "SqlServer1\Config"
                Name = "SharePoint-Config"
                Protocol = "TCP"
                UseDynamicTcpPort = $true
                TcpPort = $false
            }
        }
    }
}

Final challenge: how can I iterate over a hash table? A hash table object has two useful properties, Keys and Values:

$ConfigurationData.NonNodeData.SqlServerAlias.Keys | ForEach-Object -Process {
    SqlAlias $ConfigurationData.NonNodeData.SqlServerAlias[$_].Name
    {
        Name = $ConfigurationData.NonNodeData.SqlServerAlias[$_].Name
        ServerName = $ConfigurationData.NonNodeData.SqlServerAlias[$_].ServerName
        Protocol = $ConfigurationData.NonNodeData.SqlServerAlias[$_].Protocol
        UseDynamicTcpPort = $ConfigurationData.NonNodeData.SqlServerAlias[$_].UseDynamicTcpPort
        TcpPort = $ConfigurationData.NonNodeData.SqlServerAlias[$_].TcpPort
        Ensure = 'Present'
        PsDscRunAsCredential = $SpSetupAccount
    }
}

So what changed in my SharePointDsc configuration? Now I can address my SQL alias properly without any trouble:

Previously:

SPFarm SharePointFarm
{
    DatabaseServer = ¯\_(ツ)_/¯
    IsSingleInstance = "Yes"
    FarmConfigDatabaseName = "SP_Config"
    AdminContentDatabaseName = "SP_AdminContent"
    Passphrase = $Passphrase
    FarmAccount = $FarmAccount
    RunCentralAdmin = $true
    PsDscRunAsCredential = $SpSetupAccount
}

Now:

SPFarm SharePointFarm
{
    DatabaseServer = $ConfigurationData.NonNodeData.SqlServerAlias.Config.Name
    IsSingleInstance = "Yes"
    FarmConfigDatabaseName = "SP_Config"
    AdminContentDatabaseName = "SP_AdminContent"
    Passphrase = $Passphrase
    FarmAccount = $FarmAccount
    RunCentralAdmin = $true
    PsDscRunAsCredential = $SpSetupAccount
}

SharePoint 2016 MinRoles – Behind the scenes and where the C2WTS should be provisioned

This issue is still under investigation. Currently I think it occurs when SharePoint is installed, followed by a language pack, and then the farm is created. Checking the compliance status of the MinRole at this particular time will result in a non-compliant state.



With SharePoint 2016, Microsoft introduced a new feature called “MinRoles”. MinRoles offer a new way to create SharePoint topologies. The following MinRoles are available:

  • Front-End
    A role made for all loads in the context of serving SharePoint
  • Application
    This role is optimized for all services that need to run in a SharePoint farm, without the search loads
  • Distributed Cache
    Hosts the Distributed Cache service
  • Search
    All services associated with the search load of SharePoint
  • Custom
    This role is needed if you plan on using Business Intelligence loads, as they are not part of the other MinRoles
  • Single-Server Farm

In October 2016 Microsoft released Feature Pack 1 (FP1) for SharePoint 2016. FP1 offers two new MinRoles (MinRolesV2):

  • Front-end with Distributed Cache
  • Application with Search

These two roles combine the prior roles so that customers can create highly available (HA) SharePoint farms with fewer servers. Prior to FP1 you needed at least two servers per role: 2 WFE, 2 App, 2 Search, 2 Distributed Cache, 8 servers in total. After you install FP1 you can switch to the combined MinRoles and create a SharePoint HA farm with 4 servers instead. This offering focuses on SharePoint customers with HA requirements but not enough workload to justify hosting 8 SharePoint servers.

Each MinRole defines a set of services that can run on a server, and SharePoint 2016 enforces the state of these services. For a complete list of all services that belong to a role, visit the TechNet documentation.
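
If you want to see quickly which service instances exist on a particular server and whether they are online, a small sketch for the SharePoint Management Shell (the server name is an example):

# List all service instances on one server together with their status.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Get-SPServiceInstance -Server "SP2016-APP01" |
    Sort-Object -Property TypeName |
    Select-Object -Property TypeName, Status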

Behind the scenes

New “Services in Farm” experience in Central Administration

The following screenshot shows the “Services in Farm” page of SharePoint 2016 Central Administration:

Services on Server on SharePoint 2016

As soon as any MinRole is defined, the Services on Server page will show the selected server’s role and, for each service, its status and compliance state. SharePoint offers you some options:

  • you can stop a service
  • you can fix a non-compliant state of a service with one click

Where does this MinRole compliance information come from?

In a recent customer engagement I stumbled across something that bothered me. The customer’s SharePoint farm uses MinRolesV2, and I checked the Microsoft documentation for which services are allowed on each server. After a while I had a service in a non-compliant state and did not know why. I reached out to the community, but did not end up with a definitive answer:

Today I spent some time with the SharePoint source code and found the answer:

SharePoint Service instances are represented by the SPServiceInstance class. Every SharePoint Service Instance (e.g. SPWindowsTokenServiceInstance) derives from this class and overrides the following method:

 // Microsoft.SharePoint.Administration.SPServiceInstance
 public virtual bool ShouldProvision(SPServerRole serverRole)
 {
 }

There is an additional internal method, ShouldProvisionInternal, which does some additional tests for the following roles:

  • SPServerRole.WebFrontEndWithDistributedCache
  • SPServerRole.ApplicationWithSearch

When your server has one of the roles above, the ShouldProvision method will be called with both single roles and with the combined role. If any of these tests returns true, the role is considered compliant.

Not knowing whether the documentation or the code is wrong, I investigated the SPWindowsTokenServiceInstance code.

  // Microsoft.SharePoint.Administration.Claims.SPWindowsTokenServiceInstance
 public override bool ShouldProvision(SPServerRole serverRole)
 {
     return SPServerRole.SingleServerFarm == serverRole || SPServerRole.Application == serverRole || SPServerRole.WebFrontEnd == serverRole || (SPServerRole.DistributedCache == serverRole && !SPServerRoleManager.IsMinRoleV2Enabled()) || (SPServerRole.Search == serverRole && !SPServerRoleManager.IsMinRoleV2Enabled()) || SPServerRole.Custom == serverRole;
 }

The implementation above does not include tests for the MinRolesV2 roles, but the non-V2 roles are introduced through the ShouldProvisionInternal method anyway. Following this, there is no error in the code or the documentation. This applies to the Microsoft.SharePoint.dll in version 16.0.4561.1000.

So what is wrong with my farm (January 2018 CU)?

 // Microsoft.SharePoint.Administration.Claims.SPWindowsTokenServiceInstance
 public override bool ShouldProvision(SPServerRole serverRole)
 {
     if (SPServerRole.SingleServerFarm != serverRole && SPServerRole.Application != serverRole && SPServerRole.WebFrontEnd != serverRole && (SPServerRole.DistributedCache != serverRole || SPServerRoleManager.IsMinRoleV2Enabled()) && (SPServerRole.Search != serverRole || SPServerRoleManager.IsMinRoleV2Enabled()))
     {
         return SPServerRole.Custom == serverRole;
     }
     return true;
 }

The code above is taken from a Microsoft.SharePoint.dll with file version 16.0.4639.1002.

This means the current documentation does not reflect the code properly. If I read the code correctly, the only role allowed to host the C2WTS with MinRoleV2 enabled is the Custom role.

Conclusion: there is a drift in the documentation. If I’m not mistaken, I do not need the C2WTS in a non-BI-enabled farm. A BI-enabled farm does need a MinRole server with the “Custom” role to run Reporting Services and other workloads. This explains why the C2WTS is no longer allowed in any other role. Maybe someone should update the documentation…

Update 2018-03-23:

I double-checked the code of the ShouldProvisionInternal and ShouldProvision methods and strongly believe that there is something wrong with how the MinRole compliance is determined. Find below a screenshot of what I think the return values should be, but somehow are not!

Expected return values for the ShouldProvisionInternal method

Following this code, the result should be that the MinRoleV2 ApplicationWithSearch role is able to run the Claims to Windows Token Service.
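
If someone wants to reproduce this on a live farm, here is a rough, untested sketch of how to call the method directly (it is public, so no reflection tricks are needed):

# Grab the Claims to Windows Token service instance and ask ShouldProvision
# what the code really returns for the combined MinRoleV2 role.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$c2wts = Get-SPServiceInstance |
    Where-Object { $_.GetType().Name -eq 'SPWindowsTokenServiceInstance' } |
    Select-Object -First 1

$c2wts.ShouldProvision([Microsoft.SharePoint.Administration.SPServerRole]::ApplicationWithSearch)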

I really hope someone can help me figure this out. Is there any part I am not reading correctly?

SharePoint 2016 – Central Administration – ScriptResource.axd 404 error

Recently my colleague and I came across a SharePoint 2016 farm that had issues with the suite bar not loading. In the network trace we found a 404 error for the ScriptResource.axd request. Using the Event Viewer we came across the following error message:

Event code: 3012
Event message: An error occurred processing a web or script resource request. The requested resource 'ZSystem.Web.Extensions,4.0.0.0,,31bf3856ad364e35|MicrosoftAjax.js|' does not exist or there was a problem loading it. 
Event time: 21.03.2018 09:57:57
Event time (UTC): 21.03.2018 08:57:57
Event ID: 59637ef8b92b43b59cadfd4b3ce61619 Event sequence: 7 Event occurrence: 1 Event detail code: 0 
 
Application information: 
 Application domain: /LM/W3SVC/275657850/ROOT-1-131660962662183250 
 Trust level: Full 
 Application Virtual Path: / 
 Application Path: C:\inetpub\wwwroot\wss\VirtualDirectories\4432\ 
 Machine name: SERVERNAME 
 
Process information: 
 Process ID: 6916 
 Process name: w3wp.exe 
 Account name: ***\***** 
 
Exception information: 
 Exception type: ZLibException 
 Exception message: The underlying compression routine could not be loaded correctly.
 at System.IO.Compression.DeflaterZLib.DeflateInit(CompressionLevel compressionLevel, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)
 at System.IO.Compression.DeflaterZLib..ctor(CompressionLevel compressionLevel)
 at System.IO.Compression.DeflateStream.CreateDeflater(Nullable`1 compressionLevel)
 at System.IO.Compression.DeflateStream..ctor(Stream stream, CompressionMode mode, Boolean leaveOpen)
 at System.IO.Compression.GZipStream..ctor(Stream stream, CompressionMode mode)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequestInternal(HttpResponseBase response, String decryptedString, VirtualFileReader fileReader)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequest(HttpContextBase context, VirtualFileReader fileReader, Action`2 logAction, Boolean validatePath)

The type initializer for 'NativeZLibDLLStub' threw an exception.
 at System.IO.Compression.ZLibNative.ZLibStreamHandle.DeflateInit2_(CompressionLevel level, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)
 at System.IO.Compression.DeflaterZLib.DeflateInit(CompressionLevel compressionLevel, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)

Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))
 at System.Runtime.InteropServices.Marshal.ThrowExceptionForHRInternal(Int32 errorCode, IntPtr errorInfo)
 at System.Runtime.InteropServices.Marshal.ThrowExceptionForHR(Int32 errorCode, IntPtr errorInfo)
 at System.IO.Compression.ZLibNative.ZLibStreamHandle.NativeZLibDLLStub.LoadZLibDLL()
 at System.IO.Compression.ZLibNative.ZLibStreamHandle.NativeZLibDLLStub..cctor()

Request information: 
 Request URL: http://SERVERNAME:10000/ScriptResource.axd?d=8QMyovunFMoY381OpYQyz9XcLkCjJ_XuifoGDX5Q8vatF2gNXkCNXfW0-6Dz_mvZVHapsY1FH-l9zy2l3q3V1Z5dqWndvspmAcN7L1hB-UomBeamTvhHGIe5ZcW_f-DqtS3ymyGl5Bk-ybE1j6mAb8ZMhM8n4e_WwQEe8IkCMRsq_DHxVXNP720ZZXWT1ATa0&t=ffffffffad4b7194 
 Request path: /ScriptResource.axd 
 User host address: ***SERVERNAME 
 User: 
 Is authenticated: False 
 Authentication Type: 
 Thread account name: NT AUTHORITY\IUSR 
 
Thread information: 
 Thread ID: 20 
 Thread account name: NT AUTHORITY\IUSR 
 Is impersonating: False 
 Stack trace: at System.IO.Compression.DeflaterZLib.DeflateInit(CompressionLevel compressionLevel, Int32 windowBits, Int32 memLevel, CompressionStrategy strategy)
 at System.IO.Compression.DeflaterZLib..ctor(CompressionLevel compressionLevel)
 at System.IO.Compression.DeflateStream.CreateDeflater(Nullable`1 compressionLevel)
 at System.IO.Compression.DeflateStream..ctor(Stream stream, CompressionMode mode, Boolean leaveOpen)
 at System.IO.Compression.GZipStream..ctor(Stream stream, CompressionMode mode)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequestInternal(HttpResponseBase response, String decryptedString, VirtualFileReader fileReader)
 at System.Web.Handlers.ScriptResourceHandler.ProcessRequest(HttpContextBase context, VirtualFileReader fileReader, Action`2 logAction, Boolean validatePath)

After several hours of a nerve-wracking search for a solution, we came across this blog post by Mike Lee. He saw the same issue and found a solution. In the end we also changed some additional GPO settings (“Replace a process level token”) for our application pool accounts.

At first we were searching for FIPS settings, as there are many results on Google pointing in that direction. We did the following:

  • disabled FIPS in the registry
  • changed the GPO that was applied to this server
  • tried GZip compression from a small console application (a minimal PowerShell sketch of that check follows below)
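
A minimal sketch of such a check, assuming you only want to see whether the managed GZip/Deflate stack can load its native compression routine at all:

# Try to compress a small string with GZipStream; if the native zlib stub cannot be
# loaded (as in the exception above), this throws instead of printing a byte count.
$memoryStream = New-Object System.IO.MemoryStream
$gzipStream = New-Object System.IO.Compression.GZipStream($memoryStream, [System.IO.Compression.CompressionMode]::Compress)

$bytes = [System.Text.Encoding]::UTF8.GetBytes("compression test")
$gzipStream.Write($bytes, 0, $bytes.Length)
$gzipStream.Dispose()

Write-Host "GZip compression worked, $($memoryStream.Length) bytes written"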

None of these attempts helped. Thanks to Mike Lee we were able to resolve this issue.

AutoSpInstaller XML to SharePointDsc Converter – Preview – Update 2018/04/19

Today the first preview of a web-based AutoSPInstaller to SharePointDsc converter was released.

There are still some limitations, as the mapping of the XML file to SharePointDsc is not complete yet. This is a preview to demonstrate the capabilities.

The converter targets the following use case:

As a user with an AutoSPInstaller XML file available, I want to switch to SharePointDsc.

Currently the converter is able to create a multi-node SharePoint DSC configuration based on the contents of the AutoSPInstaller XML file. The configuration will contain the following elements:

  • one node block for each server name. If you are using localhost mixed with real server names, there will be an additional node for localhost.
    • On each node the following configuration is placed:
      • SQLAliases
      • SharePoint Prerequisites
      • SharePoint Binary Installation
      • Farm create or join
  • The following components are currently extracted from the AutoSPInstaller xml file:
    • Basic Farm setup
    • Managed Accounts
    • Web Applications
    • Site Collections
    • Managed Paths
    • Diagnostics Logging Service
    • State Service Application
    • Sandboxed Code Service
    • Claims to Windows Token Service
    • Outgoing Mail
    • Distributed Cache
    • Workflow Timer Setting
  • Update 2018/04/19 – There are the following additions:
    • Creation of Application Pools for Web Applications, Search, Services
    • User Profile Service Application
    • Search Service Application
    • Managed Metadata Service Application

SharePoint 2016 base language and language packs

All over the world there are customers who want to install and use SharePoint in their preferred language (the official language, e.g. in Germany it’s German). I can fully understand the background of this wish:

  • Not every employee speaks English
  • Not every administrator speaks English
  • Adopting a new system is easier when it’s in your mother tongue.

The SharePoint 2016 sources are available in 24 languages. Additionally, there are 49 language packs. The following table shows, for each language, whether it is available as a source language and/or as a language pack:

Language                  Source Language   Language Pack
Arabic                    x                 x
Azerbaijani                                 x
Basque                                      x
Bosnian                                     x
Bulgarian                                   x
Catalan                                     x
Chinese – Simplified      x                 x
Chinese – Traditional     x                 x
Croatian                                    x
Czech                     x                 x
Danish                    x                 x
Dari                                        x
Dutch                     x                 x
English                   x                 x
Estonian                                    x
Finnish                   x                 x
French                    x                 x
Galician                                    x
German                    x                 x
Greek                     x                 x
Hebrew                    x                 x
Hindi                                       x
Hungarian                 x                 x
Indonesian                                  x
Irish                                       x
Italian                   x                 x
Japanese                  x                 x
Kazakh                                      x
Korean                    x                 x
Latvian                                     x
Lithuanian                                  x
Macedonian                                  x
Malay                                       x
Norwegian-Bokmal          x                 x
Polish                    x                 x
Portuguese-Brazil         x                 x
Portuguese-Portugal       x                 x
Romanian                                    x
Russian                   x                 x
Serbian                                     x
Slovak                                      x
Slovenian                                   x
Spanish                   x                 x
Swedish                   x                 x
Thai                      x                 x
Turkish                   x                 x
Ukrainian                                   x
Vietnamese                                  x
Welsh                                       x

There are many possible variations of base language and language packs.

I tend to install SharePoint with the English sources and prefer to install language packs afterwards!

The reason behind this philosophy: “Keep it simple!”

Working every day with different SharePoint farms can be challenging if the language of the SharePoint logs varies or the PowerShell commands need to be modified to match the installed SharePoint language.

The tools available focus on English SharePoint sources:

…and I prefer Central Administration to be in English. The German translation does not feel right 😉

What’s the best way to figure out the base language and the installed Language Packs?

I use the following PowerShell script to figure out the base language and the installed language packs:


#Requires -RunAsAdministrator
function Get-SharePointLanguages
{
    $baseLanguageKey = Get-Item 'HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\16.0\ServerLanguage'
    $baseLanguageValue = $baseLanguageKey.Property
    if ($null -ne $baseLanguageValue -and $baseLanguageValue.Count -eq 1)
    {
        $baseLanguageCulture = New-Object System.Globalization.CultureInfo([int]$baseLanguageValue[0])
        Write-Host "SharePoint base language:" -ForegroundColor Green
        Write-Host $($baseLanguageCulture.DisplayName)
    }
    $languagePackKey = Get-Item 'HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\16.0\InstalledLanguages'
    if ($null -ne $languagePackKey)
    {
        Write-Host "SharePoint Language Packs:" -ForegroundColor Green
        $languagePackKey.Property | ForEach-Object -Process {
            $languagePackCulture = New-Object System.Globalization.CultureInfo([int]$_)
            Write-Host $languagePackCulture.DisplayName
        }
    }
}
Export-ModuleMember -Function Get-*
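
Assuming you save the script above as a module file, usage could look like this (the file name is just an example):

# Import the module and print the SharePoint base language and installed language packs.
Import-Module .\SharePointLanguages.psm1
Get-SharePointLanguages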