Set URL for “New Tab” in Microsoft Edge

In Microsoft Edge the default behaviour of “new tab” is to open a customizable page with Microsoft Bing and other content. The content can be disabled, but there is no way to change the “new tab” page completely through the UI.

Here is a PowerShell snippet that sets the “NewTabPageLocation” policy in the registry to do exactly that. Tested on Windows Server 2019; it does not work on Windows 10 Pro 20H2 / MS Edge 88.

# Create the Edge policy key if it does not exist yet
if( (Get-Item "HKLM:\Software\Policies\Microsoft\Edge" -ea Ignore) -eq $null) {
    New-Item "HKLM:\Software\Policies\Microsoft\Edge"
}

# Create or update the "NewTabPageLocation" policy value
if( (Get-ItemProperty -Path "HKLM:\Software\Policies\Microsoft\Edge" -Name NewTabPageLocation -ea 0) -eq $null ) {
    New-ItemProperty -Path "HKLM:\Software\Policies\Microsoft\Edge" -Name NewTabPageLocation -PropertyType String -Value "https://google.de"
} else {
    Set-ItemProperty -Path "HKLM:\Software\Policies\Microsoft\Edge" -Name NewTabPageLocation -Value "https://google.de"
}

Get-ItemProperty -Path "HKLM:\Software\Policies\Microsoft\Edge" -Name NewTabPageLocation 

Update…

It does not work on a machine that is not domain joined. 🙁

Here I found a possible solution for Windows 10:

Fix PowerShell Error: “The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.”

Just as a reminder.

When getting this error:

The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.

…just add this to your PowerShell script:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
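For example, set it once near the top of the script, before the web call that fails (the Invoke-WebRequest line is just a placeholder to show the order):

# Force TLS 1.2 for all subsequent .NET web requests in this PowerShell session
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

# Any web call after this point uses TLS 1.2, e.g.:
Invoke-WebRequest -Uri "https://example.com" -UseBasicParsing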

Auto Update Visual Studio 2019/2017 on Windows

I have some development machines with Visual Studio 2017 and/or 2019 installed. Too many to keep installing updates on manually.

For a long time I was looking for a solution for this, but I did not find one. Supposedly the automatic update of Visual Studio is not possible.

In the end… I have found a way.

In a nutshell:

  1. Download the latest Visual Studio Installer for the respective version.
  2. Install the VS Installer update.
  3. Use the VS Installer to run the Visual Studio Update.

Look at this:

https://github.com/ikarstein/auto-update-vs
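As a rough sketch of step 3, an unattended update could look something like this (the installer path, install path, and switches are assumptions based on the Visual Studio Installer command line; the repository linked above contains the complete solution):

# Assumed locations: adjust edition, version, and install path for your machine
$vsInstaller = "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vs_installer.exe"
$installPath = "C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional"

# Run the update unattended and wait for it to finish
Start-Process -FilePath $vsInstaller -ArgumentList "update --installPath `"$installPath`" --quiet --norestart" -Wait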

Original “PS2EXE” migrated from MS Technet Gallery to Github

This is more “for the record” than anything else. But why not? For a few years “PS2EXE” was hosted on TechNet and has over 90,000 downloads to date. That is super great. – Meanwhile there is also a successor, which I didn’t develop, but it seems to be alive. That makes me happy, even if a bit wistful, because I didn’t manage to stay on the ball with this little project.

So here is the link: https://github.com/ikarstein/ps2exe

SharePoint 2013 Error: An error occurred during the compilation of the requested file, or one of its dependencies.

I have had this exception on SharePoint 2013 pages on several servers after they had been running for one or two days:

“An error occurred during the compilation of the requested file, or one of its dependencies.”

In the Windows Event Log -> Application I found this error:

sperror20161108-1

There are two possible reasons (maybe there are more 😉 ):

    1. .NET Framework 4.6.x was installed but was removed in order to “fix” the SharePoint 2013 setup failure message.

      In this case you need to re-install .NET Framework 4.6.x after SharePoint 2013 setup. SharePoint 2013 is fully supported with 4.6.x!

       

    2. If the server is under heavy load and there are lots of processes started in the Windows OS, then the “Desktop Heap” can run out of memory.

      In this case you need to increase the Desktop Heap size of the “Inactive Windows Desktop”.

      1. Open the Windows registry and navigate to HKLM\System\CurrentControlSet\Control\Session Manager\SubSystems
      2. Open Property “Windows”
      3. Copy the complete content …
      4. … into Notepad
      5. Change the 3rd value of the “SharedSection” part to 20480.
      6. Copy the whole string back into the registry property “Windows”
      7. Reboot your server

      sperror20161108-2

      This will increase the Desktop Heap to 20 MB for the inactive desktops.
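      If you want to check the current value before changing it, this small snippet reads the “Windows” value and prints the SharedSection part (read-only, just for verification):

      # Read the multi-part "Windows" value and show the current SharedSection setting
      $key = "HKLM:\System\CurrentControlSet\Control\Session Manager\SubSystems"
      $windows = (Get-ItemProperty -Path $key -Name Windows).Windows
      ($windows -split " ") | Where-Object { $_ -like "SharedSection=*" }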
      Links:

      https://blogs.msdn.microsoft.com/winsdk/2015/06/03/what-is-up-with-the-application-failed-to-initialize-properly-0xc0000142-error/
      https://support.microsoft.com/en-us/kb/126962
      https://blogs.msdn.microsoft.com/ntdebugging/2007/01/04/desktop-heap-overview/

 

SharePoint 2013: Error in “Products Configuration Wizard” after Update to SP1

Today I got an error on my development farm after I updated SP 2013 to SP1 level. I also removed Visual Studio 2012 from the box and installed Visual Studio 2013.

In fact I did the SP1 update as the last step.

Then I tried to run the “SharePoint 2013 Products Configuration Wizard” and got this error in the first update step (“Initializing…”):

image

One or more types failed to load.

I was pretty sure this error was caused by the Visual Studio “update”:

image

Could not load file or assembly ‘Microsoft.Data.Edm, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35’ or one of its dependencies.

image

Could not load file or assembly ‘Microsoft.Data.OData, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35’ or one of its dependencies.

After some research I found that these two components belong to “WCF Data Services 5.0 for OData V3”.

I downloaded the package from http://www.microsoft.com/en-us/download/details.aspx?id=29306 and installed it on my SharePoint dev box.
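To verify that the two assemblies are available afterwards, you could, for example, try to load them from PowerShell (a quick check, not part of the original fix):

# Try to load the two OData assemblies from the GAC
$names = @(
    "Microsoft.Data.Edm, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35",
    "Microsoft.Data.OData, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
)
$names | ForEach-Object {
    try { [System.Reflection.Assembly]::Load($_) | Out-Null; "$_ : OK" }
    catch { "$_ : NOT FOUND" }
}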

After that the “Products Configuration Wizard” worked as expected.

Troubleshooting: Cannot access Managed Metadata Service Application in SharePoint 2013

In the last couple of months I have run into the following error several times:

The Managed Metadata Service Application was not accessible in the Central Administration.

 

image

I checked all known issues of missing security settings of the service application and in the database. – Everything was as expected.

Some days before it had worked like a charm.

In one case I saw the problem appear after deploying the (new) Service Pack 1.

In the ULS log I found this error:

image

Failed to get term store for proxy ‘Managed Metadata Service Application’. Exception: System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
at Microsoft.SharePoint.Taxonomy.Internal.XmlDataReader.GetDateTime(String name)
at Microsoft.SharePoint.Taxonomy.Internal.SharedTermStore.Initialize(IDataReader dataReader, Guid termStoreIdValue, Boolean fromPersistedData)
at Microsoft.SharePoint.Taxonomy.Internal.SharedTermStore..ctor(IDataReader dataReader, Guid termStoreId, Boolean fromPersistedData)
at Microsoft.SharePoint.Taxonomy.Internal.DataAccessManager.GetTermStoreData(MetadataWebServiceApplicationProxy sharedServiceProxy, Boolean& partitionCreated)

Searching the internet I found no solution, just reports of exactly the same error, e.g.:

http://sharepoint-community.net/forum/topics/failed-to-get-term-store-for-proxy-managed-metadata-service

I tried lots of suggested approaches, e.g. to re-create the service application using the old database. Nothing worked for me.

Finally I tried the following and it solved the issue without losing data.

1. Check and note all settings of the service application, especially security and properties (Hub Url, etc.)

2. Backup the service application using PowerShell and cmdlet Backup-SPFarm:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ea 0

Backup-SPFarm -BackupMethod full -Directory \\kcdevsqlexch1\backup\SvcApp -Item "Farm\Shared Services\Shared Services Applications\Managed Metadata Service Application" -Verbose

Backup-SPFarm -BackupMethod full -Directory \\kcdevsqlexch1\backup\SvcAppProxy -Item "Farm\Shared Services\Shared Services Proxies\Managed Metadata Service Application" -Verbose

You can get the “item” path from the command:

Backup-SPFarm -ShowTree

image

The names may be different on your system.

You need to create a network share for the backup data.

3. After that I deleted the service application, including its database, using the Central Administration.

4. Then I ran the restore commands in PowerShell:

Restore-SPFarm -Directory \\kcdevsqlexch1\backup\SvcApp -RestoreMethod New -Verbose

Restore-SPFarm -Directory \\kcdevsqlexch1\backup\SvcAppProxy -RestoreMethod New -Verbose

5. The next thing was to add the restored service application to the service application proxy groups, in my case only the “Default” proxy group.

image
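If you prefer PowerShell over the Central Administration for this step, something along these lines should work (a sketch; the display name of the proxy is an assumption, check Get-SPServiceApplicationProxy on your farm):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ea 0

# Find the restored Managed Metadata proxy and add it to the default proxy group
$proxy = Get-SPServiceApplicationProxy | Where-Object { $_.DisplayName -eq "Managed Metadata Service Application" }
Add-SPServiceApplicationProxyGroupMember -Identity (Get-SPServiceApplicationProxyGroup -Default) -Member $proxy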

6. The last step was to check security settings and service application properties. In my case I had to restore the settings and security manually.

That’s it. Managed Metadata Service Application working again. No data lost.

If you want to follow these steps make sure you have tested everything yourself. Make database backups. Note all settings. As always, you do it at your own risk!

Get Rid of Orphaned Content Types in SharePoint 2013

One of my customers has a SharePoint 2013 farm with content migrated from SharePoint 2010.

In the past, on SharePoint 2010, they had the Microsoft SQL Server 2008 R2 Reporting Services Add-In for SharePoint enabled on some sites. Because they did not need it any more they removed it from SharePoint, but without deactivating the Report Server feature on each site collection first.

Now on SharePoint 2013 this led to problems with 3 orphaned content types:

  • Report Builder Model
  • Report Builder Report
  • Report Data Source

First I tried to remove the corresponding feature “ReportServer” with ID e8389ec7-70fd-4179-a1c4-6fcb4342d7a0 from the site:

clip_image001

Get-SPFeature : Cannot find an Enabled Feature object with Id: e8389ec7-70fd-4179-a1c4-6fcb4342d7a0 in scope Site Url: https://sp2013.kc-dev.com/sites/reportingservices.

 

Then I tried to remove the content types manually using the web GUI and PowerShell. I got these errors:

clip_image002

and

clip_image003

Same message: “The content type “Report Builder Model” is part of an application feature.”

 

After reading some articles I found two proposed ways.

In the end I went another way to get rid of the orphaned content types.

In the following demo I use the content types of the Microsoft SQL Server 2008 R2 Reporting Services SharePoint Add-In. I installed it on a SharePoint 2010 platform and created a site collection “https://sp2010.kc-dev.com/sites/reportingservices” in a separate content database.

Then I migrated this content database to SharePoint 2013 including site collection upgrade. – New URL: https://sp2013.kc-dev.com/sites/reportingservices

Then I checked the usage of the content types using the static method GetUsage of Microsoft.SharePoint.SPContentTypeUsage.

clip_image004
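The check itself is only shown as a screenshot above; a minimal sketch of such a usage check could look like this (the site URL and the “Report*” name filter are assumptions for this demo):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ea 0

$site = Get-SPSite "https://sp2013.kc-dev.com/sites/reportingservices"
$web = $site.RootWeb

# List every usage of the Report* content types in this site collection
$web.ContentTypes | Where-Object { $_.Name -like "Report*" } | ForEach-Object {
    $usages = [Microsoft.SharePoint.SPContentTypeUsage]::GetUsage($_)
    "{0}: {1} usage(s)" -f $_.Name, $usages.Count
    $usages | ForEach-Object { "  -> " + $_.Url }
}

$site.Dispose()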

In my case all 3 content types are used in a single list. It is necessary to remove each usage of each content type! – In my case I deleted the list. After deletion it needs to be removed from the recycle bin too!

clip_image005

… and also from the site collection recycle bin!

clip_image006

Now my site collection is clean. No results when checking the usage again:

clip_image007

After that I created a new folder in the FEATURES sub folder in the SharePoint hive:

image

Then I created the following script to create a dummy feature inside this folder. The dummy feature uses EXACTLY the same feature Id as the missing feature containing the orphaned content types.

clip_image009

Note #1: The feature ID is the same as for the original Reporting Services feature!

Note #2: For these content types it is necessary to remove the XmlDocuments tag and its content. Otherwise the next step will fail.

With that script I took the content type XML out of SharePoint into an elements.xml file.
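The script itself is only shown as a screenshot; as a rough sketch of what it does (the folder name, the feature scope, and the content type filter are assumptions; the complete original script is linked at the end of this post):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ea 0

# Assumed folder name for the dummy feature in the 15 hive
$featureDir = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\TEMPLATE\FEATURES\ReportServerDummy"
New-Item -ItemType Directory -Path $featureDir -Force | Out-Null

# feature.xml reuses EXACTLY the id of the missing "ReportServer" feature
@"
<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
         Id="e8389ec7-70fd-4179-a1c4-6fcb4342d7a0"
         Title="ReportServer (dummy)" Scope="Site">
  <ElementManifests>
    <ElementManifest Location="elements.xml" />
  </ElementManifests>
</Feature>
"@ | Set-Content -Path (Join-Path $featureDir "feature.xml") -Encoding UTF8

# elements.xml gets the schema of the orphaned content types,
# with the <XmlDocuments> section stripped (see note #2 above)
$web = Get-SPWeb "https://sp2013.kc-dev.com/sites/reportingservices"
$ctXml = ($web.ContentTypes | Where-Object { $_.Name -like "Report*" } |
    ForEach-Object { $_.SchemaXml -replace "(?s)<XmlDocuments>.*</XmlDocuments>", "" }) -join "`r`n"
@"
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
$ctXml
</Elements>
"@ | Set-Content -Path (Join-Path $featureDir "elements.xml") -Encoding UTF8
$web.Dispose()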

Then I was able to install the feature using PowerShell:

clip_image010
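The command is only shown as a screenshot; it should be something along these lines (the folder name is the assumed name of the dummy feature folder from above):

# Install the dummy feature from the hive (path is relative to TEMPLATE\FEATURES)
Install-SPFeature -Path "ReportServerDummy"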

At this point the missing feature is back in SharePoint. The system will no longer complain when I deactivate the feature.

BEFORE deactivating I made a screenshot of the site content types page:

clip_image011

After this command…

clip_image012

… the site content types page looks like this:

clip_image013

The orphaned content types are gone!
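For reference, the deactivation command shown in the screenshot was presumably something along these lines (a sketch; the URL is the demo site collection from above):

# Deactivate the (dummy) feature on the site collection, which removes the orphaned content types
Disable-SPFeature -Identity "e8389ec7-70fd-4179-a1c4-6fcb4342d7a0" -Url "https://sp2013.kc-dev.com/sites/reportingservices" -Confirm:$false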

Note: You may get an error during deactivation saying the feature is not active at that scope. In this case you need to activate the feature first and deactivate it afterwards!

The last step is to remove the dummy feature:

clip_image014
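Presumably something like this (a sketch; the name is the assumed dummy feature folder from above):

# Remove the dummy feature definition from the farm
Uninstall-SPFeature -Identity "ReportServerDummy" -Confirm:$false
# Afterwards the dummy folder can be deleted from the FEATURES directory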

That’s it.

You can download the script here if you like: http://bit.ly/OCToWx

Cross Site Scripting with SharePoint 2013 REST calls

Today I had to figure out how to query a SharePoint 2013 REST service from another domain.

It took a while to find the correct settings. 😉

There was no list on the internet, so I want to post it here as a reference. – If you have additions to it please post them in the comments.

My test bed:

  1. I created two web applications
  2. At the root of both web apps I created a “Team Site” site collection.
  3. I uploaded a copy of jQuery to the masterpage catalog of http://fromhere.kc-dev.com .
  4. Also to the masterpage catalog of this site I uploaded a script file named “crosssitescripting.js” containing the REST call to http://tohere.kc-dev.com/_api .
  5. On the homepage of the root site collection of http://fromhere.kc-dev.com I added some script tags to load jQuery.js and crosssitescripting.js, and a div tag for the sub web list.

clip_image001

I opened the homepage of http://fromhere.kc-dev.com in the browser and, as expected, got an error in the F12 dev tools of Internet Explorer.

clip_image002

Now I added some web.config modifications using PowerShell to enable cross site scripting. (Some years ago I wrote a note on that topic: https://blog.kenaro.com/2010/09/02/add-web-config-modification-with-powershell-spwebconfigmodification)

After reloading the site I could see the sub web list:

clip_image003

Here is the content of crosssitescripting.js:

$(document).ready(function(){
    $.support.cors = true;
    $.ajax({
        url: "http://tohere.kc-dev.com/_api/Web/Webs",
        type: "GET",
        crossDomain: true,
        dataType: "json",
        headers: { "Accept": "application/json; odata=verbose" },
        xhrFields: { withCredentials: true },
        success: function (response) {
            var ul = $("<ul/>").appendTo("#weblist");
            $(response.d.results).each(function(){
                $("<li>"+this.Url+"</li>").appendTo(ul);
            });
        },
        error: function (xhr, status) {
            debugger;
        }
    });
});

Here is the PowerShell script to add the web.config modifications:

Add-PSSnapin Microsoft.SharePoint.PowerShell -EA 0

$localFarm = Get-SPFarm

$webapp = Get-SPWebApplication "http://tohere.kc-dev.com"

# Remove old web.config modifications owned by "CrossSiteScripting"
$oldMods = @();
$webapp.WebConfigModifications | ? { $_.Owner -eq "CrossSiteScripting" } | % { 
    $oldMods = $oldMods + $_
}

$oldMods | % { 
    $webapp.WebConfigModifications.Remove($_) 
}

# update the Web Application and apply all existing web.config modifications - this executes the "remove" actions from above
$webapp.Update()
[Microsoft.SharePoint.Administration.SPWebService]::ContentService.ApplyWebConfigModifications()

#Wait until web.config modifications finished by timer job
while( (Get-SPTimerJob | ? { $_.Name -eq "job-webconfig-modification"}) -ne $null ) {
    Write-Host "." -NoNewline
    Start-Sleep 1
}

# New web.config modifications owned by "CrossSiteScripting"
$myModification1 = new-object Microsoft.SharePoint.Administration.SPWebConfigModification
$myModification1.Path = "configuration/system.webServer/httpProtocol/customHeaders"
$myModification1.Name = "add[@name='Access-Control-Allow-Origin'][@value='http://fromhere.kc-dev.com']"
$myModification1.Sequence = 0
$myModification1.Owner = "CrossSiteScripting"
#0 = for the enum value "SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode"
$myModification1.Type = 0
$myModification1.Value = "<add name='Access-Control-Allow-Origin' value='http://fromhere.kc-dev.com' />"
$webapp.WebConfigModifications.Add($myModification1)

$myModification1 = new-object Microsoft.SharePoint.Administration.SPWebConfigModification
$myModification1.Path = "configuration/system.webServer/httpProtocol/customHeaders"
$myModification1.Name = "add[@name='Access-Control-Request-Method'][@value='GET,POST,HEAD,OPTIONS']"
$myModification1.Sequence = 0
$myModification1.Owner = "CrossSiteScripting"
$myModification1.Type = 0
$myModification1.Value = "<add name='Access-Control-Request-Method' value='GET,POST,HEAD,OPTIONS' />"
$webapp.WebConfigModifications.Add($myModification1)

$myModification1 = new-object Microsoft.SharePoint.Administration.SPWebConfigModification
$myModification1.Path = "configuration/system.webServer/httpProtocol/customHeaders"
$myModification1.Name = "add[@name='Access-Control-Request-Headers'][@value='Content-Type,Authorization']"
$myModification1.Sequence = 0
$myModification1.Owner = "CrossSiteScripting"
$myModification1.Type = 0
$myModification1.Value = "<add name='Access-Control-Request-Headers' value='Content-Type,Authorization' />"
$webapp.WebConfigModifications.Add($myModification1)

$myModification1 = new-object Microsoft.SharePoint.Administration.SPWebConfigModification
$myModification1.Path = "configuration/system.webServer/httpProtocol/customHeaders"
$myModification1.Name = "add[@name='Access-Control-Allow-Credentials'][@value='true']"
$myModification1.Sequence = 0
$myModification1.Owner = "CrossSiteScripting"
$myModification1.Type = 0
$myModification1.Value = "<add name='Access-Control-Allow-Credentials' value='true' />"
$webapp.WebConfigModifications.Add($myModification1)

$webapp.Update()
[Microsoft.SharePoint.Administration.SPWebService]::ContentService.ApplyWebConfigModifications()

#Wait until web.config modifications finished by timer job
while( (Get-SPTimerJob | ? { $_.Name -eq "job-webconfig-modification"}) -ne $null ) {
    Write-Host "." -NoNewline
    Start-Sleep 1
}

PowerShell Snippet: Store Login Information Securely in PowerShell using the Windows Security API

Today I want to show you a small PowerShell snippet that I created for a webinar for AvePoint. It’s a webinar in German about the DocAve module “Content Manager”.

The snippet shows you how to store an encrypted password in a plain text file.

For that I use some Windows OS APIs that are accessible in .NET:

http://msdn.microsoft.com/en-us/library/system.security.cryptography.protecteddata.protect(v=vs.110).aspx

This encapsulates the “Data Protection API” of Windows: http://msdn.microsoft.com/en-us/library/ms995355.aspx

With the methods of this class you are able to encrypt and decrypt data very easily, either in the context of the current user or in the context of the local machine.

The encrypted data can normally only be decrypted on the same machine and in the same context as where it was encrypted.

Very easy and handy. It is NOT EASY BUT POSSIBLE to decrypt it on another machine. Just read the article mentioned above, especially the section “DPAPI Security” (http://msdn.microsoft.com/en-us/library/ms995355.aspx#windataprotection-dpapi_topic04).

It is DocAve specific but of course you can modify it for your own purpose.
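Stripped of everything DocAve specific, the core idea is just this (a minimal sketch; file name and entropy values are arbitrary examples):

[System.Reflection.Assembly]::LoadWithPartialName("System.Security") | Out-Null

# Encrypt a string in the context of the current user and store it Base64 encoded
$entropy = [byte[]](1,2,3,4,5)   # optional additional entropy, pick your own values
$plain   = [System.Text.Encoding]::UTF8.GetBytes("my secret")
$cipher  = [System.Security.Cryptography.ProtectedData]::Protect($plain, $entropy, "CurrentUser")
[System.Convert]::ToBase64String($cipher) | Set-Content "secret.txt"

# Decrypt it again (works only for the same user on the same machine)
$cipher2 = [System.Convert]::FromBase64String((Get-Content "secret.txt"))
$plain2  = [System.Security.Cryptography.ProtectedData]::Unprotect($cipher2, $entropy, "CurrentUser")
[System.Text.Encoding]::UTF8.GetString($plain2)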

Here is the Script:

<##
  Created by Ingo Karstein 
    https://blog.kenaro.com
##>

#Load Modules and Assemblies
Import-Module -Name "C:\program files\AvePoint\DocAve6\Shell\DocAveModules\DocAveModule" -DisableNameChecking
[System.Reflection.Assembly]::LoadWithPartialName("System.Security") | Out-Null

#Current folder of script
$path = Split-Path $MyInvocation.MyCommand.Path

#Config values
$docavemanageruser = "admin"
$docavemanagerserver = "kcdevsqlexch1"
$docavemanagerport = 14000

#Read password from file or get it from user and store it into a file
if( [string]::IsNullOrEmpty($docavepwd) ) {
  if( Test-Path "$($path)\pwd.txt" ) {
     $data= [System.Convert]::FromBase64String((Get-Content "$($path)\pwd.txt" -Encoding UTF8))
     $global:docavepwd = [System.Text.Encoding]::UTF8.GetString([System.Security.Cryptography.ProtectedData]::Unprotect($data, (123,54,67,89,12,32,146), "CurrentUser"))
  } else {
     $global:docavepwd = Read-Host "Enter AvePoint ""$($docavemanageruser)"" password"
     $data= [System.Security.Cryptography.ProtectedData]::Protect( ([System.Text.Encoding]::UTF8.GetBytes($docavepwd)) ,(123,54,67,89,12,32,146), "CurrentUser")

     [System.Convert]::ToBase64String($data) | Set-Content "$($path)\pwd.txt" -Encoding UTF8 -Force
  }
}

#exit if no password
if( [string]::IsNullOrEmpty($docavepwd) ) {
  exit
}

$success=$false
#check if already logged in into DocAve
try {
  $success= (Get-DALocalUser -ErrorAction 0) -ne $null 
  if( !$? ) {
    $success=$false
  }
} catch {
  $success=$false
}

#If not already logged in: Login using credentials
if( !$success ) {
  $cred = New-Object System.Management.Automation.PSCredential( $docavemanageruser, (ConvertTo-SecureString -Force -AsPlainText $docavepwd))
  Login-DAManager -ControlHost $docavemanagerserver -ControlPort $docavemanagerport -Credential $cred
  if( $? -eq $false ) {
    exit
  }
}