Windows Server

Creating a Self-Signed Code Signing Certificate for AD FS Signing and Decrypting

Sometimes we don’t want to use automatic certificate rollover in AD FS, simply because we want more granular control over what’s going on. To solve this we can either buy a publicly signed certificate from a CA we trust, or we can create a self-signed certificate ourselves using makecert.exe. Note that I would recommend a publicly signed certificate for production use, but if you’re not as paranoid as me, self-signed works just fine. makecert.exe is part of the Windows SDK, and it’s also included when you install Visual Studio. The path for makecert should be something like this on your computer

  • Version 6.3.9600.17298 – C:\Program Files (x86)\Windows Kits\8.1\Bin\x64\
  • Version 6.3.9600.17298 – C:\Program Files (x86)\Windows Kits\8.1\Bin\x86\
  • Version 6.2.9200.20789 – C:\Program Files (x86)\Windows Kits\8.0\Bin\x64\
  • Version 6.2.9200.20789 – C:\Program Files (x86)\Windows Kits\8.0\Bin\x86\
  • Version 6.1.7600.16385 – C:\Program Files (x86)\Microsoft SDKs\Windows\7.1A\Bin\x64\
  • Version 6.1.7600.16385 – C:\Program Files (x86)\Microsoft SDKs\Windows\7.1A\Bin\

The syntax for makecert is as follows

makecert [options] outputCertificateFile

To create a certificate that can be used for AD FS signing, the command should look like this (all on one line)

makecert -r -pe -n "CN=MySigningCert" -b 12/28/2014 -e 01/01/2020 -eku -ss my -sr localMachine -sky exchange -sp "Microsoft RSA SChannel Cryptographic Provider" -sy 12 "MySigningCert.cer" -len 2048

This should succeed. We can now export the key from our computer and use it in the AD FS service.
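The export can also be scripted. On Windows 8 / Server 2012 and later, where the PKI module’s Export-PfxCertificate cmdlet is available, a sketch could look like this (the output path is a placeholder):

```powershell
# Find the new certificate in the LocalMachine\My store by its subject
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -eq "CN=MySigningCert" }

# Export it with the private key, protected by a password
$password = Read-Host -AsSecureString -Prompt "PFX password"
Export-PfxCertificate -Cert $cert -FilePath C:\Temp\MySigningCert.pfx -Password $password
```

On older systems, use the Certificates MMC snap-in to do the same export.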

The options we use in the example above

We used the following options in our script. Most of this table is copied from the official documentation.

-r Creates a self-signed certificate.
-pe Marks the generated private key as exportable. This allows the private key to be included in the certificate.
-n Specifies the subject’s certificate name. This name must conform to the X.500 standard. The simplest method is to specify the name in double quotes, preceded by CN=; for example, -n “CN=myName”.
-b Specifies the start of the validity period. Defaults to the current date.
-e Specifies the end of the validity period. Defaults to 12/31/2039 11:59:59 GMT.
-eku Inserts a list of comma-separated, enhanced key usage object identifiers (OIDs) into the certificate. The following MSDN document contains a list of supported OIDs
-ss Specifies the subject’s certificate store name that stores the output certificate.
-sr Specifies the subject’s certificate store location. location can be either currentuser (the default) or localmachine.
-sky Specifies the subject’s key type, which must be one of the following: signature (which indicates that the key is used for a digital signature), exchange (which indicates that the key is used for key encryption and key exchange), or an integer that represents a provider type. By default, you can pass 1 for an exchange key or 2 for a signature key.
-sp Specifies the subject’s CryptoAPI provider name, which must be defined in the registry subkeys of HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\Defaults\Provider. If both -sp and -sy are present, the type of the CryptoAPI provider must correspond to the Type value of the provider’s subkey.
-sy Specifies the subject’s CryptoAPI provider type, which must be defined in the registry subkeys of HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography\Defaults\Provider Types. If both -sy and -sp are present, the name of the CryptoAPI provider must correspond to the Name value of the provider type subkey.
-len Specifies the generated key length, in bits.

List of some possible Providers and Provider Types

The following list is based on the available Providers at my dev machine.

Provider Name Provider Type
Microsoft Base Cryptographic Provider v1.0 1
Microsoft Enhanced Cryptographic Provider v1.0 1
Microsoft Base Smart Card Crypto Provider 1
Microsoft Strong Cryptographic Provider 1
Microsoft Base DSS Cryptographic Provider 3
Microsoft RSA SChannel Cryptographic Provider 12
Microsoft Base DSS and Diffie-Hellman Cryptographic Provider 13
Microsoft Enhanced DSS and Diffie-Hellman Cryptographic Provider 13
Microsoft DH SChannel Cryptographic Provider 18
Microsoft Enhanced RSA and AES Cryptographic Provider 24
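As a side note: on Windows 8.1 / Server 2012 R2 and later, the PKI PowerShell module ships with New-SelfSignedCertificate, which can replace makecert for simple cases. The basic form below puts the certificate straight into the LocalMachine\My store, though it offers far less control over provider, key usage, and validity than the makecert command above:

```powershell
# Create a self-signed certificate in the local machine store
New-SelfSignedCertificate -DnsName "MySigningCert" -CertStoreLocation Cert:\LocalMachine\My
```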


Publish applications with Application Proxy

In this example I want to show how incredibly easy it is to publish an internal application to the internet using the Application Proxy feature in Azure Active Directory.

What is the Application Proxy?

Simply put, Azure Active Directory Application Proxy is a small service running on a Windows Server in your LAN that creates a secure channel back to Microsoft Azure. From there, the service is made publicly available at a URL based on the application name you choose, using either HTTP or HTTPS. The result is that you can make any web application available without advanced infrastructure on your side, and the best thing: you don’t need a public, static IP address!

If you don’t have an Azure subscription today, you could sign up for a free trial at


  • Windows Server 2012 R2
  • The server must be able to connect to the internet, and the following outbound ports must be open in your firewall
Port Number Description
20200 – 20204 To enable long poll requests originated from the connector towards the Azure service
10100 – 10104 To enable LOB HTTP responses sent back to the proxy
8080 To enable the connector bootstrap sequence
9090 To enable connector registration
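Windows Firewall allows outbound traffic by default, so rules are only needed if your environment filters outbound connections. A sketch of the equivalent rules in PowerShell (the display names are my own invention):

```powershell
# Allow the connector's outbound ports (run on the connector server)
New-NetFirewallRule -DisplayName "AAD App Proxy long poll" -Direction Outbound `
    -Protocol TCP -RemotePort "20200-20204" -Action Allow
New-NetFirewallRule -DisplayName "AAD App Proxy LOB HTTP" -Direction Outbound `
    -Protocol TCP -RemotePort "10100-10104" -Action Allow
New-NetFirewallRule -DisplayName "AAD App Proxy bootstrap and registration" -Direction Outbound `
    -Protocol TCP -RemotePort 8080, 9090 -Action Allow
```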

Enable the Application Proxy service in Azure AD

This is a two-step process. First we need to click a magic button that enables Application Proxy for your Azure AD tenant. Step two is to install the connector on a Windows Server running in your network.

Step one: Sign in and activate Application Proxy

Sign in to the Azure Management Portal at From there, access the configuration tab for the Azure Active Directory you want to use. Around the middle of the page there is a setting named application proxy, with a button named Enabled. Click it, and then save the configuration.


That’s all we need to do with the directory at this time. Next step then.

Step two: Install the connector

Sign in to the Windows Server you want to use as an Application Proxy, and download the Application Proxy Connector. When the download is done, click Run.


On the Welcome page, simply click Next, and that’s all the configuration you need to do before starting the installation. So click Install. It should be over in seconds.


Then you need to sign in using a Global Administrator account. Last time I did this using an MFA-enabled account, it failed (just so you don’t end up troubleshooting that again).


A few seconds later, it’s ready. Now, we can publish applications.


Publishing the intranet application

OK, it’s time to make the web application available to the internet.

The web application in this case is a password protected WordPress installation, running as an intranet application with the URL http://intranet/. Since I’m no IIS and URL Rewrite master, I’m going the easy way with this one, and just publish the internal application with the same name that we get from the This way, everything within the WordPress world works afterwards.

Back in the Azure Management Portal, open the Applications tab of your Azure AD. At the bottom of the page there is an Add button. Click it, and in the popup, click Publish an application that will be accessible from outside your network


The first task is to name our application. Name it wisely, because the name is used to generate a URL for your application.


The next task is to configure the internal URL; then click the confirm button. In this case I published it using HTTP, since I had some issues with the WordPress admin page when using HTTPS and URL Rewrite in IIS. If you know how I could solve that one, feel free to leave a comment. In a production environment, I would never recommend publishing an intranet over HTTP.


We are done configuring the proxy application!


The next step is to configure WordPress, and add a record on the application proxy server so the URLs are published correctly.

To solve this you need to do three things

  1. Sign in to the server hosting the Application Proxy and add a record in the hosts file, pointing the new external URL to the web server running WordPress.
  2. At the WordPress server, change the binding in IIS to accept the new hostname.
  3. In the WordPress Admin Center, go to the Settings -> General page, and change WordPress Address and Site Address to the new external address
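The first two items can be scripted as well. A sketch, where the external host name intranet-contoso.msappproxy.net and the internal IP are stand-ins for whatever your tenant and network actually use:

```powershell
# 1. On the Application Proxy server: point the external name at the internal web server
#    (host name and IP below are placeholders)
Add-Content -Path "$env:windir\System32\drivers\etc\hosts" `
    -Value "`tintranet-contoso.msappproxy.net"

# 2. On the WordPress/IIS server: accept the new host name
Import-Module WebAdministration
New-WebBinding -Name "Default Web Site" -Protocol http -Port 80 `
    -HostHeader "intranet-contoso.msappproxy.net"
```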

Refresh your browser, and would you look at that! It works 🙂


Resources for you to continue on your own

Reading list MS 74-409 – Server Virtualization with Windows Server Hyper-V and System Center

A new reading list is done, and it can be found at the following page

MS 74-409 – Server Virtualization with Windows Server Hyper-V and System Center

I hope you all find this one helpful 🙂


LAB: Migrate Office 365 synced users from one on-prem forest to another forest

This post was inspired by a forum thread I answered over at TechNet. After clicking Submit I got the feeling that I didn’t have all the facts, and that I might have missed some of the steps.

It all started as a quick write-up on how to handle an AD migration where dirsync is in use as well. Well, it turns out this is no quick post at all; it actually became a LAB guide instead. And yes, my feeling that I might not have gotten the reply 100% correct was confirmed, especially the part on how matching of the synced users after migration works.

Anyhow, I hope that you find it interesting, and could use it to do something productive some time.

– Thank you MVP Olav Tvedt for reviewing the post:

The case

These are harsh times in the business market, and Contoso Ltd. and Fabricam, Inc. have decided to merge their businesses. The decision makers have decided that all users from Contoso are to be migrated over to Fabricam, and the Contoso name is discontinued.

You are an administrator at Fabricam, Inc. and have been delegated the task of making this go as smoothly as possible. While you are at it, you have also decided to restructure Active Directory in the target domain, so everything gets a bit more logical.

Some details on Contoso

  • Forest root domain name is
  • All users and security groups are located in OUs named Organization\Users and Organization\Groups
  • The UPN suffix for the users are
  • Located in Oslo, Norway
  • WAAD Sync enabled for the Office 365 tenant.
  • Using Exchange and Lync. SharePoint is not in use

Some details on Fabricam

  • Forest root domain name is
  • All users and security groups are located in OUs named Domain Users and Domain Groups
  • The UPN suffix for the users are
  • Located in Stavanger, Norway

The new company structure

The companies are merged, but the locations are kept, so I decided to use a geographical OU structure in Active Directory Domain Services. The result is an AD implementation looking something like this.


All users from Contoso are signing in using their UPN. This is also implemented in Office 365. Their Office 365 tenant is named

After the migration, all users will sign in using This means that the Office 365 UPN suffix needs to be changed after the migration.


To demonstrate this case, I created the following LAB on my computer using the power of Hyper-V in Windows 8.1. I also registered for a 30-day free trial Office 365 subscription. As you will learn later on, ADMT has not yet been released for Windows Server 2012/2012 R2, so the servers are running Windows Server 2008 R2.


I also populated the Contoso domain with a few users, which I then synced with Office 365 using the WAAD Sync Tool. I did not bother to create groups, as this is simply a demonstration.

In this deployment I have used two subdomains from my own root domain.


These domains are used both on-premises and in the cloud. The on-premises domains could have been whatever I liked, but to make it easier, I chose to use the publicly routable domain on both. The reasons for this are

  1. To use a custom domain with Office 365, or any Microsoft Online Service for that matter, a publicly routable domain is required.
  2. I do not need to add an alternative UPN for all the users on-premises.

You can read more about the Domain requirements in the Office 365 Service description, and UPN requirements in the Prepare for directory synchronization document for Windows Azure Active Directory

How did I do it?

Well, first I quickly browsed through these articles to understand how ADMT and DirSync work together.

A quick overview of the tasks looks like this

  1. Plan for an Interforest Migration
  2. Prepare the two forests
  3. Disable and remove the old WAAD Sync from
  4. Migrate the users to the new domain
  5. Installing WAAD sync in the new domain
  6. Change the UPN suffix for the migrated users

First step: Planning and preparing for an Interforest Migration

In this demo we are doing an interforest migration. That means we are migrating from one AD forest to another, but you probably knew that already. There is a nice checklist at TechNet ( that gives a quick overview of what we are going to do. I advise you to read it thoroughly before doing this in a production environment. For the purpose of this demo, just rush through it.

Second step: Preparing the two forests

This step involves five sub-steps.

  1. Restructure the OU structure in the Fabricam domain as planned.
  2. Add as alternative UPN
  3. Establish the required forest trusts
  4. Create and establish required accounts for migration
  5. Install ADMT on

Restructure AD

In this step I am going to implement the structure planned earlier. In my case this is no big deal, since I only have a few OUs to create. Anyhow, there is no fun in doing this from an MMC snap-in, so I created a PowerShell script.

First, I created a CSV file containing the OU structure I wanted, and saved it as OUStructure.csv. It looked like this.


Then I created a PowerShell script looking like this, and saved it as CreateOUStructure.ps1

foreach ($NewOU in $(Import-Csv .\OUStructure.csv -Delimiter ";")) {
    New-ADOrganizationalUnit -Name $NewOU.Name -Path $NewOU.Path
}

Finally, I opened the Active Directory Module for Windows PowerShell (it’s located under Administrative Tools), and simply ran .\CreateOUStructure.ps1


After this I had an OU structure looking like this in ADUC


Add alternative UPN

Before I continued with the trust part, I added as an alternative UPN using the Active Directory Domains and Trusts MMC snap-in, just to get some variation. I could also have run this PowerShell command in the Fabricam domain.

Set-ADForest -Identity -UPNSuffixes @{Add=""}

Create forest trust

This step is outlined in the following TechNet article, and a deeper dive on trusts can be found here

I am setting up a two-way trust.

The first thing to do is to make sure network communication is up and DNS is in place. Since this is my LAB, my RRAS takes care of the networking part. My job is to configure the DNS servers.

Note: If you need to configure your firewall, which you probably would need to do if this were a production environment, take a look at these two articles that describe the required settings.

I begin at condc01 in and open DNS Manager. There I create a Conditional Forwarder, pointing to

Then I do the same thing at fabdc01 in, except that the conditional forwarder is for and points to

Then I test DNS lookup in both domains before continuing. If everything looks fine, I am good to go.
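On Windows Server 2008 R2, the same conditional forwarders can be created with dnscmd, and the lookups verified with nslookup. A sketch, using made-up domain names and IP addresses since the real ones vary:

```powershell
# On condc01: forward queries for the fabricam zone to fabdc01
dnscmd /ZoneAdd fabricam.example.com /Forwarder

# On fabdc01: forward queries for the contoso zone to condc01
dnscmd /ZoneAdd contoso.example.com /Forwarder

# Verify resolution in both directions
nslookup fabdc01.fabricam.example.com
nslookup condc01.contoso.example.com
```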

Next step is to create the trust itself. I did the whole process at once from fabdc01 in Fabricam.

I open Active Directory Domains and Trusts, located under Administrative Tools. There, I right-click the domain, select Properties, and click the Trusts tab. Then I click New Trust…


A New Trust Wizard pops up. I click Next at the first page.

At the Trust Name page, I enter the name of the remote forest.


Since I am in a lab with only one domain in each forest, the next step does not matter that much, so I just click Next here. The difference is that an external trust only trusts the domains we specify, while a forest trust trusts any domain in the target forest.


The next window asks me to choose the direction of the trust. For the purpose of this demo, I just leave the defaults and create a two-way trust, because we want both forests to trust each other.


Now I am asked if I want to create the trust in both domains at once, or in this domain only. I choose Both this domain and the specified domain. This way I save some time, since the wizard automatically creates the trust in the other domain.


Then I am asked for the user name and password for the remote domain. I just use the Domain Administrator account.


The next step is to set the scope of authentication. I simply go with the defaults here and select Domain-wide authentication, both for and for the local domain. This way I do not have to grant specific accounts access, and in this case both domains are in the same organization.


I am now ready to create the trust. Click Next to create it


And boom, the trust is in place. The last step is to verify the trust. I get the option to do so on the next page.


I make sure to select Yes, confirm the outgoing trust, and click Next. This will probably work. If not, take a look at this article over at TechNet


Next, I confirm the incoming trust, by selecting Yes, confirm the incoming trust and click Next.


This should also be a success, so now I can click Finish. For now, I just ignore the warning about SID filtering. I am going to let ADMT configure this for me the first time it runs.
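The trust can also be checked later from a command prompt with netdom. A sketch with placeholder domain names:

```powershell
# Verify the trust between the two forests
netdom trust fabricam.example.com /domain:contoso.example.com /verify
```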


Let’s get to the next step and configure some migration accounts.

Creating Migration Accounts

This part is described in this TechNet article

I open ADUC on condc01 (DC at our source domain) and add an account called res_migrator. I also add this account to the Domain Admins group.
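The same account could be created with the Active Directory PowerShell module, which is available on a 2008 R2 DC. A rough sketch:

```powershell
# Create the migration account and add it to Domain Admins (run on condc01)
Import-Module ActiveDirectory
New-ADUser -Name "res_migrator" -SamAccountName "res_migrator" `
    -AccountPassword (Read-Host -AsSecureString "Password") -Enabled $true
Add-ADGroupMember -Identity "Domain Admins" -Members "res_migrator"
```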

Then I sign in to the domain controller in the target domain; that would be fabdc01.

My res_migrator account need delegated access to the Oslo OU, as this is the destination for migrated users and groups.

I right-click the Oslo OU and click Delegate Control… I click Next on the Delegation of Control Wizard welcome page.

Then I add res_migrator from the contoso domain, and click Next


On the Tasks to Delegate page, I select

  • Create, delete, and manage user accounts
  • Create, delete and manage groups
  • Modify the membership of a group


On the Completing the Delegation of Control Wizard page, I click Finish.

Then I right-click the root domain and select Delegate Control… I click Next on the Welcome page, add our res_migrator account, and click Next. On the Tasks to Delegate page, I select Create a custom task to delegate and click Next. On the Active Directory Object Type page, I leave the default, This folder, existing objects in this folder, and creation of new objects in this folder, and click Next.

On the Permissions page, I find and check Migrate SID history.


I click Next, and then Finish.

The last step in preparing the res_migrator account is to sign in to the server that is going to run ADMT and add res_migrator as a local administrator.

Installing ADMT

ADMT version 3.2, the newest version available today, requires at least an MS SQL Express instance as a prerequisite. Therefore, I begin with the installation of SQL Server. MS SQL Express 2008 is downloadable from Yes, I am not kidding: if you try to use R2 or newer, you get this error message during the ADMT installation.


Other than that, the SQL installation is straightforward. I start the setup file, click New installation, and just go with the defaults. I also add CONTOSO\res_migrator to the list of administrators for the instance. If you are unsure, take a look at this blog post from MSDN (2008 R2, but the basics apply), or do a quick bing.

Then I go to and download the installer for ADMT. The steps for installing ADMT are described here

On the Welcome page, I click Next, Agree to the License Agreement, and choose not to join the CEIP. Then I just click Next.

At the Database Selection page, I type the SERVER\Instance name for the SQL server. It should be something like ADMT\SQLEXPRESS. Hit Next. The wizard tries to connect, and if successful, the installation starts.

When done, I am prompted to import data. I choose No, do not … and click Next. This creates a new empty database for us. After a few seconds, I am able to click Finish.

The final step is to enable migration of passwords. This step is described here

While still signed in to our ADMT migration server, I open a command prompt and type

admt key /option:create / /keyfile:"C:\Users\res_migrator\keyfile" /keypassword:pass@word1


Next step is to install PES on our source DC. I copy the key file and sign in to I store the key file somewhere logical, and then download and run PES from

On the Welcome page, I click Next, and accept the license.

On the Encryption File page, I locate the file I copied over, and click Next.


When prompted for the password for the encryption key, I type the same password that I used when generating the encryption file.


I click Next when prompted if I am ready.

After a few seconds, a dialog asks me if the service should run as the Local System account or another log-on account. I just leave Local System account selected and click OK. Setup will now finish, and I have to restart the domain controller. After the restart, I start the Password Export Server Service from Services under Administrative Tools.

To finish off, we need to enable Audit account management and Audit directory service access on both domain controllers. The next steps are copied from this TechNet article

  1. Log on as an administrator to any domain controller in the target domain.
  2. Click Start, point to All Programs, point to Administrative Tools, and then click Group Policy Management.
  3. Navigate to the following node:
    Forest | Domains | Domain | Domain Controllers | Default Domain Controllers Policy
  4. Right-click Default Domain Controllers Policy and click Edit.
  5. In Group Policy Management Editor, in the console tree, navigate to the following node:

Computer Configuration | Policies | Windows Settings | Security Settings | Local Policies | Audit Policy

  6. In the details pane, right-click Audit account management, and then click Properties.
  7. Click Define these policy settings, and then click Success and Failure.
  8. Click Apply, and then click OK.
  9. In the details pane, right-click Audit directory service access and then click Properties.
  10. Click Define these policy settings and then click Success.
  11. Click Apply, and then click OK.
  12. If the changes need to be immediately reflected on the domain controller, open an elevated command prompt and type gpupdate /force.
  13. Repeat steps 1 through 12 in the source domain.
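If you prefer the command line, auditpol can set the effective audit policy directly on a DC. Note that this does not change the GPO itself, so the Group Policy steps above remain the authoritative approach:

```powershell
auditpol /set /category:"Account Management" /success:enable /failure:enable
auditpol /set /category:"DS Access" /success:enable
```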

ADMT is now ready for migration jobs, so let us run a quick test.

I open Active Directory Users and Computers on condc01 and create a new OU named Test. In it, I create a user object named Test User. Then, at fabdc01, I create a new OU named Test. On that OU, I also delegate the same permissions to CONTOSO\res_migrator as I did for the Oslo OU earlier.

Back at the ADMT server, I make sure I am signed in as CONTOSO\res_migrator, and open Active Directory Migration Tool from the Administrative Tools menu.

Then I right-click Active Directory Migration Tool and select User Account Migration Wizard.


On the Welcome page, I click Next as usual.

On the Domain Selection page, I add as Source Domain and as Target Domain. Then I click Next.


On the User Selection Option page, I choose Select users from domain, and click Next.

Then on the User Selection page, I add the test user I just created.


Then I select the Target OU in fabricam.


I choose to Migrate password on the Password Options page.


On the Account Transition Options page, I make sure to check Migrate user SIDs to target domain. I leave the rest to its defaults.


Now I get an error, telling me that auditing is not enabled in the source domain, and asking if I want it corrected. Yes, I want to.


Then I am warned about another issue: the local group CONTOSO$$$ does not exist. Yes, I want it created.


On the User Account page, I add the credentials for CONTOSO\res_migrator, and click Next


On the User Options page, I leave the defaults, and click Next.

On the Object Property Exclusion page, I do not exclude anything. Just click Next.

On the Conflict Management page, I select Do not migrate source object if a conflict is detected in the target domain, and click Next.

Then, at the Completing the User Account Migration Wizard page, I click Finish. A progress window opens, and when Status changes to Completed, I click Close.


Migration works, so I can move on.

Third step: Preparing Office 365 before user migration

When we migrate the users to the new forest, it is also a good idea to make sure that synchronization with Office 365 happens from the new forest.

A quick overview of the steps looks like this

  1. Disable and remove dirsync from Contoso
  2. “Reset” dirsync in Office 365

Why, you ask?

If we did not, we would get a lot of error messages from the synchronization service, because the ImmutableID will change and synchronization from the old domain will fail.

By removing the WAAD Sync Tool from the old domain and resetting Active Directory synchronization in Office 365, our users become “In cloud” users. This way we can continue synchronizing them from the new domain by changing the ImmutableID once the users have been migrated.
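To make the matching concrete, this is how a single on-prem ObjectGUID turns into an Office 365 ImmutableID (the GUID below is just a sample value):

```powershell
# An ObjectGUID is 16 bytes; Base64-encoding those bytes gives the ImmutableID
$objectGuid = [Guid]"6ba7b810-9dad-11d1-80b4-00c04fd430c8"
$immutableId = [System.Convert]::ToBase64String($objectGuid.ToByteArray())
$immutableId   # a 24-character Base64 string
```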

I sign in to CONDC01 and uninstall Windows Azure Active Directory Sync Tool.

After uninstalling the WAAD Sync Tool, I sign in to the Office 365 admin center and open users and groups. There, I click Deactivate under Active Directory synchronization.


This could take up to 72 hours, but in my experience it takes a few minutes. When the deactivation is done, I activate it again, so everything is ready for when we install the WAAD Sync Tool in fabricam.

Fourth step: Begin migration

We should be ready for migration now. I sign in to using CONTOSO\res_migrator, and start Active Directory Migration Tool from the Administrative Tools menu.

I use the same procedure as with the test migration. Just this time, I select all the users from\Organization\Users and put them into Oslo\Users\Office365 in the fabricam domain. Then I start the migration.

When the migration is done, I need to install the WAAD Sync Tool and get things ready for synchronization.

I download the WAAD Sync Tool and install it on FABDC01 using the best practices documented here

After installation and initial configuration, I do not start the synchronization. We are first going to configure it so that only our Office365 OUs are synced.

I open Explorer and browse to C:\Program Files\Windows Azure Active Directory Sync Tool\SYNCBUS\Synchronization Service\UIShell\ and open miisclient.exe

When Synchronization Service Manager starts, I click Management Agents and open Properties on the Active Directory Connector


There, I click Configure Directory Partitions and then click Containers … further down on the page.


I enter the same administrator credentials as I used to configure WAAD Sync Tool.


Then I select only those containers that hold objects we want to synchronize with Office 365.


Then I click OK, and OK again, until I am back in the Synchronization Service Manager.

Now, if the WAAD Sync Tool were to run, we would get an error, since the ImmutableID in Office 365 no longer matches our on-premises ObjectGUID. We are going to solve this using PowerShell, but first I need to install the tools

The script I am using looks like this. It gets all users from Office 365, matches them with the on-prem users, and then writes back the updated ImmutableID.

# Import required PowerShell Modules
Import-Module ActiveDirectory
Import-Module MSOnline
# Sign in to Office 365
$Office365AdminCredentials = Get-Credential
Connect-MsolService -Credential $Office365AdminCredentials
# Get all Office 365 users
$MsolUsers = Get-MsolUser
Foreach ($MsolUser in $MsolUsers) {
    # Test that the on-prem user matches the online user using the UPN
    If ($OnPremMatchingUser = Get-ADUser -Filter * | Where-Object {$_.UserPrincipalName -eq $MsolUser.UserPrincipalName}) {
        # Fetch the ObjectGUID from the on-prem user as a byte array
        $ObjectGUID = $OnPremMatchingUser.ObjectGUID.ToByteArray()
        # Convert it to an ImmutableID
        $NewImmutableID = [System.Convert]::ToBase64String($ObjectGUID)
        Write-Host "Changing ImmutableID for Msol User" $MsolUser.UserPrincipalName
        Write-Host "- Old ImmutableID was" $MsolUser.ImmutableID
        Write-Host "- New ImmutableID is" $NewImmutableID
        Set-MsolUser -UserPrincipalName $OnPremMatchingUser.UserPrincipalName -ImmutableID $NewImmutableID
    }
}

Now, it’s time to do the first sync from the new AD. I start PowerShell as an Administrator and open C:\Program Files\Windows Azure Active Directory Sync\DirSyncConfigShell.psc1

Then I run the command Start-OnlineCoexistenceSync

If you open Synchronization Service Manager and watch the Operations tab, you can see the sync run, and you should get an operation saying that the Windows Azure Active Directory Connector with profile Export was a success.


Back in the Office 365 admin center, you can see that the users are no longer In Cloud users, but Synced with Active Directory.

Fifth step: Enable new UPN for the users

My users are syncing with Office 365, and everybody is happy. Except the branding department: all users from Contoso still use their old UPN, both for sign-in and as their SMTP address suffix.

To solve this, I first add as a domain to our Office 365 subscription. After the domain has been validated, there are three steps to go through.

  1. Update the On-Prem UPN and e-mail address
  2. Update the Office 365 UPN
  3. Run DirSync

Of course we are going to use PowerShell to do this. Anything else would just be too time consuming.

Let me start with the on-premises accounts, and then take the Office 365 accounts. Sign in to a computer that has both the ActiveDirectory and MSOnline PowerShell modules installed, and run this script.

Import-Module ActiveDirectory
Import-Module MSOnline

Get-ADUser -Filter {UserPrincipalName -like "*"} | foreach {
    Set-ADUser $_ -UserPrincipalName $_.UserPrincipalName.Replace("","") -EmailAddress $_.UserPrincipalName.Replace("","")
}

$MsolCredentials = Get-Credential
Connect-MsolService -Credential $MsolCredentials

Get-MsolUser -DomainName "" | foreach {
    Set-MsolUserPrincipalName -UserPrincipalName $_.UserPrincipalName -NewUserPrincipalName $_.UserPrincipalName.Replace("","")
}

We are done, case closed 🙂

Performance Analysis of Logs (PAL) Tool

We have all been there. You install a system, and everything works just perfect. Then, after a few weeks or maybe months, things begin to happen, and your task list fills up with users complaining about poor performance.

So you dive in and monitor a bunch of counters you found using your favorite search engine on the internet, but everything looks just fine. Well, I know, it is a real pain. This is where PAL could be your savior.

Over at CodePlex, there is a tool with the descriptive name Performance Analysis of Logs Tool, or just PAL for short.

What PAL does for you is create a Perfmon template with just the interesting counters. After you have created a new Data Collector Set using this template and collected some data, PAL analyzes the output for you. All you then need to do is read through an HTML document, looking for any red or yellow text.

In the steps below, I created a template for Microsoft Active Directory and let it run on a DC in my LAB network. While it ran, I ran some bad PowerShell scripts that did some searching in AD, plus some I/O-intensive operations against the disk. We should get some red flags…

After you have installed PAL, open it and navigate to the Threshold File tab. Then simply select a predefined template in the Threshold file title drop down box, and click Export to Perfmon template file. To see, or edit, the counters used in this template, simply click Edit before exporting the file.


Save the exported XML file to a known location, and copy it to your Active Directory Server.

Once there, open Performance Monitor, expand Data Collector Sets, and click User Defined. Right-click User Defined, and select New, Data Collector Set. To learn more about Data Collector Sets, take a look at this link


In the wizard that pops up, give the Data Collector Set a good name like PAL_ActiveDirectory, and make sure Create from a template (Recommended) is selected. Then click Next.


Now we want to select our exported template. Click Browse… when asked to select a template, and find the XML file you exported earlier.


You should then end up with something like this


Then just click Finish, unless you want to run this under a specific account or want the data saved elsewhere. Now start the collector and let it run for a few hours. You could also schedule this, as you normally would when working with Data Collector Sets.
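If you would rather skip the wizard, logman can import the exported template and start the collector directly; the XML path below is wherever you saved the file:

```powershell
logman import PAL_ActiveDirectory -xml C:\Temp\PAL_ActiveDirectory.xml
logman start PAL_ActiveDirectory
```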

OK, so our Data Collector Set has been running for a few hours, and it is time to do something with the data.

Sign in to the domain controller and copy your Perfmon log from C:\PerfLogs back to the computer where you have PAL installed. The next job can take some time: depending on the amount of data you have collected, and how badass your computer is, this could take hours. On my old HP 6465b laptop, it took about 3.5 hours to process about 550 MB of Exchange 2010 log files using one thread.

First, go to the Threshold File tab and choose the same template and settings you used earlier. Then open the Questions tab and select the correct operating system and the amount of physical (or virtual) memory installed on the DC. As the last step before beginning the analysis, open the File Output tab and specify an Output Directory.

If you open the Queue tab you should see something like this. The window shows how you could execute the job without using the PAL wizard.


Then go to the Execute tab and click Finish. Now, wait while PAL does its magic.



After some time, the analysis is done.


And you are left with a great report to study. Have fun doing it. If you are a bit geeky like me, it could actually make for some interesting reading.