Project Honolulu and Server Core

By now many IT administrators have heard of Project Honolulu from Microsoft. I must admit, when I first heard the headline and initial talk about it, I bookmarked the info intending to come back, but didn’t really dig into it. I thought, “oh, a revamped Server Manager.”

My perspective has completely changed now.

I’m watching the “Windows Server: What’s new and what’s next” session from Ignite 2017. I have been following Aidan Finn’s blogging of Ignite sessions, this “What’s new” session in particular, and recall reading his notes about Server Core being in the Semi-Annual Channel, but not the Server GUI, so “you better learn some PowerShell to troubleshoot your networking and drivers/firmware”.
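As a taste of what that PowerShell-based triage looks like, a few built-in NetAdapter/NetTCPIP cmdlets cover most basic network troubleshooting on Server Core. A minimal sketch (the host name dc01.contoso.com is a placeholder, and adapter names will vary by machine):

```powershell
# List physical adapters and their link state
Get-NetAdapter | Format-Table Name, Status, LinkSpeed

# Show IP address, gateway, and DNS configuration per interface
Get-NetIPConfiguration

# Verify connectivity and port reachability to a management host
Test-NetConnection -ComputerName dc01.contoso.com -Port 3389

# Inspect driver versions when chasing driver/firmware issues
Get-NetAdapter | Format-List Name, DriverVersion, DriverDate
```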

But now having watched the session myself, it makes Microsoft’s vision and a path forward here very clear to me.

Using Server Core allows an organization to reduce its attack surface, streamline the size and frequency of Windows Updates, optimize performance and scalability on its hardware, and stay current with the Windows Server release cadence.

Project Honolulu makes using Server Core viable. It is the answer to the Windows system administrator who says, “Server Core just doesn’t give me the visibility I need into my servers.” As Jeff Woolsey walks through the functions of Project Honolulu, it is obvious that THIS is where the visibility will be: no more RDPing into individual servers to manage their roles, devices, and settings; no more opening MMC windows for Event Viewer, Shares, and other applets.

Now I’m excited.

Azure PowerShell DNS – Modify TTL

A brief note on modifying the TTL of an Azure DNS record. The TTL is changed on the record set, not on the individual record.

The Azure DNS PowerShell docs don’t make it explicitly clear how to do this; however, running “Get-Help Set-AzureRmDnsRecordSet -Examples” gave me a clue on how to achieve it.

The key is the middle line here:

# Retrieve the record set into a local object
$rs = Get-AzureRmDnsRecordSet -Name "msoid" -RecordType CNAME -ZoneName "domain.com" -ResourceGroupName "DNS"
# Modify the TTL property on the local object
$rs.TTL = 3600
# Commit the updated record set back to Azure DNS
Set-AzureRmDnsRecordSet -RecordSet $rs
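To confirm the change took effect, the record set can simply be fetched again (same zone and record names as above, which are of course specific to this example):

```powershell
# Re-read the record set from Azure DNS and inspect its TTL
$rs = Get-AzureRmDnsRecordSet -Name "msoid" -RecordType CNAME -ZoneName "domain.com" -ResourceGroupName "DNS"
$rs.TTL   # should now show the updated value of 3600
```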

Expand Commvault Disk Library

My Commvault disk library is nearing 90% capacity, which is a growing cause for concern. The disk library is provided by a Dell MD1400 with 12x 6TB SAS drives, originally configured as a 10-disk RAID6 set with 2 hot spares (~45TB usable) on the PERC H830 card.

As an intermediate step before purchasing an additional disk shelf, I did the following to give a little breathing room.

  1. Using the Dell Open Manage Server Administrator (OMSA), I located one of the hot spare drives and unassigned it.
    Dell OMSA unassign global hot spare
  2. Still within OMSA, I chose the “Reconfigure” option on my Virtual Disk, keeping the RAID level at 6 but adding in the new available capacity.
  3. Then I waited. It took 16 days for the reconfiguration to complete as the RAID6 parity was re-spread/calculated across the set. This left me with a ~50TB virtual disk.
    Dell OMSA virtual disk size
  4. Within Disk Management, I carved the free space into two additional ~3TB volumes, which are mounted within a standard disk library path of L:\DiskLibrary. This size adheres to the recommendations given to us by our professional services partner.
    Disk management showing new volumes
  5. During the original configuration, we used Automated Mount Path Detection to point to L:\DiskLibrary:
    commvault export storage config example
  6. Due to this, the new mount paths were picked up automatically after about 15 minutes, and appeared within the list of mount paths under the library:
    mount paths on commvault library
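For step 4, the Disk Management work can equivalently be scripted with the built-in Storage module. A sketch only; the disk number, volume label, and mount-path folder name below are assumptions for illustration:

```powershell
# Create a ~3TB partition in the newly added free space (disk number 1 is an assumption)
$part = New-Partition -DiskNumber 1 -Size 3TB

# Format it NTFS with a 64KB allocation unit size, common for backup workloads
Format-Volume -Partition $part -FileSystem NTFS -AllocationUnitSize 65536 -NewFileSystemLabel "DiskLib3"

# Mount it under the existing disk library folder instead of a drive letter
New-Item -ItemType Directory -Path "L:\DiskLibrary\Mount3" -Force | Out-Null
Add-PartitionAccessPath -DiskNumber 1 -PartitionNumber $part.PartitionNumber -AccessPath "L:\DiskLibrary\Mount3"
```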

Checking the Disk Usage tab on the properties of the library showed utilization dropped to 80%, buying us some time to address the growing data concerns.


Exchange Online PowerShell access denied

I am attempting to test aspects of Office 365 Modern Authentication in a UAT environment prior to enabling it within our production Tenant.

Part of this work is testing Exchange Online PowerShell access, as there is a fair amount of automation configured in our environment and we want to ensure it doesn’t break. I’ve read that it “shouldn’t”, but that’s a dangerous word to trust.

Until now I’ve been unable to make the PowerShell connection to Exchange Online in our UAT environment, receiving the following during my attempts:

New-PSSession : [outlook.office365.com] Connecting to remote server outlook.office365.com failed with the following error message:[ClientAccessServer=servername,BackEndServer=servername.prod.outlook.com,RequestId=e6f6b9e7-7c5e-45ec-87fe-59332db1fb95,TimeStamp=8/17/2017 3:16:52 PM] Access Denied For more information, see the about_Remote_Troubleshooting Help topic.
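For context, these attempts used the standard (non-MFA) remoting pattern for Exchange Online, roughly:

```powershell
# Prompt for the admin credential
$cred = Get-Credential

# Create the remote session against Exchange Online (basic auth over HTTPS)
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $cred -Authentication Basic -AllowRedirection

# Import the Exchange cmdlets into the local session
Import-PSSession $session
```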

I can use the same account to sign in via browser to http://portal.office.com, and it is set as a Global Administrator in O365, so I know the account itself has appropriate access.

Interestingly, if I connect with the MFA-supported PowerShell method, with the same account, it connects successfully.

Through testing I’ve determined that using any on-premise account synchronized through Azure AD Connect fails with the same “Access Denied” message, while any cloud-only account connects successfully.

I began to look at our ADFS implementation in UAT, since ADFS is a key component in authenticating the on-premise user account. This environment runs ADFS 2.0 on Server 2008 R2, which differs from production but shouldn’t be a barrier to connectivity (without MFA).

After comparing the O365 trust configuration and finding no issues, I decided to test with the Microsoft Remote Connectivity Analyzer. Using the Office 365 Single Sign-On test, I saw a failure with this error:

A certificate chain couldn't be constructed for the certificate.
Additional Details
The certificate chain has errors. Chain status = NotTimeValid.

This set me on the path to fixing expired/broken SSL certificates in our UAT ADFS, which I posted about previously here.

Now that the SSL problem is resolved, I attempted to connect to Exchange Online PowerShell again, and was successful!

Looks like this “Access Denied” message was directly related to the expired certificate of the ADFS proxy.

ADFS 2.0 renew Service Communications certificate

I’ve recently solved a problem with the help of Microsoft Premier Support that didn’t have any references online that I could find.

Looking at the ADFS console under Certificates, the “Service Communications” section had a message of “Certificate not found in store”.

Connecting to the certificate store showed a proper external SSL cert for our UAT ADFS DNS name. Trying the option “Set Service Communications Certificate” in ADFS produced the error:

The Certificate could not be processed.
Error message: Object reference not set to an instance of an object.

This error led me to this discussion on the Microsoft forums, with the following command to attempt:

Add-PsSnapin Microsoft.Adfs.PowerShell
Set-AdfsCertificate -CertificateType "Service-Communications" -Thumbprint "aa bb cc dd …"

However, when I tried to run this command I repeatedly got the following error:

The type initializer for 'Microsoft.IdentityServer.Dkm.ADRepository' threw an exception. Microsoft.IdentityServer.PowerShell.Commands.SetCertificateCommand

The resolution: run PowerShell as the ADFS service account, and then use the command above to set the certificate. After this, I was able to restart the ADFS service and the console displayed the certificate properly.
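One way to get a PowerShell window running under the service account is runas. A sketch of the full sequence; DOMAIN\svc-adfs is a placeholder for your actual ADFS service account, and the thumbprint is your certificate’s:

```powershell
# Launch PowerShell under the ADFS service account's identity (prompts for its password)
runas /user:DOMAIN\svc-adfs powershell.exe

# Then, inside that elevated window, set the certificate and restart the service
Add-PsSnapin Microsoft.Adfs.PowerShell
Set-AdfsCertificate -CertificateType "Service-Communications" -Thumbprint "<certificate thumbprint>"
Restart-Service adfssrv   # adfssrv is the AD FS 2.0 Windows Service short name
```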

I also needed to update the certificate on the ADFS proxy in IIS to get a successful result from the Microsoft Remote Connectivity Analyzer.