
Copying Receive Connectors Hither and Yon

If you have done a number of Exchange migrations, or have a large number of servers to migrate in a single migration, I am sure you have run into the pain of replicating receive connectors to the new server. There are lots of settings to copy over, plus the headache of explicit permissions granted on the connector in the case of relays or other special use connectors. That can waste a lot of time you would much rather spend on the finale of Sherlock season 3. Let’s see if we can simplify that today with this script, Copy-ReceiveConnector. You call the script as follows:

Copy-ReceiveConnector -SourceConnector "EXCHANGE\Alternate Receive Connector" -DestinationServer NEWEXCHANGE -DomainController dc01

This will create a new receive connector on the destination server with all of the settings specified on the old receive connector. It will then loop through all of the non-inherited permissions on the connector and copy those over. You can also specify a new name for the connector via -Name. On to the code.

<#
.SYNOPSIS
Copy-ReceiveConnector - Copies a receive connector from a source server to a 
destination server
.DESCRIPTION
Takes the source receive connector and creates a copy on the destination server
with values populated from the source receive connector.
.PARAMETER SourceConnector
Identity of the source receive connector
.PARAMETER DestinationServer
Server name of the destination Exchange server
.PARAMETER DomainController
Target domain controller for setting the configuration
.PARAMETER Name
Optional new name for the connector
.EXAMPLE
Copy-ReceiveConnector -SourceConnector "EXCHANGE\Alternate Receive Connector" 
-DestinationServer NEWEXCHANGE -DomainController dc01
#>
[CmdletBinding()]
param(
[Parameter(Mandatory=$True)][string]$SourceConnector,
[Parameter(Mandatory=$True)][string]$DestinationServer,
[Parameter(Mandatory=$True)][string]$DomainController,
[Parameter(Mandatory=$False)][string]$Name
)
Import-Module ActiveDirectory
# Get the values for the old connector
$Source = Get-ReceiveConnector -Identity $SourceConnector
# Update the name if specified
if($Name)
{
 $Source.Name = $Name
}
# Custom permission group is not allowed in Exchange 2013 so we need to remove it
# Nothing to be concerned about since the ACEs are explicitly copied over.
$Source.PermissionGroups = (@($Source.PermissionGroups) -split ", " | Where-Object { $_ -ne "Custom" }) -join ", "
# Copy all the values over to create the new connector on the 2013 server
New-ReceiveConnector -Bindings $Source.Bindings -Server $DestinationServer -DomainController $DomainController -Name $Source.Name -RemoteIPRanges $Source.RemoteIPRanges -AdvertiseClientSettings $Source.AdvertiseClientSettings -AuthMechanism $Source.AuthMechanism -Banner $Source.Banner -BinaryMimeEnabled $Source.BinaryMimeEnabled -ChunkingEnabled $Source.ChunkingEnabled -Comment $Source.Comment -ConnectionInactivityTimeout $Source.ConnectionInactivityTimeout -ConnectionTimeout $Source.ConnectionTimeout -DefaultDomain $Source.DefaultDomain -DeliveryStatusNotificationEnabled $Source.DeliveryStatusNotificationEnabled -DomainSecureEnabled $Source.DomainSecureEnabled -EightBitMimeEnabled $Source.EightBitMimeEnabled -EnableAuthGSSAPI $Source.EnableAuthGSSAPI -Enabled $Source.Enabled -EnhancedStatusCodesEnabled $Source.EnhancedStatusCodesEnabled -ExtendedProtectionPolicy $Source.ExtendedProtectionPolicy -Fqdn $Source.Fqdn -LongAddressesEnabled $Source.LongAddressesEnabled -MaxAcknowledgementDelay $Source.MaxAcknowledgementDelay -MaxHeaderSize $Source.MaxHeaderSize -MaxHopCount $Source.MaxHopCount -MaxInboundConnection $Source.MaxInboundConnection -MaxInboundConnectionPercentagePerSource $Source.MaxInboundConnectionPercentagePerSource -MaxInboundConnectionPerSource $Source.MaxInboundConnectionPerSource -MaxLocalHopCount $Source.MaxLocalHopCount -MaxLogonFailures $Source.MaxLogonFailures -MaxMessageSize $Source.MaxMessageSize -MaxProtocolErrors $Source.MaxProtocolErrors -MaxRecipientsPerMessage $Source.MaxRecipientsPerMessage -MessageRateLimit $Source.MessageRateLimit -MessageRateSource $Source.MessageRateSource -PermissionGroups $Source.PermissionGroups -PipeliningEnabled $Source.PipeliningEnabled -ProtocolLoggingLevel $Source.ProtocolLoggingLevel -RequireEHLODomain $Source.RequireEHLODomain -RequireTLS $Source.RequireTLS -ServiceDiscoveryFqdn $Source.ServiceDiscoveryFqdn -SizeEnabled $Source.SizeEnabled -SuppressXAnonymousTls $Source.SuppressXAnonymousTls -TarpitInterval $Source.TarpitInterval -TlsDomainCapabilities $Source.TlsDomainCapabilities -TransportRole $Source.TransportRole
# Next we need to copy over all of the explicitly created permissions
$ConnectorPermissions = Get-ReceiveConnector -Identity $SourceConnector | Get-ADPermission | where {$_.IsInherited -eq $false}
$ConnectorPermissions | foreach {
 Get-ReceiveConnector "$($DestinationServer)\$($Source.Name)" | Add-ADPermission -DomainController $DomainController -User $_.User -Deny:$_.Deny -AccessRights $_.AccessRights -ExtendedRights $_.ExtendedRights
}
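
If the PermissionGroups munging above looks opaque, here is a standalone sketch of the same filtering logic against a made-up value; the group names here are just examples, not pulled from a real connector:

```powershell
# Example string standing in for $Source.PermissionGroups rendered as text
$PermissionGroups = "AnonymousUsers, ExchangeUsers, Custom"

# Split on the comma-space separator, drop the Custom entry, and rejoin
$Filtered = (@($PermissionGroups) -split ", " | Where-Object { $_ -ne "Custom" }) -join ", "

# $Filtered is now "AnonymousUsers, ExchangeUsers"
```

Same idea as in the script: we only strip Custom because Exchange 2013 refuses it on creation, and the explicit ACEs get copied separately anyway.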

And as a bonus, here’s a script for copying over just the permissions configured on a connector, in case you wanted to roll your own connector but didn’t want to spend the time redefining all of the permissions. Usage differs slightly from the above: you specify a source connector and a destination connector.

Copy-ReceiveConnectorPermissions -SourceConnector "EXCHANGE\Alternate Receive Connector" -DestinationConnector "NEWEXCHANGE\New Receive Connector"
<#
.SYNOPSIS
Copy-ReceiveConnectorPermissions - Copies the permissions from the source 
connector to the destination connector
.DESCRIPTION
Takes the source receive connector, retrieves all of the explicitly defined 
permissions, then applies them to the destination receive connector
.PARAMETER SourceConnector
Identity of the source receive connector
.PARAMETER DestinationConnector
Identity of the destination receive connector
.EXAMPLE
Copy-ReceiveConnectorPermissions -SourceConnector "EXCHANGE\Alternate Receive Connector" 
-DestinationConnector "NEWEXCHANGE\New Receive Connector"
#>
[CmdletBinding()]
param(
[Parameter(Mandatory=$True)][string]$SourceConnector,
[Parameter(Mandatory=$True)][string]$DestinationConnector
)
Import-Module ActiveDirectory
# We need to copy over all of the explicitly created permissions
$ConnectorPermissions = Get-ReceiveConnector -Identity $SourceConnector | Get-ADPermission | where {$_.IsInherited -eq $false}
$ConnectorPermissions | foreach {
 Get-ReceiveConnector "$($DestinationConnector)" | Add-ADPermission -User $_.User -Deny:$_.Deny -AccessRights $_.AccessRights -ExtendedRights $_.ExtendedRights
}

A Mass Contact Conversion Experience

This may save you a fair bit of trouble in the future. I was working with a client whose Exchange 2007 server was overrun with contacts. That’s not necessarily a problem in and of itself, but these contacts had served their purpose and now needed to be turned into mail enabled users. They already had a number of duplicates between AD users and contacts, and it was causing a lot of trouble in their SharePoint hierarchy. Fortunately PowerShell can step in and save the day! Or would you really enjoy manually copying over all of the contacts’ profile data to AD users? Not my idea of fun.

Here’s the code. You’ll want to tweak it to your purposes, of course. I wanted some clear on-screen results as it progressed, plus an HTML report that I can reference afterwards. This marked my first venture into LINQ, with moderate success. I will have to revisit things later to get it working exactly how I want.

<#
.SYNOPSIS
Reads in a CSV of contacts or mail enabled users and moves the contacts to mail enabled users or updates
the current mail enabled users

.DESCRIPTION
Convert-CSVContactsToUsers reads in the CSV specified in the command line. This CSV contains a list
of users that either currently have a contact which needs to be converted to a mail enabled
user, have both a contact and a user that is not mail enabled which needs the info from the contact
copied over to the user which is then mail enabled, or have neither a contact nor a user and need
to have a mail enabled user created for them.

.PARAMETER FilePath
The path and filename to the CSV containing the required users.

.PARAMETER DomainController
The NetBIOS or FQDN for the target DC to use.

.PARAMETER ResultsFile
The filename to save the HTML formatted results to.

.EXAMPLE
Convert-CSVContactstoUsers -FilePath .\UserList.csv -DomainController dc01 -ResultsFile .\Results.html

.NOTES
Requires Server 2008 R2 and should be run from the EMS.
.NET 3.5 and above required for HTML output.
Appropriate domain and Exchange rights are required.

Revision History
v1.0 - Initial release
v1.1 - Added X500 address, moved status report
#>

[CmdletBinding()]
param(
[Parameter(Mandatory=$True)][string]$FilePath,
[Parameter(Mandatory=$True)][string]$DomainController,
[Parameter(Mandatory=$False)][string]$ResultsFile
)

Import-Module ActiveDirectory

#$ErrorActionPreference= 'silentlycontinue'
$CSVList = Import-CSV -Path "$($FilePath)"
# Placeholder password for any newly created users; change it to suit your environment
$Password = ConvertTo-SecureString -String "expPassword!!" -AsPlainText -Force

# Create a custom table for storing the results
$ResultsTable = New-Object System.Data.DataTable "Conversion Results"
$column01 = New-Object System.Data.DataColumn User
$column02 = New-Object System.Data.DataColumn Result

$ResultsTable.columns.add($column01)
$ResultsTable.columns.add($column02)

# Loop through each CSV entry, check for object existence
# then process the object based on existence and type
foreach($TargetUser in $CSVList)
{
# Check for existence of any objects
$ADContact = Get-ADObject -LdapFilter "(&(CN=$($TargetUser.Name))(ObjectClass=contact))" -SearchScope Subtree -Properties * -Server $DomainController
$ADUser = Get-ADObject -LdapFilter "(&(CN=$($TargetUser.Name))(ObjectClass=user))" -SearchScope Subtree -Properties * -Server $DomainController
$ResultsRow = $ResultsTable.NewRow()
$ResultsRow.User = $TargetUser.Name
[string]$Status = $null

# If both contact and user exist, copy the info from the contact
# into the user's properties, remove the contact, then mail enable the user
if($ADContact -and $ADUser)
{
# First copy over any populated profile attributes from the contact
$CopyAttributes = "directReports","homePhone","facsimileTelephoneNumber","l","manager",
"mobile","physicalDeliveryOfficeName","postalCode","sn","st","streetAddress",
"telephoneNumber","title","department","Description","c","co","countryCode",
"info","initials","ipPhone","pager","wWWHomePage","postOfficeBox","company"
foreach($Attribute in $CopyAttributes)
{
if($ADContact.$Attribute)
{
$ADUser.$Attribute = $ADContact.$Attribute
}
}

# Update the user with the current info
Set-ADObject -Instance $ADUser -Server $DomainController

# Loop through the groups and add the user to them
foreach($ADGroup in $ADContact.memberOf)
{
Add-ADGroupMember -Identity "$($ADGroup)" -Members $ADUser -Server $DomainController
}

# Next, remove the contact
Remove-MailContact -Identity $ADContact.DistinguishedName -Confirm:$false -DomainController $DomainController

# Enable the current user, then copy over the remaining attributes
Enable-MailUser -Identity $ADUser.DistinguishedName -ExternalEmailAddress $ADContact.mail -Alias $ADContact.mailNickname -DomainController $DomainController | Out-Null

# Add the X500 address if the contact was stamped with a legacyExchangeDN
if($ADContact.legacyExchangeDN)
{
$X500User = Get-MailUser -Identity $ADUser.DistinguishedName -DomainController $DomainController
$X500User.EmailAddresses += [Microsoft.Exchange.Data.CustomProxyAddress]("X500:$($ADContact.legacyExchangeDN)")

Set-MailUser -Identity $X500User -EmailAddresses $X500User.EmailAddresses -DomainController $DomainController
}

Write-Host "$($TargetUser.Name) converted."
$Status = "Success"
}
elseif($ADContact)
{
# First remove the contact
Remove-MailContact -Identity $ADContact.DistinguishedName -Confirm:$false -DomainController $DomainController

# Create the mail enabled user (adjust the UPN suffix to match your domain)
New-MailUser -Name "$($TargetUser.Name)" -ExternalEmailAddress $ADContact.mail -Alias $ADContact.mailNickname -UserPrincipalName "$($ADContact.mailNickname)@domain.local" -Password $Password -ResetPasswordOnNextLogon:$true -DomainController $DomainController | Out-Null

# Then copy in the attributes
$ADUser = Get-ADObject -LdapFilter "(&(CN=$($TargetUser.Name))(ObjectClass=user))" -SearchScope Subtree -Properties * -Server $DomainController
$CopyAttributes = "directReports","homePhone","facsimileTelephoneNumber","l","manager",
"mobile","physicalDeliveryOfficeName","postalCode","sn","st","streetAddress",
"telephoneNumber","title","department","Description","c","co","countryCode",
"info","initials","ipPhone","pager","wWWHomePage","postOfficeBox","company"
foreach($Attribute in $CopyAttributes)
{
$ADUser.$Attribute = $ADContact.$Attribute
}
# Copying over the X500 address if it exists
if($ADContact.legacyExchangeDN)
{
$X500User = Get-MailUser -Identity $ADUser.DistinguishedName -DomainController $DomainController
$X500User.EmailAddresses += [Microsoft.Exchange.Data.CustomProxyAddress]("X500:$($ADContact.legacyExchangeDN)")

Set-MailUser -Identity $X500User -EmailAddresses $X500User.EmailAddresses -DomainController $DomainController
}

Set-ADObject -Instance $ADUser -Server $DomainController

# Loop through the groups and add the user to them
foreach($ADGroup in $ADContact.memberOf)
{
Add-ADGroupMember -Identity "$($ADGroup)" -Members $ADUser -Server $DomainController
}

Write-Host -ForegroundColor Yellow "$($TargetUser.Name) created."
$Status = "Success"
}
elseif($ADUser)
{
# Only a user is found
Write-Host -ForegroundColor Cyan "$($TargetUser.Name) already exists."
$Status = "Exists"
}
else
{
Write-Host -ForegroundColor Magenta "$($TargetUser.Name) not found!"
$Status = "Failed"
}

# Update the results
$ResultsRow.Result = $Status

# Clear the variables to prevent false positives on the next loop
$ADContact = $null
$ADUser = $null
$Status = $null

# Update the results table
$ResultsTable.Rows.Add($ResultsRow)
}

# Check if HTML results should be written out
if($ResultsFile)
{
# Build the style sheet for the page
$Style = "<style>"
$Style = $Style + "body{font-family: `"Century Gothic`"; font-size: 10pt;}"
$Style = $Style + "table{border-width: 1px; border-style: solid; border-color black; border-collapse: collapse; }"
$Style = $Style + "th{border-width: 1px; border-style: solid; border-color: black; background-color: #CBFEFF; }"
$Style = $Style + "td{border-width: 1px; border-style: solid; border-color: black; text-align: center}"
$Style = $Style + "</style>"

# LINQ will be used for easier custom formatting
Add-Type -AssemblyName System.Xml.Linq

# Convert the desired columns into HTML and convert it to XML
$xml = [System.Xml.Linq.XDocument]::Parse("$($ResultsTable | Select User,Result | ConvertTo-Html -Head $Style)")

# Define the namespace
if($Namespace = $xml.Root.Attribute("xmlns").Value)
{
$Namespace = "{{{0}}}" -f $Namespace
}

#
# $xml.Descendants().Value is not returning the values for some reason here
# Have to resort to alternate index discovery
#
#$wsIndex = [Array]::IndexOf($xml.Descendants("${Namespace}th").Value, "Result")
[Array]$xmlArray = $xml.Descendants("${Namespace}th")
for($i=0;$i -le $xmlArray.length-1;$i++)
{
#We're color coding the Results column
if($xmlArray[$i] -match "Result")
{
$wsIndex = $i
}
}

# Loop through each row and assign a text color and background color to the result
foreach($row in $xml.Descendants("${Namespace}tr"))
{

switch(@($row.Descendants("${Namespace}td"))[$wsIndex])
{
{"Success" -eq $_.Value} {$_.SetAttributeValue("style", "color: #006600; background-color: #F1FEFE;"); continue}
{"Failed" -eq $_.Value} {$_.SetAttributeValue("style", "color: #660000; background-color: #F1FEFE;"); continue }
{"Exists" -eq $_.Value} {$_.SetAttributeValue("style", "color: #4D6600; background-color: #F1FEFE;"); continue }
}
}

# Save to the desired file
$xml.Save("$pwd/$($ResultsFile)")
}
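
If you want to experiment with the LINQ coloring trick outside of the full script, here is a minimal standalone sketch against a tiny hand-built table. The markup, names, and color are just examples; since this sample has no xmlns attribute, the namespace dance from the script above isn’t needed here.

```powershell
# Minimal demo of styling table cells with System.Xml.Linq, independent of Exchange
Add-Type -AssemblyName System.Xml.Linq

# A tiny stand-in for the ConvertTo-Html output
$xml = [System.Xml.Linq.XDocument]::Parse("<table><tr><th>User</th><th>Result</th></tr><tr><td>jdoe</td><td>Success</td></tr></table>")

# Walk the data cells and style the ones holding a Success result
foreach($Cell in $xml.Descendants("td"))
{
    if($Cell.Value -eq "Success")
    {
        $Cell.SetAttributeValue("style", "color: #006600;")
    }
}

# $xml.ToString() now carries the style attribute on the Success cell
```

The same Descendants/SetAttributeValue pattern is what drives the color coding in the results report.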

Use safely!

Save those service accounts!

Have you ever woken up at 2:33 am to the phone ringing off the hook because some service is no longer running and it must be fixed right now? And the service isn’t running because someone reset the password on a service account last week? I’ve seen this scenario play out way more than it ever should. Let’s nip that problem of service accounts running wild right away.

<#
.SYNOPSIS
Checks all servers available in the domain for services running under service
accounts.
.DESCRIPTION
Find-AllServiceAccounts grabs a list of all servers from the current domain and
uses WMI calls to retrieve a list of services from the servers. It then loops
through the services checking for service accounts and writes any service
accounts found to a CSV specifically for that server.
.NOTES
You will be prompted for credentials when first running this.
#>
Import-Module ActiveDirectory
# Default system accounts to exclude
$ExcludedServices = "NT AUTHORITY\LocalService", "LocalSystem", "NT AUTHORITY\NETWORK SERVICE", "NT AUTHORITY\NetworkService"

# Will pop up a prompt to provide credentials for making WMI calls
# to the remote systems
$Credentials = Get-Credential

# Uses the LDAP search results for computers
$HostList = Get-ADObject -SearchBase (Get-ADRootDSE).defaultNamingContext -LDAPFilter "(&(objectCategory=computer)(|(operatingSystem=Windows Server*)(operatingSystem=Windows 2000 Server)))" -Properties Name
$HostName = Get-Content env:computername

# Pass a service name in and check to see if it matches the excluded
# service accounts that you've preconfigured
function Check-ServiceName($ThisServiceName)
{
<#
.SYNOPSIS
Check-ServiceName tests whether a service account is one of the default system
accounts
.DESCRIPTION
Check-ServiceName compares the supplied service account against the list of
excluded accounts in the $ExcludedServices variable. Returns $true if the
account is a default system account, $false otherwise.
.PARAMETER ThisServiceName
The service account name to check.
#>
  foreach($ExcludedName in $ExcludedServices)
  {
     if($ThisServiceName -eq $ExcludedName.ToLower())
     {
       return $true
     }
  }
  return $false
}

[array]$BadHostList = $null
$CSVCount = 0
# Our main loop, iterates through each system listed in
# the file and pulls all of their services
foreach($HostServer in $HostList)
{
  # Grab the list of services from the system
  if ($HostServer.Name -ne $HostName)
  {
    $ServiceList = get-WMIObject -Class win32_service -ComputerName $HostServer.name -Credential $Credentials -Property name,startname
  }
  # If it is the localhost then we need to exclude the credentials parameter
  else
  {
    $ServiceList = get-WMIObject -Class win32_service -ComputerName $HostServer.name -Property name,startname
  }

  # If the host could be contacted, loop through and check each service
  if($ServiceList)
  {
    [array]$CSVArray = $null

    foreach($ThisService in $ServiceList)
    {
      # StartName can be empty for some services, so guard before lowercasing
      $CheckResult = $true
      if ($ThisService.StartName)
      {
        $CheckResult = Check-ServiceName $ThisService.StartName.ToLower()
      }
      if ($CheckResult -eq $false)
      {
        Write-Host "Service: " $ThisService.name
        Write-Host "Account: " $ThisService.StartName
        $CSVArray += ,$ThisService | select Name,StartName    
      }
    }

    if ($CSVArray.count -gt 0)
    {
      Write-Host "Saving .csv for $($HostServer.Name)"
      Write-Host ""
      $CSVArray | Export-CSV "$($HostServer.Name).csv"
      $CSVCount++
    }
  }
  # The host could not be contacted, note it down
  else
  {
    Write-Host "$($HostServer.name) is unreachable."
    $BadHostList += $HostServer
  }
}

if($CSVCount -gt 1)
{
  Write-Host "Done, wrote out $($CSVCount) .csv files."
}
elseif($CSVCount -eq 1)
{
  Write-Host "Done, wrote out $($CSVCount) .csv file."
}
else
{
  Write-Host "No .csv files written out, no service accounts found."
}

if($BadHostList)
{
  foreach($TargetHost in $BadHostList)
  {
    Write-Host "$($TargetHost.Name) at $($TargetHost.DistinguishedName) was unreachable."
  }
}

What this does for you is run through all of your servers (assuming they will respond to WMI queries; you’ll want to check your firewall policies) and dump out a CSV for any server running services under service accounts, reporting which services those are and which accounts they use. This will save you much frustration during security audits, password reset frenzies, and just plain documentation, since you can quickly track down who will be impacted by a service account password reset.

Speaking of security audits and password resets, in an upcoming post I’ll show you how to easily update all of your service account passwords so that you can spend the rest of the day playing Animal Crossing rather than remoting from server to server to server …

Did this post help you out? Do you have any questions or a specific topic you’d like me to delve into? Just let me know, thanks!

Easy Ways to find your Mail Enabled Public Folders

It looks like some of you want an easy way to find your mail enabled public folders. Definitely something good to know, especially when you are planning out a migration or are being thrown into a new and probably fragile environment. The documentation is never up to date, of course, so you have to dig it out yourself. Here comes PowerShell to rescue you from manually slogging through it all!

$Servers = Get-PublicFolderDatabase
foreach($TargetServer in $Servers)
{
    Get-PublicFolder -Recurse -Server $TargetServer.Server | where {$_.MailEnabled -eq $true}
}

This will grab all of the mail enabled public folders from all of the servers in your organization. But perhaps you need to dig in further, say into the AD objects to fix things like missing homeMDB entries or other woes, or even just for plain documentation. This one-liner will do it for you.

Get-ADObject -SearchBase ( "CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext) -SearchScope OneLevel -Filter { (mail -like "*") -and (ObjectClass -eq "publicFolder")} | select Name
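
And if you want those results saved off for that documentation, a small variation on the one-liner will dump everything to a CSV; the output path and extra properties here are just my suggestions:

```powershell
# Same search as above, but keep a few extra attributes and export to CSV
$SearchBase = "CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext
Get-ADObject -SearchBase $SearchBase -SearchScope OneLevel -Filter { (mail -like "*") -and (ObjectClass -eq "publicFolder") } -Properties mail,homeMDB |
    Select-Object Name,mail,homeMDB |
    Export-Csv -Path .\MailEnabledPublicFolders.csv -NoTypeInformation
```

Pulling homeMDB along for the ride gives you a head start on spotting the missing-homeMDB problem discussed elsewhere on this blog.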

I hope this helps out a few of you out there.

5 Extremely Useful Tools to Simplify your Microsoft Exchange Life

Here’s what I find myself using in my day to day life in working with Exchange. If it weren’t for these then troubleshooting and automation would be a lot more difficult and I would find myself throwing my life away. Why waste time when you could be watching E3 game trailers instead?

  1. PowerShell! This one definitely has to take the first place spot in mention. Not that I have any particular order to how I’m listing everything. If it weren’t for PowerShell then management of even a single Exchange server would be much more tiresome. Just look at mass creating a number of new users. HR sends you an Excel sheet with all of their details, you save the pertinent bits out to a CSV, then just run it through a little PowerShell script:
    $NewUsers = Import-Csv -Path C:\Import\UserList.csv
    foreach($NewUser in $NewUsers)
    {
                    New-Mailbox -Name $NewUser.Name -Password $NewUser.Password -UserPrincipalName $NewUser.UPN -Alias $NewUser.Alias -Database $NewUser.TargetDatabase -DisplayName ($NewUser.FirstName + " " +$NewUser.LastName) -FirstName $NewUser.FirstName -LastName $NewUser.LastName -OrganizationalUnit $NewUser.OU -Phone $NewUser.Phone -ResetPasswordOnNextLogon:$true
    }

    Tada, creating new users has been simplified from hours of manual labor to a few minutes of CSV formatting and scripted importing.

  2. mxtoolbox.com This is a site that gets used often in troubleshooting. I can quickly check on the MX records for any domain including my own, run through a list of BLs to see if my domain is listed, and very importantly run diagnostics on my mail server’s external facing connectors to see what errors may come up. This is where I turn if I don’t have a way to telnet in from the outside. There are a number of other useful tools there as well though they don’t receive as much use as the BL and diagnostics.
  3. testexchangeconnectivity.com A very important site if you are running Exchange migrations. The ActiveSync and Outlook Anywhere tests help greatly for verifying all of your autodiscover functionality and your other CAS services. The Lync tests are great as well if you have that as part of your organization. It is so much easier than having to call up some external user and have them test such and such functionality, again.
  4. migrationwiz.com Speaking of Exchange migrations, I’ve found this particular service to be extremely useful when working with migrations that are not Exchange to Exchange. MigrationWiz will login to the target service and transfer mailboxes to your Exchange server. It is as simple as that. No annoying process of exporting and importing PSTs. Furthermore since incremental transfers are available you can spread out your migration without needing to do an overnight cutover.
  5. telnet and nslookup Since these are small I figured I could combine them into one entry. These are also some extremely vital tools in your Microsoft Exchange Swiss Army knife. I’ve already talked about the usefulness of telnet in previous posts so I won’t bore you all over again. Nslookup is a fantastic way to quickly verify records. Such as comparing your internal DNS records to your external records. Run nslookup autodiscover.contoso.com and nslookup autodiscover.contoso.com 8.8.8.8. The first will check what it is resolving to internally while the second will query Google’s public DNS for the external autodiscover record. MX record lookups are simple as well, nslookup -type=mx contoso.com 8.8.8.8.

Any favorite tools that you find yourself using over and over for your Exchange servers or migrations? Please let me know in the comments.

Mail Queuing for Mail Enabled Public Folders?

Feeling pretty good about yourself, you come into the office and sit down to get some work done. After all, someone has to retrieve that Amulet of Yendor, so it might as well be you, right? Unfortunately it doesn’t look like today will be your day. The warnings are piling up that your mail queue is getting rather large, and some users have been asking where the daily messages in their public folders are. Taking a peek at the queue you see a large and growing number of emails in your Unreachable Domain queue. But your public folder database looks like it is mounted OK. Not cool.

What broke?

This is a fairly common scenario I’ve run into after migrating off of Exchange 2003. Your public folders migrated over successfully and mail had been flowing for a while, but as soon as you took down the 2003 server the mail started queuing up for your mail enabled public folders. Or maybe you went in and started doing some manual cleanup with ADSI Edit. Sometimes even just the uninstall of Exchange 2003 has some unexpected side effects. You remembered to do a backup of your AD prior to that major change, right? There’s a good chance that your public folder hierarchy is missing.

Great, so can we fix this?

The good news is that there is a road to recovery. Let’s check on things first, is your public folder hierarchy actually missing? Pop open the good old Exchange Management Shell and let’s check on a few things.

Import-Module ActiveDirectory
$SearchPath = "CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Public Folders"}

Hopefully you will get a result such as below.

DistinguishedName             Name                          ObjectClass                   ObjectGUID
-----------------             ----                          -----------                   ----------
CN=Public Folders,CN=Folde…   Public Folders                msExchPFTree                  f6a3cbd4-10e5-452d-9abe-44…

If you get a “directory object not found” error, then your public folder hierarchy is missing and we’ll have to recreate it. That’s step one on our way to saving the day. Let’s step back one further and make sure the Folder Hierarchies container itself is there.

$SearchPath = "CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Folder Hierarchies"}

If you get no results then that is missing as well. If you do, then at least our container is there and we just need to create the hierarchy. Here’s the bit of PowerShell code that will fix up the missing public folder hierarchy.

<#
.SYNOPSIS
Recreates your public folders hierarchy
.DESCRIPTION
Checks your AD for whether the Folder Hierarchies container exists and the 
Public Folders hierarchy. If one does not exist then it is created.
#>
Import-Module ActiveDirectory

# Build path to the container
$SearchPath = "CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$PFContainer = Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Folder Hierarchies"}

# If it does not exist then create the container
if(!$PFContainer)
{
    New-ADObject -Name "Folder Hierarchies" -Type msExchPublicfolderTreeContainer -Path $SearchPath
    Write-Host "Folder Hierarchies container created."
}
else
{
    Write-Host "Folder Hierarchies container exists already." -ForeGroundColor Yellow
}

# Build path for the public folder tree
$SearchPath = "CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$PFHierarchy = Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Public Folders"}

# If it does not exist then create it
if(!$PFHierarchy)
{
    New-ADObject -Name "Public Folders" -Type msExchPFTree -Path $SearchPath -OtherAttributes @{msExchPFTreeType="1"}
    Write-Host "Public Folders hierarchy created."
}
else
{
    Write-Host "Public Folders hierarchy already exists." -ForeGroundColor Yellow
}

# Set to our PF hierarchy DN
$PFHierarchy = "CN=Public Folders," + $SearchPath

# DN for our databases
$SearchPath = "CN=Databases,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$PFDatabases = Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {objectClass -eq "msExchPublicMDB"} -Properties msExchOwningPFTree

# Grab all of the public folder databases and loop through them
if($PFDatabases)
{
    foreach($PFDatabase in $PFDatabases)
    {
        $PFDatabase.msExchOwningPFTree = $PFHierarchy
        Set-ADObject -Instance $PFDatabase
        Write-Host "Fixed database $($PFDatabase.Name)"
    }
}
# Or if no public folder databases exist you have further problems ...
else
{
    Write-Host "No Public Folder Databases found." -ForeGroundColor Yellow
}

But you’ll find that your work is not quite done yet. Your public folders are missing their homeMDB attribute. Or this could have been your problem all along, without any need to recreate the public folder hierarchy. You can verify whether this is the problem with a quick search:

$PFPath = "CN=Public Folders,CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$SearchBase = "CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext 
Get-ADObject -SearchBase $SearchBase -SearchScope OneLevel -Filter { (homeMDB -notlike "*") -and (ObjectClass -eq "publicFolder")}

If you don’t see anything, then you know your mail-enabled public folders are fine. But most likely you’ll get a few results. To fix those up quickly, run this script.

<#
.SYNOPSIS
Fix any missing homeMDB attributes on public folders.
.DESCRIPTION
The script runs an LDAP search for all mail enabled public folder objects and
sets the homeMDB attribute to the LDAP path to your public folder hierarchy.
.NOTES
The script needs to be run in all domains requiring the fix.
#>
Import-Module ActiveDirectory
# Build the DN to the public folders hierarchy
$PFPath = "CN=Public Folders,CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
# Build the DN to the public folder objects
$SearchBase = "CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext
# Search for all PFs with a blank homeMDB
$TargetPFs = Get-ADObject -SearchBase $SearchBase -SearchScope OneLevel -Filter { (homeMDB -notlike "*") -and (ObjectClass -eq "publicFolder")} -Properties homeMDB

# Fix all of the public folders
if($TargetPFs)
{
  foreach($TargetPF in $TargetPFs)
  {
    Write-Host "Fixing $($TargetPF.Name)"
    $TargetPF.homeMDB = $PFPath
    Set-ADObject -Instance $TargetPF
  }
}
# Good news (maybe), no public folders that require fixing
else
{
  Write-Host "No public folders missing homeMDB."
}

Fantastic, are we done yet?

Nearly! Just give the mail queues a kick and you should see the emails quickly flushing out into your public folder databases. Now you can get back to your day of NetHack knowing that all is well with the world once again.
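If you want to give those queues their kick from the Exchange Management Shell, a minimal sketch looks like this (the server name is a placeholder for your own hub transport server):

```powershell
# Force a retry of every queue currently sitting in a Retry state.
# HUBSERVER is a placeholder; substitute your own transport server name.
Retry-Queue -Filter {Status -eq "Retry"} -Server HUBSERVER
```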

The Welcome News of the Death of SBS

SBS wanders off into the black abyss

Somehow I missed this bit of news last year: the death of the Small Business Server. Possibly because I haven’t had to deal with SBS in a serious manner for a while now. Anyhow, from the point of view of an Exchange administrator it is welcome news that SBS 2011 is the end of the line. I have always found SBS to be a pain to work with. The migration wizards were prone to breaking in mysterious ways. The POP3 connector was horrid to troubleshoot and would choke on a seemingly normal email all too easily. The most annoying thing was being forced to use the wizards for most tasks, and suffering the consequences if someone else did not use a wizard.

For small businesses there don’t seem to be any genuine alternatives, unfortunately. Several Linux-based alternatives are presented over at The VAR Guy, but I don’t see a full Linux solution being a comfortable route, at least for most of the small businesses I’ve worked with in the past. There are a few odd ones out that embrace any alternative to Microsoft, of course. Windows Server 2012 Essentials paired with the Zimbra Collaboration Server virtual appliance may be a better compromise for MSPs that are reluctant to touch a full Linux alternative for an on-site Exchange substitute. As for me, I’ll just be happy that I won’t have to worry about Exchange 2013 shoehorned into an SBS.

452 4.3.1 Insufficient System Resources – Continued Telnet Training

This is a problem that crops up fairly often if you have a lot of disparate Exchange servers out there without a solid monitoring solution in place, which is very common for MSPs. It also requires actually having somebody paying attention to those monitoring alerts. Nobody likes paying attention to monitoring alerts; there are reams of rules dedicated to keeping them out of sight in Outlook clients around the world. But that makes for an entirely separate topic/rant. The symptoms of this problem are reports from end users that they don’t seem to be receiving any email, or at least any external email, while sending outbound email works just fine.

This is the point where a quick telnet test will zero you in on what is going on. Continuing with what you learned from the post on Essential Exchange Troubleshooting – Send Email via Telnet, you will want to telnet into the server from outside the organization. You may immediately get a response of:

452 4.3.1 Insufficient System Resources

But more likely you’ll receive a typical SMTP banner such as

220 myserver.contoso.com Microsoft ESMTP MAIL Service ready at Mon, 27 May 2013 08:19:44 -0700

If so, then I recommend continuing through with sending an email via telnet. The next likely place you’ll encounter the error is when you issue the RCPT TO: command, to which you receive a response of:

452 4.3.1 Insufficient System Resources

The fix for this is fairly simple. Check your Exchange server for low free disk space on the partition where your queues reside, which will most likely be the partition with your Exchange installation. In single-server Exchange 2007/2010 installations, I find that what has most often eaten all of your space is the IIS log files. When setting up your Exchange server it is a good idea to make sure that you have an archiving/recycling policy in place for your IIS logs to keep them from swallowing the entire partition over time. BES installations have the same problem with their log files as well.
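A quick way to eyeball free space from PowerShell, so you can spot the starved partition at a glance:

```powershell
# List each filesystem drive with its free space rounded to GB.
Get-PSDrive -PSProvider FileSystem |
    Select-Object Name, @{Name="FreeGB"; Expression={[math]::Round($_.Free / 1GB, 2)}}
```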

“Back pressure” is the key phrase you’ll want to keep in mind with this. I’ll delve into the term in a later post.

More to the topic at hand, here’s an extra PowerShell fix to keep those IIS log files under control. It can also be easily customized for BES logs or other logging-happy programs, or even just for keeping your temp files cleaned up regularly. You’ll want to set it to run as a scheduled task on a daily, weekly or monthly basis depending upon your organization’s policies.

# CleanIISLogs.ps1
# Find and remove files older than $days
# Set $LogPath to where the IIS logs you want to recycle are kept
# 

$days = 31
$LogPath = "C:\inetpub\logs\LogFiles\W3SVC1"
# Find the target date
$startdate = Get-Date
$startdate = $startdate.AddDays(-$days)

# Clean the directory of log files older than the target date
Get-ChildItem -Path $LogPath -Recurse | Where-Object {!$_.PSIsContainer -and $_.LastWriteTime -lt $startdate} | Remove-Item -Confirm:$false
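To schedule it, something along these lines works from an elevated prompt (the task name, schedule and script path are all just examples; adjust to taste):

```powershell
# Create a weekly scheduled task that runs the cleanup script as SYSTEM every Sunday at 3 AM.
# The script path is an example; point it at wherever you saved CleanIISLogs.ps1.
schtasks /Create /TN "CleanIISLogs" /SC WEEKLY /D SUN /ST 03:00 /RU SYSTEM /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\CleanIISLogs.ps1"
```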

Is this post helpful to you or is there something you would like me to go into greater detail on? Please let me know, thanks.

The Number One Easy Way to Setup a Failed Migration

It surprises me how much I run across this one but then again I have been guilty of it as well.

I eat backups! Garr!

There is a very important first step that I find skipped over and forgotten quite often when it comes to running an Exchange migration, or really any other kind of migration. Have you taken a system state backup of AD yet? No? Then you’re just spinning the bottle and hoping it doesn’t end up with you getting cozy with Microsoft’s support, hoping that they can fix your screwed-up Active Directory.

Don’t make the mistake of assuming backups are working

I made this mistake once upon a time, on one of the first Exchange migrations I ran. I didn’t feel like being bothered to take a backup of AD as the server was really slow, and I was confident that the nightly backups had taken care of everything anyhow. I didn’t bother to validate this. So I went directly into running the migration, and at first everything looked like it was running great. But part of the way through I found that AD replication had broken, and that it possibly had been that way for a while. It would have been easy to roll back to an AD backup, correct the problem and then retrace my steps, but unfortunately that wasn’t an option, because I hadn’t taken a backup. The nightly backups hadn’t worked in several months either. That led to a call with Microsoft later on, and then having to spend even more hours fixing things manually via ADSIEdit when they couldn’t figure it out.

I don’t want to be the one cleaning up after you

It is a very simple step to take at the very beginning: just grab a backup before you run your first setup.com /PrepareAD. While you’re at it, why don’t you test the backup of your current mail server and make sure that it is working OK as well. Trust me on this. You don’t want to be the guy explaining to your boss that the data is gone because your only valid backup is from 3 years ago. Your backups are working, right? You might want to double check, just to be sure. I recommend a mock restore for that extra bit of assurance.
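If the Windows Server Backup feature is installed on the domain controller, grabbing that system state backup can be a one-liner (E: here is a placeholder for whatever target volume you have available):

```powershell
# Take a one-off system state backup of this DC before running setup.com /PrepareAD.
# -backupTarget is a placeholder volume; -quiet suppresses the confirmation prompt.
wbadmin start systemstatebackup -backupTarget:E: -quiet
```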

The Magic in Troubleshooting the Black Box

Sometimes at work I feel like a magician.
This could be you, the magician at work! Now if only Exchange 2003 looked so good …

For instance, some Microsoft Exchange or Hyper-V clustering issue bubbles up through the tiers of help desk, engineers and senior engineers, having had hours of troubleshooting and myriad eyes thrown at it without making a lick of progress. Then the issue gets slid onto my plate and 15-30 minutes later I have everything fixed up and a client looking much happier as he gets back to work. Sometimes they’ll ask why I was able to fix the issue so swiftly when all these other people spent hours on it with no resolution. That cues what I fondly call the magician moment.

I always wanted to be a magician when I was a kid. This is one way I get a small taste of it. The magician moment is when the audience is wondering how the rabbit was pulled out of the hat, or perhaps how the levitating woman disappeared. Now you can explain to the client how all of the magic happened and sometimes if they’re a truly technical client they will be very interested in the explanation. But the majority of the time I find that they prefer to think that you just have some special magic that the rest of the world does not have access to. It is a pretty good feeling most of the time.

Now there is a secret to being able to pull this off

The trick to doing it time and time again, even when the odds are stacked against you, is the trail of logic: being able to follow it through the (logical, not physical) closed-off black boxes along that trail. In computers, as in science, every action produces a reaction. So the first step to setting yourself up for success is to make sure you know everything you can about how the system at hand, and the systems connected to it, operate. You can’t just hide behind your known system, be it Exchange or Hyper-V or Active Directory, and declare all other territories unknown and not your problem. There has been many a time I have found the solution to a problem with a timely packet capture in Wireshark or by checking the routing topology on the local router. That falls in the area of the networking team, but by applying my own knowledge of networking I was able to take a huge shortcut and point the problem either directly at the network (with solid evidence) or directly at the server, and then fix it.

Love the black box

The point is that magic can happen when you break down the black boxes around you. You can’t completely eliminate them, but if you can shrink them then you can get an idea of how things are functioning in the neighboring black box. Let’s throw in another example. Recently a coworker of mine came to me with a problem he was stuck on. He’s great with server problems but he makes it very clear that networking is not his cup of tea. A server had successfully gone through a P2V and was up and running in the cluster, but a certain service was inaccessible remotely no matter how he looked at things. He assumed that things were incorrectly configured on the networking side, beyond the server. There were several ways he could have tested that theory, none of which would involve looking into the black box of the networking equipment, but which do involve knowing what goes into it and what is expected to come out. A ping test would have verified routing, and a probe of the port would have told him that it was not open on the server. Quickly checking on the server showed that the port was not open in the server’s firewall. An easy check, but since he considered the black box not his problem he was not able to reach that conclusion easily and swiftly.
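Those two checks take under a minute from any machine on the remote side. A sketch, with the server name and port invented for the example:

```powershell
# Verify routing to the server first (ICMP), then probe the service port directly.
# appserver01 and port 8080 are placeholders for the affected server and service.
Test-Connection -ComputerName appserver01 -Count 2
$tcp = New-Object System.Net.Sockets.TcpClient
$tcp.Connect("appserver01", 8080)   # throws if the port is unreachable or closed
$tcp.Close()
```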

So in summary, make sure you’re always following logic in your troubleshooting and that you are always learning both within and outside your realm of expertise. That will set you on the track of the magician as well. Perhaps you’ll be the next one asked how that rabbit came out of the hat.

Do you have any magician moment stories as well? Please share them in the comments as I would love to hear them.
