Very handy PowerShell commands to make your life easy breezy

When I get some down time, which is rare around here, I go through some of my PowerShell RSS feeds. Ed Wilson always has some good posts to read through, and I felt like I should pass along this gem from him: The Five Best PowerShell Cmdlets. I was curious so I bit, and discovered that somehow I have been missing out on Out-GridView my entire life! It does require the PowerShell ISE feature to be installed though, so it is not quite out-of-the-box useful.
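If you have not met Out-GridView before, it takes whatever objects you pipe to it and pops them into a sortable, filterable grid window. A quick taste:

```powershell
# Browse services in a sortable, filterable grid window
Get-Service | Out-GridView

# With -PassThru (PowerShell 3.0+), the rows you highlight in the grid
# are sent back down the pipeline when you click OK
Get-Service | Out-GridView -PassThru | Format-List Name, Status
```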

Rescue that Workflow Manager from certain doom, or at least get that OutboundCertificate fixed!

SharePoint 2013 and Workflow Manager have always proven to be a winning combination for late nights of

Doom I tell you. Doom!

troubleshooting involving copious amounts of coffee and a complete loss of sleep. Or whatever may be your preferred means of caffeine intake. Plus your spouse may not appreciate yet another late night without you at home. Workflow Manager is a frustrating, burdensome beast, and it is not on the fun sunny side of life.

In this particular instance we need to resuscitate Workflow Manager from its current undead state. It pretends to be running and responding to your commands. The users believe otherwise, and want to lynch you because their workflows are showing angry messages about no longer being able to talk to the server. Looking at the server, you find that the management databases are shot and cannot be worked with in their current state. In my recent case it was due to a fabulous mixture of expired certificates, revoked certificates, and certificate templates that “update” your current certificates to ones that are incompatible with Workflow Manager. This restore is also a method that can be used to replace the OutboundCertificate in the Workflow Manager farm if Set-WFNextOutboundCertificateReference and Set-WFNextOutboundCertificateAsCurrent are not working for you.

Microsoft has a pretty decent article on disaster recovery for Workflow Manager 1.0. The problem I found with it is that it is incomplete, which is why I am putting together this post. The topology we are working with in this scenario is a single SharePoint 2013 server, a separate single SQL server, and a separate single Workflow Manager server. This scenario also requires that you have either working backups of your Workflow Manager databases, or that the WFManagementDB and/or SbManagementDB are the only damaged databases. You do have valid backups of everything, right? Go check again, right now, just to be safe. If you are doing a restore on a farm of multiple Workflow Manager servers then you may need a few extra steps to point those servers at the new databases. Also, check that your certificates are up to date and that you know which service accounts are in use on your Workflow Manager farm and what their passwords are.
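If you want that backup sanity check scripted, the msdb backup history will tell you when each database was last backed up. A minimal sketch, assuming the SQLPS (or SqlServer) module is available and your databases use the default WF*/Sb* names:

```powershell
# List the most recent full backup date for each Workflow Manager / Service Bus database
Invoke-Sqlcmd -ServerInstance "sql.jefferyland.com" -Query @"
SELECT database_name, MAX(backup_finish_date) AS LastFullBackup
FROM msdb.dbo.backupset
WHERE type = 'D'
  AND (database_name LIKE 'WF%' OR database_name LIKE 'Sb%')
GROUP BY database_name
"@
```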

If you’re skipping ahead to the details on how to do this, here is where you need to start paying attention!

First off, we need to uninstall Workflow Manager. Hopefully an easy enough step. If you’re installing 1.0 Refresh and you’re running Service Bus 1.0, this would be a good time to move to Service Bus 1.1; it worked flawlessly for me when I did this. If that is the direction you are going, uninstall Service Bus 1.0 as well.

Next step! Let’s install Service Bus 1.1 followed by Workflow Manager 1.0 Refresh. Hopefully that goes smoothly for you.

Now we need to get the Service Bus farm up and running. Check your SQL server and make sure you remove your SbManagementDB and your WFManagementDB, in case those still exist. Alternatively, when rebuilding you could name the databases something else, but I don’t see much point to that as it will just cause confusion further down the line. Identify the service account you are using for Service Bus and then we’ll get the database recreated. Pop open PowerShell and run:

Import-Module ServiceBus

Restore-SBFarm -RunAsAccount DOMAIN\servicebussvc -GatewayDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=SbGatewayDatabase;Integrated Security=SSPI;Asynchronous Processing=True" -SBFarmDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=SbManagementDB;Integrated Security=SSPI;Asynchronous Processing=True" -FarmCertificateThumbprint 814AA8261BE6F0DD9031F802A4D26EBAD020770D -EncryptionCertificateThumbprint 814AA8261BE6F0DD9031F802A4D26EBAD020770D

That will get your replacement SbManagementDB created. The output of a successful run will look something like the following. And don’t you love how a command this critical defaults to Yes?

This operation will restore the entire service bus farm
Are you sure you want to restore the service bus farm?
[Y] Yes [N] No [S] Suspend [?] Help (default is "Y"):
FarmType : SB
SBFarmDBConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=SbManagementDB;Integrated
Security=True;Asynchronous Processing=True
ClusterConnectionEndpointPort : 9000
ClientConnectionEndpointPort : 9001
LeaseDriverEndpointPort : 9002
ServiceConnectionEndpointPort : 9003
RunAsAccount : DOMAIN\servicebussvc
AdminGroup : BUILTIN\Administrators
GatewayDBConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=SbGatewayDatabase;Integrated
Security=True;Asynchronous Processing=True
HttpsPort : 9355
TcpPort : 9354
MessageBrokerPort : 9356
AmqpsPort : 5671
AmqpPort : 5672
FarmCertificate : Thumbprint: 814AA8261BE6F0DD9031F802A4D26EBAD020770D, IsGenerated: False
EncryptionCertificate : Thumbprint: 814AA8261BE6F0DD9031F802A4D26EBAD020770D, IsGenerated: False
Hosts : {}
RPHttpsPort : 9359
RPHttpsUrl :
FarmDNS :
AdminApiUserName :
TenantApiUserName :
BrokerExternalUrls :

The Service Bus farm has been successfully restored.

Note that it will complain if SbManagementDB already exists, so you will have to delete it or give this one a new name. Now we’ll reconnect the SbGatewayDatabase.

Restore-SBGateway -GatewayDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=SbGatewayDatabase;Integrated Security=SSPI;Asynchronous Processing=True" -SBFarmDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=SbManagementDB;Integrated Security=SSPI;Asynchronous Processing=True"

This operation will restore the Service Bus gateway database. This may require upgrading of gateway database and
message container databases.
Are you sure you want to restore the Service Bus gateway database?
[Y] Yes [N] No [S] Suspend [?] Help (default is "Y"):
Re-encrypting the global signing keys.
The following containers database has been restored:
WARNING: Failed to open a connection to the following dB: ''
WARNING: The database associated with container ‘1’ is not accessible. Please run Restore-SBMessageContainer -Id 1
-DatabaseServer <correct server> -DatabaseName <correct name> to restore container functionality.
Id : 1
Status : Active
Host :
DatabaseServer :
DatabaseName :
ConnectionString :
EntitiesCount : 13
DatabaseSizeInMB : 0

Restore-SBGateway : The operation has timed out.
At line:1 char:1
+ Restore-SBGateway -GatewayDBConnectionString “Data Source=sql.jefferyland.com; …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Restore-SBGateway], SqlCommandTimeoutException
+ FullyQualifiedErrorId : Microsoft.Cloud.ServiceBus.Common.Sql.SqlCommandTimeoutException,Microsoft.ServiceBus.Co
mmands.RestoreSBGatewayCommand

Do not be alarmed by the scary messages in there. I was alarmed at first, but apparently everything went well. Next, check your SQL server for SBMessageContainer* databases; according to Microsoft’s documentation you’ll need to run this command for each one, though judging by the output of the command I ran it wasn’t strictly necessary.

Restore-SBMessageContainer -Id 1 -SBFarmDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=SbManagementDB;Integrated Security=SSPI;Asynchronous Processing=True" -ContainerDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=SBMessageContainer01;Integrated Security=SSPI;Asynchronous Processing=True"

Id : 1
Status : Active
Host :
DatabaseServer : sql.jefferyland.com
DatabaseName : SBMessageContainer01
ConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=SBMessageContainer01;Integrated
Security=True;Asynchronous Processing=True
EntitiesCount : 13
DatabaseSizeInMB : 48.6875

All entities are up to date. No changes were made to entities.
Please run Start-SBHost.

Now we need to add our host to the Service Bus farm.

Add-SBHost -SBFarmDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=SbManagementDB;Integrated Security=SSPI;Asynchronous Processing=True" -RunAsPassword (ConvertTo-SecureString -Force -AsPlainText 'password!') -EnableFirewallRules:$true

FarmType : SB
SBFarmDBConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=SbManagementDB;Integrated
Security=True;Asynchronous Processing=True
ClusterConnectionEndpointPort : 9000
ClientConnectionEndpointPort : 9001
LeaseDriverEndpointPort : 9002
ServiceConnectionEndpointPort : 9003
RunAsAccount : DOMAIN\servicebussvc
AdminGroup : BUILTIN\Administrators
GatewayDBConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=SbGatewayDatabase;Integrated
Security=True;Asynchronous Processing=True
HttpsPort : 9355
TcpPort : 9354
MessageBrokerPort : 9356
AmqpsPort : 5671
AmqpPort : 5672
FarmCertificate : Thumbprint: 814AA8261BE6F0DD9031F802A4D26EBAD020770D, IsGenerated: False
EncryptionCertificate : Thumbprint: 814AA8261BE6F0DD9031F802A4D26EBAD020770D, IsGenerated: False
Hosts : {Name: workflow.jefferyland.com, Configuration State: HostConfigurationCompleted}
RPHttpsPort : 9359
RPHttpsUrl : https://workflow.jefferyland.com:9359/
FarmDNS :
AdminApiUserName :
TenantApiUserName :
BrokerExternalUrls :

We’ve finished up the Service Bus farm, hopefully successfully, so now we’re ready for the Workflow Manager farm. Fighting!

This can get a little messy if you’re running Service Bus 1.1, as there is a buggy cmdlet. If you’re not using Service Bus 1.1, or you do not receive an error like

Could not load file or assembly
'Microsoft.ServiceBus, Version=1.8.0.0, Culture=neutral,
PublicKeyToken=31bf3856ad364e35' or one of its dependencies.
The system cannot find the file specified.

then you can skip the following. If you are using Service Bus 1.1, we need to work around a call to an old Service Bus assembly in one of the cmdlets. Thanks to these posts, http://www.wictorwilen.se/issue-when-installing-workflow-manager-1.0-refresh-using-powershell and https://carolinepoint.wordpress.com/2012/07/10/sharepoint-2010-powershell-and-bindingredirects/, we have a valid workaround.

Create or edit a file named C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe.config and paste the following into it:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.ServiceBus"
                          publicKeyToken="31bf3856ad364e35"
                          culture="en-us" />
        <bindingRedirect oldVersion="1.8.0.0" newVersion="2.1.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

Then restart your PowerShell session for the redirect to take effect. You may want to undo this change after you’re done restoring the farm, just to be safe.

Continuing on with the farm build, run the following.

Import-Module WorkflowManager

Restore-WFFarm -InstanceDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=WFInstanceManagementDB;Integrated Security=SSPI;Asynchronous Processing=True" -ResourceDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=WFResourceManagementDB;Integrated Security=SSPI;Asynchronous Processing=True" -WFFarmDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=WFManagementDB;Integrated Security=SSPI;Asynchronous Processing=True" -OutboundCertificateThumbprint 814AA8261BE6F0DD9031F802A4D26EBAD020770D -EncryptionCertificateThumbprint 814AA8261BE6F0DD9031F802A4D26EBAD020770D -SslCertificateThumbprint 814AA8261BE6F0DD9031F802A4D26EBAD020770D -InstanceStateSyncTime (Get-Date) -ConsistencyVerifierLogPath "C:\temp\wfverifierlog.txt" -RunAsAccount DOMAIN\workflowsvc -Verbose

A successful run through should get you output similar to this:

VERBOSE: [5/14/2015 11:56:58 PM]: Created and configured farm management database.
VERBOSE: [5/14/2015 11:56:58 PM]: Created and configured Workflow Manager resource management database.
VERBOSE: [5/14/2015 11:56:58 PM]: Created and configured Workflow Manager instance management database.
VERBOSE: [5/14/2015 11:56:58 PM]: Configuration added to farm management database.
VERBOSE: [5/14/2015 11:56:58 PM]: Workflow Manager configuration added to the Workflow Manager farm management
database.
VERBOSE: [5/14/2015 11:56:58 PM]: New-WFFarm successfully completed.
FarmType : Workflow
WFFarmDBConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=WFManagementDB;Integrated
Security=True;Asynchronous Processing=True
RunAsAccount : DOMAIN\workflowsvc
AdminGroup : BUILTIN\Administrators
Hosts : {}
InstanceDBConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=WFInstanceManagementDB;Integrated
Security=True;Asynchronous Processing=True
ResourceDBConnectionString : Data Source=sql.jefferyland.com;Initial Catalog=WFResourceManagementDB;Integrated
Security=True;Asynchronous Processing=True
HttpPort : 12291
HttpsPort : 12290
OutboundCertificate : Thumbprint: 814AA8261BE6F0DD9031F802A4D26EBAD020770D, IsGenerated: False
Endpoints : {}
SslCertificate : Thumbprint: 814AA8261BE6F0DD9031F802A4D26EBAD020770D, IsGenerated: False
EncryptionCertificate : Thumbprint: 814AA8261BE6F0DD9031F802A4D26EBAD020770D, IsGenerated: False

This will get our WFManagementDB recreated as well. Time to add the host back in!

Add-WFHost -WFFarmDBConnectionString "Data Source=sql.jefferyland.com;Initial Catalog=WFManagementDB;Integrated Security=SSPI;Asynchronous Processing=True" -RunAsPassword (ConvertTo-SecureString -Force -AsPlainText 'password!') -EnableFirewallRules:$true

This should have your farm up and running. Let’s check the status.

Get-WFFarmStatus

HostName ServiceName ServiceStatus
--------                 -----------             -------------
workflow.jefferyland.com WorkflowServiceBackend Running
workflow.jefferyland.com WorkflowServiceFrontEnd Running

Restoration is done! This is where Microsoft’s documentation leaves you hanging: you still need to reconnect the farm with SharePoint.

Register-SPWorkflowService -SPSite "https://sharepoint.jefferyland.com/" -WorkflowHostUri "https://workflow.jefferyland.com:12290" -AllowOAuthHttp -Force

Your workflows should now be showing up once again, but we’re not done yet; we need to perform some maintenance on the SharePoint server. First, clean up the old certificates, using the thumbprint of the old certificate as your filtering criteria:

Get-SPTrustedRootAuthority | ?{$_.Certificate -match "BF5CA00B6A639FE5B7FF5688C9A38FEBFBF03552"} | Remove-SPTrustedRootAuthority -Confirm:$false

Next we need to run some jobs to update the security token, otherwise you’ll get an HTTP 401 invalid JWT token error. Alternatively, you could wait until after midnight for the timer jobs to run themselves, but I’m pretty sure that would not be the healthiest decision here.

In Central Administration, go to Monitoring -> Timer Jobs -> Job Definitions and run these jobs:
Refresh Trusted Security Token Services Metadata feed.
Workflow Auto Cleanup
Notification Timer Job c02c63c2-12d8-4ec0-b678-f05c7e00570e
Hold Processing and Reporting
Bulk workflow task processing
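If you would rather not click through Central Administration, the same jobs can be kicked off from a SharePoint Management Shell. A sketch that matches on display names, which may vary slightly between farms (and note the Notification Timer Job carries a farm-specific GUID, so match that one separately):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Kick off the token/workflow maintenance jobs by display name
$jobNames = "Refresh Trusted Security Token Services Metadata feed",
            "Workflow Auto Cleanup",
            "Hold Processing and Reporting",
            "Bulk workflow task processing"
Get-SPTimerJob | Where-Object { $jobNames -contains $_.DisplayName } | Start-SPTimerJob
```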

Now check in on your workflows. They should be running nice and healthy! That wraps up this post on rescuing your Workflow Manager farm; hopefully it saves you from losing a night or two of sleep.

Copying Receive Connectors Hither and Yon

If you have done a number of Exchange migrations, or have a large number of servers to migrate in a single migration, I am sure you have run into the pain of replicating receive connectors to the new server. There are lots of settings to copy down and move over, plus the headache of explicit permissions granted on the connector in the case of relays or other special-use connectors. That can waste a lot of time you would much rather spend on the finale of Sherlock season 3. Let’s see if we can simplify that today with this script, Copy-ReceiveConnector. You call the script as follows:

Copy-ReceiveConnector -SourceConnector "EXCHANGE\Alternate Receive Connector" -DestinationServer NEWEXCHANGE -DomainController dc01

This will create a new receive connector on the destination server with all of the settings specified on the old receive connector. It will then loop through all of the non-inherited permissions on the connector and copy those over. You can also specify a new name for the connector via -Name. On to the code.

<#
.SYNOPSIS
Copy-ReceiveConnector - Copies a receive connector from a source server to a destination server
.DESCRIPTION
Takes the source receive connector and creates a copy on the destination server with values populated from the source receive connector.
.PARAMETER SourceConnector
Identity of the source receive connector
.PARAMETER DestinationServer
Server name of the destination Exchange server
.PARAMETER DomainController
Target domain controller for setting the configuration
.PARAMETER Name
Optional new name for the connector
.EXAMPLE
Copy-ReceiveConnector -SourceConnector "EXCHANGE\Alternate Receive Connector" -DestinationServer NEWEXCHANGE -DomainController dc01
#>
[CmdletBinding()]
param(
[Parameter(Mandatory=$True)][string]$SourceConnector,
[Parameter(Mandatory=$True)][string]$DestinationServer,
[Parameter(Mandatory=$True)][string]$DomainController,
[Parameter(Mandatory=$False)][string]$Name
)
Import-Module ActiveDirectory
# Get the values for the old connector
$Source = Get-ReceiveConnector -Identity $SourceConnector
# Update the name if specified
if($Name)
{
 $Source.Name = $Name
}
# Custom permission group is not allowed in Exchange 2013 so we need to remove it
# Nothing to be concerned about since the ACEs are explicitly copied over.
$TempArray = @($Source.PermissionGroups) -split ", " | Select-String -Pattern "Custom" -NotMatch
$TempString = "$($TempArray)"
$Source.PermissionGroups = $TempString.Replace(" ", ", ")
# Copy all the values over to create the new connector on the 2013 server
New-ReceiveConnector -Bindings $Source.Bindings -Server $DestinationServer -DomainController $DomainController -Name $Source.Name -RemoteIPRanges $Source.RemoteIPRanges -AdvertiseClientSettings $Source.AdvertiseClientSettings -AuthMechanism $Source.AuthMechanism -Banner $Source.Banner -BinaryMimeEnabled $Source.BinaryMimeEnabled -ChunkingEnabled $Source.ChunkingEnabled -Comment $Source.Comment -ConnectionInactivityTimeout $Source.ConnectionInactivityTimeout -ConnectionTimeout $Source.ConnectionTimeout -DefaultDomain $Source.DefaultDomain -DeliveryStatusNotificationEnabled $Source.DeliveryStatusNotificationEnabled -DomainSecureEnabled $Source.DomainSecureEnabled -EightBitMimeEnabled $Source.EightBitMimeEnabled -EnableAuthGSSAPI $Source.EnableAuthGSSAPI -Enabled $Source.Enabled -EnhancedStatusCodesEnabled $Source.EnhancedStatusCodesEnabled -ExtendedProtectionPolicy $Source.ExtendedProtectionPolicy -Fqdn $Source.Fqdn -LongAddressesEnabled $Source.LongAddressesEnabled -MaxAcknowledgementDelay $Source.MaxAcknowledgementDelay -MaxHeaderSize $Source.MaxHeaderSize -MaxHopCount $Source.MaxHopCount -MaxInboundConnection $Source.MaxInboundConnection -MaxInboundConnectionPercentagePerSource $Source.MaxInboundConnectionPercentagePerSource -MaxInboundConnectionPerSource $Source.MaxInboundConnectionPerSource -MaxLocalHopCount $Source.MaxLocalHopCount -MaxLogonFailures $Source.MaxLogonFailures -MaxMessageSize $Source.MaxMessageSize -MaxProtocolErrors $Source.MaxProtocolErrors -MaxRecipientsPerMessage $Source.MaxRecipientsPerMessage -MessageRateLimit $Source.MessageRateLimit -MessageRateSource $Source.MessageRateSource -PermissionGroups $Source.PermissionGroups -PipeliningEnabled $Source.PipeliningEnabled -ProtocolLoggingLevel $Source.ProtocolLoggingLevel -RequireEHLODomain $Source.RequireEHLODomain -RequireTLS $Source.RequireTLS -ServiceDiscoveryFqdn $Source.ServiceDiscoveryFqdn -SizeEnabled $Source.SizeEnabled -SuppressXAnonymousTls $Source.SuppressXAnonymousTls -TarpitInterval $Source.TarpitInterval -TlsDomainCapabilities $Source.TlsDomainCapabilities -TransportRole $Source.TransportRole
# Next we need to copy over all of the explicitly created permissions
$ConnectorPermissions = Get-ReceiveConnector -Identity $SourceConnector | Get-ADPermission | where {$_.IsInherited -eq $false}
$ConnectorPermissions | foreach {
 Get-ReceiveConnector "$($DestinationServer)\$($Source.Name)" | Add-ADPermission -DomainController $DomainController -User $_.User -Deny:$_.Deny -AccessRights $_.AccessRights -ExtendedRights $_.ExtendedRights
}

And as a bonus, here’s a script for just copying over the permissions configured on a connector, in case you want to roll your own connector but don’t want to spend the time redefining all of the permissions. Usage is not quite the same as above, as you just specify a source and a destination connector.

Copy-ReceiveConnectorPermissions -SourceConnector "EXCHANGE\Alternate Receive Connector" -DestinationConnector "NEWEXCHANGE\New Receive Connector"
<#
.SYNOPSIS
Copy-ReceiveConnectorPermissions - Copies the permissions from the source connector to the destination connector
.DESCRIPTION
Takes the source receive connector, retrieves all of the explicitly defined permissions, then applies them to the destination receive connector
.PARAMETER SourceConnector
Identity of the source receive connector
.PARAMETER DestinationConnector
Identity of the destination receive connector
.EXAMPLE
Copy-ReceiveConnectorPermissions -SourceConnector "EXCHANGE\Alternate Receive Connector" -DestinationConnector "NEWEXCHANGE\New Receive Connector"
#>
[CmdletBinding()]
param(
[Parameter(Mandatory=$True)][string]$SourceConnector,
[Parameter(Mandatory=$True)][string]$DestinationConnector
)
Import-Module ActiveDirectory
# We need to copy over all of the explicitly created permissions
$ConnectorPermissions = Get-ReceiveConnector -Identity $SourceConnector | Get-ADPermission | where {$_.IsInherited -eq $false}
$ConnectorPermissions | foreach {
 # No -DomainController here; this script does not take one as a parameter
 Get-ReceiveConnector "$($DestinationConnector)" | Add-ADPermission -User $_.User -Deny:$_.Deny -AccessRights $_.AccessRights -ExtendedRights $_.ExtendedRights
}

A Mass Contact Conversion Experience

This may save you a fair bit of trouble in the future. I was working with a client whose Exchange 2007 server was overrun with contacts. That’s not necessarily a problem in and of itself, but these contacts had served their purpose and now needed to be turned into mail-enabled users. They already had a number of duplicates between AD users and contacts, and it was causing a lot of trouble in their SharePoint hierarchy. Fortunately PowerShell can step in and save the day! Or would you really enjoy manually copying over all of a contact’s profile data to an AD user? Not my idea of fun.

Here’s the code; you’ll want to tweak it to your purposes, of course. I wanted some clear on-screen results as it progressed, and also an HTML report that I could reference afterwards. This marked my first venture into LINQ, with moderate success. I will have to revisit things later to get it working exactly how I want.

<#
.SYNOPSIS
Reads in a CSV of contacts or mail enabled users and moves the contacts to mail enabled users or updates
the current mail enabled users

.DESCRIPTION
Convert-CSVContactsToUsers reads in the CSV specified in the command line. This CSV contains a list
of users that either currently have a contact which needs to be converted to a mail enabled
user, have both a contact and a user that is not mail enabled which needs the info from the contact
copied over to the user which is then mail enabled, or have neither a contact or a user and need
to have a mail enabled user created for them.

.PARAMETER FilePath
The path and filename to the CSV containing the required users.

.PARAMETER DomainController
The NetBIOS or FQDN for the target DC to use.

.PARAMETER ResultsFile
The filename to save the HTML formatted results to.

.EXAMPLE
Convert-CSVContactstoUsers -FilePath .\UserList.csv -DomainController dc01 -ResultsFile .\Results.html

.NOTES
Requires Server 2008 R2 and should be run from the EMS.
.NET 3.5 and above required for HTML output.
Appropriate domain and Exchange rights are required.

Revision History
v1.0 - Initial release
v1.1 - Added X500 address, moved status report
#>

[CmdletBinding()]
param(
[Parameter(Mandatory=$True)][string]$FilePath,
[Parameter(Mandatory=$True)][string]$DomainController,
[Parameter(Mandatory=$False)][string]$ResultsFile
)

Import-Module ActiveDirectory

#$ErrorActionPreference= 'silentlycontinue'
$CSVList = Import-CSV -Path "$($FilePath)"
$Password = ConvertTo-SecureString -String "expPassword!!" -AsPlainText -Force

# Create a custom table for storing the results
$ResultsTable = New-Object System.Data.DataTable "Conversion Results"
$column01 = New-Object System.Data.DataColumn User
$column02 = New-Object System.Data.DataColumn Result

$ResultsTable.columns.add($column01)
$ResultsTable.columns.add($column02)

# Loop through each CSV entry, check for object existence
# then process the object based on existence and type
foreach($TargetUser in $CSVList)
{
# Check for existence of any objects
$ADContact = Get-ADObject -LdapFilter "(&(CN=$($TargetUser.Name))(ObjectClass=contact))" -SearchScope Subtree -Properties * -Server $DomainController
$ADUser = Get-ADObject -LdapFilter "(&(CN=$($TargetUser.Name))(ObjectClass=user))" -SearchScope Subtree -Properties * -Server $DomainController
$ResultsRow = $ResultsTable.NewRow()
$ResultsRow.User = $TargetUser.Name
[string]$Status = $null

# If both contact and user exist, copy the info from the contact
# into the user's properties, remove the contact, then mail enable the user
if($ADContact -and $ADUser)
{
# First copy over any of the current profile details

if($ADContact.directReports)
{
$ADUser.directReports = $ADContact.directReports
}
if($ADContact.homePhone)
{
$ADUser.homePhone = $ADContact.homePhone
}
if($ADContact.facsimileTelephoneNumber)
{
$ADUser.facsimileTelephoneNumber = $ADContact.facsimileTelephoneNumber
}
if($ADContact.l)
{
$ADUser.l = $ADContact.l
}
if($ADContact.manager)
{
$ADUser.manager = $ADContact.manager
}
if($ADContact.mobile)
{
$ADUser.mobile = $ADContact.mobile
}
if($ADContact.physicalDeliveryOfficeName)
{
$ADUser.physicalDeliveryOfficeName = $ADContact.physicalDeliveryOfficeName
}
if($ADContact.postalCode)
{
$ADUser.postalCode = $ADContact.postalCode
}
if($ADContact.sn)
{
$ADUser.sn = $ADContact.sn
}
if($ADContact.st)
{
$ADUser.st = $ADContact.st
}
if($ADContact.streetAddress)
{
$ADUser.streetAddress = $ADContact.streetAddress
}
if($ADContact.telephoneNumber)
{
$ADUser.telephoneNumber = $ADContact.telephoneNumber
}
if($ADContact.title)
{
$ADUser.title = $ADContact.title
}
if($ADContact.department)
{
$ADUser.department = $ADContact.department
}
if($ADContact.Description)
{
$ADUser.Description = $ADContact.Description
}
if($ADContact.c)
{
$ADUser.c = $ADContact.c
}
if($ADContact.co)
{
$ADUser.co = $ADContact.co
}
if($ADContact.countryCode)
{
$ADUser.countryCode = $ADContact.countryCode
}
if($ADContact.info)
{
$ADUser.info = $ADContact.info
}
if($ADContact.initials)
{
$ADUser.initials = $ADContact.initials
}
if($ADContact.ipPhone)
{
$ADUser.ipPhone = $ADContact.ipPhone
}
if($ADContact.pager)
{
$ADUser.pager = $ADContact.pager
}
if($ADContact.wWWHomePage)
{
$ADUser.wWWHomePage = $ADContact.wWWHomePage
}
if($ADContact.postOfficeBox)
{
$ADUser.postOfficeBox = $ADContact.postOfficeBox
}
if($ADContact.company)
{
$ADUser.company = $ADContact.company
}

# Update the user with the current info
Set-ADObject -Instance $ADUser -Server $DomainController

# Loop through the groups and add the user to them
foreach($ADGroup in $ADContact.memberOf)
{
Add-ADGroupMember -Identity "$($ADGroup)" -Members $ADUser -Server $DomainController
}

# Next, remove the contact
Remove-MailContact -Identity $ADContact.DistinguishedName -Confirm:$false -DomainController $DomainController

# Enable the current user, then copy over the remaining attributes
Enable-MailUser -Identity $ADUser.DistinguishedName -ExternalEmailAddress $ADContact.mail -Alias $ADContact.mailNickname -DomainController $DomainController | Out-Null

# Add the X500 address if the contact was stamped with a legacyExchangeDN
if($ADContact.legacyExchangeDN)
{
$X500User = Get-MailUser -Identity $ADUser.DistinguishedName -DomainController $DomainController
$X500User.EmailAddresses += [Microsoft.Exchange.Data.CustomProxyAddress]("X500:$($ADContact.legacyExchangeDN)")

Set-MailUser -Identity $X500User -EmailAddresses $X500User.EmailAddresses -DomainController $DomainController
}

Write-Host "$($TargetUser.Name) converted."
$Status = "Success"
}
elseif($ADContact)
{
# First remove the contact
Remove-MailContact -Identity $ADContact.DistinguishedName -Confirm:$false -DomainController $DomainController

# Create the mail enabled user
New-MailUser -Name "$($TargetUser.Name)" -ExternalEmailAddress $ADContact.mail -Alias $ADContact.mailNickname -UserPrincipalName "$($ADContact.mailNickname)@domain.local" -Password $Password -ResetPasswordOnNextLogon:$true -DomainController $DomainController | Out-Null

# Then copy in the attributes
$ADUser = Get-ADObject -LdapFilter "(&(CN=$($TargetUser.Name))(ObjectClass=user))" -SearchScope Subtree -Properties * -Server $DomainController
$ADUser.directReports = $ADContact.directReports
$ADUser.homePhone = $ADContact.homePhone
$ADUser.facsimileTelephoneNumber = $ADContact.facsimileTelephoneNumber
$ADUser.l = $ADContact.l
$ADUser.manager = $ADContact.manager
$ADUser.mobile = $ADContact.mobile
$ADUser.physicalDeliveryOfficeName = $ADContact.physicalDeliveryOfficeName
$ADUser.postalCode = $ADContact.postalCode
$ADUser.sn = $ADContact.sn
$ADUser.st = $ADContact.st
$ADUser.streetAddress = $ADContact.streetAddress
$ADUser.telephoneNumber = $ADContact.telephoneNumber
$ADUser.title = $ADContact.title
$ADUser.department = $ADContact.department
$ADUser.Description = $ADContact.Description
$ADUser.c = $ADContact.c
$ADUser.co = $ADContact.co
$ADUser.countryCode = $ADContact.countryCode
$ADUser.info = $ADContact.info
$ADUser.initials = $ADContact.initials
$ADUser.ipPhone = $ADContact.ipPhone
$ADUser.pager = $ADContact.pager
$ADUser.wWWHomePage = $ADContact.wWWHomePage
$ADUser.postOfficeBox = $ADContact.postOfficeBox
$ADUser.company = $ADContact.company
# Copying over the X500 address if it exists
if($ADContact.legacyExchangeDN)
{
$X500User = Get-MailUser -Identity $ADUser.DistinguishedName -DomainController $DomainController
$X500User.EmailAddresses += [Microsoft.Exchange.Data.CustomProxyAddress]("X500:$($ADContact.legacyExchangeDN)")

Set-MailUser -Identity $X500User -EmailAddresses $X500User.EmailAddresses -DomainController $DomainController
}

Set-ADObject -Instance $ADUser -Server $DomainController

# Loop through the groups and add the user to them
foreach($ADGroup in $ADContact.memberOf)
{
Add-ADGroupMember -Identity "$($ADGroup)" -Members $ADUser -Server $DomainController
}

Write-Host -ForegroundColor Yellow "$($TargetUser.Name) created."
$Status = "Success"
}
elseif($ADUser)
{
# Only a user is found
Write-Host -ForegroundColor Cyan "$($TargetUser.Name) already exists."
$Status = "Exists"
}
else
{
Write-Host -ForegroundColor Magenta "$($TargetUser.Name) not found!"
$Status = "Failed"
}

# Update the results
$ResultsRow.Result = $Status

# Clear the variables to prevent false positives on the next loop
$ADContact = $null
$ADUser = $null
$Status = $null

# Update the results table
$ResultsTable.Rows.Add($ResultsRow)
}

# Check if HTML results should be written out
if($ResultsFile)
{
# Build the style sheet for the page
$Style = "<style>"
$Style = $Style + "body{font-family: `"Century Gothic`"; font-size: 10pt;}"
$Style = $Style + "table{border-width: 1px; border-style: solid; border-color: black; border-collapse: collapse; }"
$Style = $Style + "th{border-width: 1px; border-style: solid; border-color: black; background-color: #CBFEFF; }"
$Style = $Style + "td{border-width: 1px; border-style: solid; border-color: black; text-align: center}"
$Style = $Style + "</style>"

# LINQ will be used for easier custom formatting
Add-Type -AssemblyName System.Xml.Linq

# Convert the desired columns into HTML and convert it to XML
$xml = [System.Xml.Linq.XDocument]::Parse("$($ResultsTable | Select User,Result | ConvertTo-Html -Head $Style)")

# Define the namespace
if($Namespace = $xml.Root.Attribute("xmlns").Value)
{
$Namespace = "{{{0}}}" -f $Namespace
}

#
# $xml.Descendants().Value is not returning the values for some reason here
# Have to resort to alternate index discovery
#
#$wsIndex = [Array]::IndexOf($xml.Descendants("${Namespace}th").Value, "Result")
[Array]$xmlArray = $xml.Descendants("${Namespace}th")
for($i=0;$i -le $xmlArray.length-1;$i++)
{
#We're color coding the Results column
if($xmlArray[$i] -match "Result")
{
$wsIndex = $i
}
}

# Loop through each row and assign a text color and background color to the result
foreach($row in $xml.Descendants("${Namespace}tr"))
{

switch(@($row.Descendants("${Namespace}td"))[$wsIndex])
{
{"Success" -eq $_.Value} {$_.SetAttributeValue("style", "color: #006600; background-color: #F1FEFE;"); continue}
{"Failed" -eq $_.Value} {$_.SetAttributeValue("style", "color: #660000; background-color: #F1FEFE;"); continue }
{"Exists" -eq $_.Value} {$_.SetAttributeValue("style", "color: #4D6600; background-color: #F1FEFE;"); continue }
}
}

# Save to the desired file
$xml.Save("$pwd/$($ResultsFile)")
}

Use safely!

Save those service accounts!

Have you ever woken up at 2:33 am to the phone ringing off the hook because some service is no longer running and it must be fixed right now? And the reason it stopped is that someone reset the password on a service account last week? I’ve seen this scenario play out far more often than it ever should. Let’s nip the problem of service accounts running wild right away.

<#
.SYNOPSIS
Checks all servers available in the domain for services running under service
accounts.
.DESCRIPTION
Find-AllServiceAccounts grabs a list of all servers from the current domain and
uses WMI calls to retrieve a list of services from the servers. It then loops
through the services checking for service accounts and writes any service
accounts found to a CSV specifically for that server.
.NOTES
You will be prompted for credentials when first running this.
#>
Import-Module ActiveDirectory
# Default system accounts to exclude
$ExcludedServices = "NT AUTHORITY\LocalService", "LocalSystem", "NT AUTHORITY\NETWORK SERVICE", "NT AUTHORITY\NetworkService"

# Will pop up a prompt to provide credentials for making WMI calls
# to the remote systems
$Credentials = Get-Credential

# Uses the LDAP search results for computers
$HostList = Get-ADObject -SearchBase (Get-ADRootDSE).defaultNamingContext -LDAPFilter "(&(objectCategory=computer)(|(operatingSystem=Windows Server*)(operatingSystem=Windows 2000 Server)))" -Properties Name
$HostName = Get-Content env:computername

# Pass a service name in and check to see if it matches the excluded
# service accounts that you've preconfigured
function Check-ServiceName($ThisServiceName)
{
<#
.SYNOPSIS
Check-ServiceName verifies if a service is not a default service account
.DESCRIPTION
Check-ServiceName checks the service account for the specified service against
the list of excluded services from the $ExcludedServices variable. Returns $true
or $false.
.PARAMETER ThisServiceName
The service account to be checked against.
#>
  foreach($ExcludedName in $ExcludedServices)
  {
     if($ThisServiceName -eq $ExcludedName.ToLower())
     {
       return $true
     }
  }
  return $false
}

[array]$BadHostList = $null
# Our main loop, iterates through each system listed in
# the file and pulls all of their services
foreach($HostServer in $HostList)
{
  # Grab the list of services from the system
  if ($HostServer.Name -ne $HostName)
  {
    $ServiceList = get-WMIObject -Class win32_service -ComputerName $HostServer.name -Credential $Credentials -Property name,startname
  }
  # If it is the localhost then we need to exclude the credentials parameter
  else
  {
    $ServiceList = get-WMIObject -Class win32_service -ComputerName $HostServer.name -Property name,startname
  }

  # If the host could be contacted, loop through and check each service
  if($ServiceList)
  {
    [array]$CSVArray = $null

    foreach($ThisService in $ServiceList)
    {
      $CheckResult = Check-ServiceName($ThisService.StartName.ToLower())
      if ($CheckResult -eq $false)
      {
        Write-Host "Service: " $ThisService.name
        Write-Host "Account: " $ThisService.StartName
        $CSVArray += ,$ThisService | select Name,StartName    
      }
    }

    if ($CSVArray.count -gt 0)
    {
      Write-Host "Saving .csv for $($HostServer.Name)"
      Write-Host ""
      $CSVArray | Export-CSV "$($HostServer.Name).csv"
      $CSVCount++
    }
  }
  # The host could not be contacted, note it down
  else
  {
    Write-Host "$($HostServer.name) is unreachable."
    $BadHostList += $HostServer
  }
}

if($CSVCount -gt 1)
{
  Write-Host "Done, wrote out $($CSVCount) .csv files."
}
elseif($CSVCount -eq 1)
{
  Write-Host "Done, wrote out $($CSVCount) .csv file."
}
else
{
  Write-Host "No .csv files written out, no service accounts found."
}

if($BadHostList)
{
  foreach($TargetHost in $BadHostList)
  {
    Write-Host "$($TargetHost.Name) at $($TargetHost.DistinguishedName) was unreachable."
  }
}

What this does for you is run through all of your servers (assuming they respond to WMI queries, so check your firewall policies) and, for any server running services under service accounts, dump out a CSV reporting which services those are and which accounts they use. This will save you a lot of frustration when tracking down who will be impacted by a service account password reset, and it helps with security audits, password reset frenzies and plain old documentation.
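For the documentation and audit angle, you may want all of those per-server CSVs rolled up into one report. Here is a minimal sketch, assuming the CSVs from the script above are sitting in the current directory and that the combined file name is just a suggestion:

```powershell
# Merge the per-server service account CSVs into one combined report.
# Note: on a re-run, exclude the combined file itself from the merge.
$Combined = foreach($File in Get-ChildItem -Path $pwd -Filter "*.csv" |
    Where-Object { $_.Name -ne "AllServiceAccounts.csv" })
{
    # Tag each row with the server it came from (taken from the file name)
    Import-Csv -Path $File.FullName |
        Select-Object @{Name="Server";Expression={$File.BaseName}}, Name, StartName
}
$Combined | Export-Csv -Path "AllServiceAccounts.csv" -NoTypeInformation

# Quick pivot: which service accounts are in use, and on how many services
$Combined | Group-Object StartName | Sort-Object Count -Descending |
    Select-Object Name, Count
```

The Group-Object pivot at the end is handy when you need to answer "what breaks if I reset this account's password" in one glance.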

Speaking of security audits and password resets, in an upcoming post I’ll show you how to easily update all of your service account passwords so that you can spend the rest of the day playing Animal Crossing rather than remoting from server to server to server …

Did this post help you out? Do you have any questions or a specific topic you’d like me to delve into? Just let me know, thanks!

Easy Ways to find your Mail Enabled Public Folders

It looks like some of you are wanting to easily find your mail enabled public folders. Definitely something good to know, especially when you are planning out a migration or are being thrown into a new and probably fragile environment. The documentation is never up to date of course so you have to dig it out yourself. Here comes PowerShell to rescue you from manually slogging through it all!

$Servers = Get-PublicFolderDatabase
foreach($TargetServer in $Servers)
{
    Get-PublicFolder -Recurse -Server $TargetServer.Server | where {$_.MailEnabled -eq $true}
}

This will grab all of the mail enabled public folders from all of the servers in your organization. But you may need to dig in further, say into the AD objects themselves, to fix things like missing homeMDB entries or other woes, or even just for plain documentation. This one-liner will do it for you.

Get-ADObject -SearchBase ( "CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext) -SearchScope OneLevel -Filter { (mail -like "*") -and (ObjectClass -eq "publicFolder")} | select Name
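And if you want that documentation written down somewhere, the same query pipes straight out to a CSV. A quick sketch (the output file name is just a suggestion; -Properties mail is needed so the mail attribute is actually returned):

```powershell
Get-ADObject -SearchBase ("CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext) -SearchScope OneLevel -Filter { (mail -like "*") -and (ObjectClass -eq "publicFolder")} -Properties mail |
    Select-Object Name, mail |
    Export-Csv -Path "MailEnabledPublicFolders.csv" -NoTypeInformation
```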

I hope this helps out a few of you out there.

5 Extremely Useful Tools to Simplify your Microsoft Exchange Life

Here’s what I find myself using in my day to day life in working with Exchange. If it weren’t for these then troubleshooting and automation would be a lot more difficult and I would find myself throwing my life away. Why waste time when you could be watching E3 game trailers instead?

  1. PowerShell! This one definitely takes the first place spot, not that I’m listing these in any particular order. If it weren’t for PowerShell, managing even a single Exchange server would be much more tiresome. Just look at mass creating new users: HR sends you an Excel sheet with all of their details, you save the pertinent bits out to a CSV, then run it through a little PowerShell script
    $NewUsers = Import-Csv -Path C:\Import\UserList.csv
    foreach($NewUser in $NewUsers)
    {
        New-Mailbox -Name $NewUser.Name -Password (ConvertTo-SecureString $NewUser.Password -AsPlainText -Force) -UserPrincipalName $NewUser.UPN -Alias $NewUser.Alias -Database $NewUser.TargetDatabase -DisplayName ($NewUser.FirstName + " " + $NewUser.LastName) -FirstName $NewUser.FirstName -LastName $NewUser.LastName -OrganizationalUnit $NewUser.OU -Phone $NewUser.Phone -ResetPasswordOnNextLogon:$true
    }

    Tada, creating new users has been simplified from hours of manual labor to a few minutes of CSV formatting and scripted importing.

  2. mxtoolbox.com This is a site that gets used often in troubleshooting. I can quickly check on the MX records for any domain including my own, run through a list of BLs to see if my domain is listed, and very importantly run diagnostics on my mail server’s external facing connectors to see what errors may come up. This is where I turn if I don’t have a way to telnet in from the outside. There are a number of other useful tools there as well though they don’t receive as much use as the BL and diagnostics.
  3. testexchangeconnectivity.com A very important site if you are running Exchange migrations. The ActiveSync and Outlook Anywhere tests help greatly for verifying all of your autodiscover functionality and your other CAS services. The Lync tests are great as well if you have that as part of your organization. It is so much easier than having to call up some external user and have them test such and such functionality, again.
  4. migrationwiz.com Speaking of Exchange migrations, I’ve found this particular service to be extremely useful when working with migrations that are not Exchange to Exchange. MigrationWiz will login to the target service and transfer mailboxes to your Exchange server. It is as simple as that. No annoying process of exporting and importing PSTs. Furthermore, since incremental transfers are available, you can spread out your migration without needing to do an overnight cutover.
  5. telnet and nslookup Since these are small I figured I could combine them into one entry. These are also some extremely vital tools in your Microsoft Exchange Swiss Army knife. I’ve already talked about the usefulness of telnet in previous posts so I won’t bore you all over again. Nslookup is a fantastic way to quickly verify records, such as comparing your internal DNS records to your external ones. Run nslookup autodiscover.contoso.com and nslookup autodiscover.contoso.com 8.8.8.8. The first will check what it is resolving to internally while the second will query Google’s public DNS for the external autodiscover record. MX record lookups are simple as well: nslookup -type=mx contoso.com 8.8.8.8.
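Circling back to the first item on the list, the New-Mailbox loop expects a CSV whose headers match the properties the script reads. Something along these lines (the column names are taken from the script above; all of the values here are made up):

```csv
Name,Password,UPN,Alias,TargetDatabase,FirstName,LastName,OU,Phone
"Jane Doe",P@ssw0rd!,jdoe@contoso.com,jdoe,MBXDB01,Jane,Doe,"contoso.com/Staff",555-0100
```

One header per parameter the script reads makes it trivial for HR to fill in, and Import-Csv takes care of the rest.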

Any favorite tools that you find yourself using over and over for your Exchange servers or migrations? Please let me know in the comments.

Mail Queuing for Mail Enabled Public Folders?

Feeling pretty good about yourself you come into the office and sit down to get some work done. After all, someone has to retrieve that Amulet of Yendor so it might as well be you, right? Unfortunately it doesn’t look like today will be your day. The warnings are piling up that your mail queue is getting rather large and some users have been asking where their daily messages in their public folders are. Taking a peek at the queue you see a large and growing number of emails in your Unreachable Domain queue. But your public folder database looks like it is mounted OK. Not cool.

What broke?

This is a fairly common scenario I’ve run into after migrating off of Exchange 2003. Your public folders migrated over successfully and mail has been flowing for a while, but as soon as you take down the 2003 server, mail starts queuing up for your mail enabled public folders. Or maybe you went in and started doing some manual cleanup with ADSI Edit. Sometimes even just the uninstall of Exchange 2003 has unexpected side effects. You remembered to do a backup of your AD prior to that major change, right? There’s a good chance that your public folder hierarchy is missing.

Great, so can we fix this?

The good news is that there is a road to recovery. Let’s check on things first, is your public folder hierarchy actually missing? Pop open the good old Exchange Management Shell and let’s check on a few things.

Import-Module ActiveDirectory
$SearchPath = "CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Public Folders"}

Hopefully you will get a result such as below.

DistinguishedName             Name                          ObjectClass                   ObjectGUID
-----------------             ----                          -----------                   ----------
CN=Public Folders,CN=Folde…   Public Folders                msExchPFTree                  f6a3cbd4-10e5-452d-9abe-44…

If you get a "directory object not found" error, then your public folder hierarchy is missing and we’ll have to recreate it. That’s step one on our way to saving the day. But let’s step back one further and make sure the Folder Hierarchies container itself is there.

$SearchPath = "CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Folder Hierarchies"}

If you get no results then that is missing as well. If you do then at least our container is there and we just need to create the hierarchy. Here’s the bit of PowerShell code that will fix up the missing public folders hierarchy.

<#
.SYNOPSIS
Recreates your public folders hierarchy
.DESCRIPTION
Checks your AD for whether the Folder Hierarchies container exists and the 
Public Folders hierarchy. If one does not exist then it is created.
#>
Import-Module ActiveDirectory

# Build path to the container
$SearchPath = "CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$PFContainer = Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Folder Hierarchies"}

# If it does not exist then create the container
if(!$PFContainer)
{
    New-ADObject -Name "Folder Hierarchies" -Type msExchPublicFolderTreeContainer -Path $SearchPath
    Write-Host "Folder Hierarchies container created."
}
else
{
    Write-Host "Folder Hierarchies container exists already." -ForeGroundColor Yellow
}

# Build path for the public folder tree
$SearchPath = "CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$PFHierarchy = Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {CN -eq "Public Folders"}

# If it does not exist then create it
if(!$PFHierarchy)
{
    New-ADObject -Name "Public Folders" -Type msExchPFTree -Path $SearchPath -OtherAttributes @{msExchPFTreeType="1"}
    Write-Host "Public Folders hierarchy created."
}
else
{
    Write-Host "Public Folders hierarchy already exists." -ForeGroundColor Yellow
}

# Set to our PF hierarchy DN
$PFHierarchy = "CN=Public Folders," + $SearchPath

# DN for our databases
$SearchPath = "CN=Databases,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$PFDatabases = Get-ADObject -SearchBase $SearchPath -SearchScope OneLevel -Filter {objectClass -eq "msExchPublicMDB"} -Properties msExchOwningPFTree

# Grab all of the public folder databases and loop through them
if($PFDatabases)
{
    foreach($PFDatabase in $PFDatabases)
    {
        $PFDatabase.msExchOwningPFTree = $PFHierarchy
        Set-ADObject -Instance $PFDatabase
        Write-Host "Fixed database $($PFDatabase.Name)"
    }
}
# Or if no public folder databases exist you have further problems ...
else
{
    Write-Host "No Public Folder Databases found." -ForeGroundColor Yellow
}

But you’ll find that your work is not quite done yet: your public folders are likely missing their homeMDB attribute. Or this could have been your problem all along, without any need to recreate the public folder hierarchy. You can verify this with a quick search:

$PFPath = "CN=Public Folders,CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
$SearchBase = "CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext 
Get-ADObject -SearchBase $SearchBase -SearchScope OneLevel -Filter { (homeMDB -notlike "*") -and (ObjectClass -eq "publicFolder")}

If you don’t see anything then you know that your mail enabled public folders are fine. But most likely you’ll get a few results. To quickly fix those up run through this script.

<#
.SYNOPSIS
Fix any missing homeMDB attributes on public folders.
.DESCRIPTION
The script runs an LDAP search for all mail enabled public folder objects and
sets the homeMDB attribute to the LDAP path to your public folder hierarchy.
.NOTES
The script needs to be run in all domains requiring the fix.
#>
Import-Module ActiveDirectory
# Build the DN to the public folders hierarchy
$PFPath = "CN=Public Folders,CN=Folder Hierarchies,CN=Exchange Administrative Group (FYDIBOHF23SPDLT),CN=Administrative Groups," + (Get-OrganizationConfig).DistinguishedName
# Build the DN to the public folder objects
$SearchBase = "CN=Microsoft Exchange System Objects," + (Get-ADRootDSE).rootDomainNamingContext
# Search for all PFs with a blank homeMDB
$TargetPFs = Get-ADObject -SearchBase $SearchBase -SearchScope OneLevel -Filter { (homeMDB -notlike "*") -and (ObjectClass -eq "publicFolder")} -Properties homeMDB

# Fix all of the public folders
if($TargetPFs)
{
  foreach($TargetPF in $TargetPFs)
  {
    Write-Host "Fixing $($TargetPF.Name)"
    $TargetPF.homeMDB = $PFPath
    Set-ADObject -Instance $TargetPF
  }
}
# Good news (maybe), no public folders that require fixing
else
{
  Write-Host "No public folders missing homeMDB."
}

Fantastic, are we done yet?

Nearly! Just give the mail queues a kick and you should see the emails quickly flushing out into your public folder databases. Now you can get back to your day of NetHack knowing that all is well with the world once again.
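If the queues need that kick, a nudge from the Exchange Management Shell does it. A minimal sketch using the Retry-Queue cmdlet (the server name below is hypothetical):

```powershell
# Force an immediate retry of any queues currently sitting in a Retry state
Retry-Queue -Filter {Status -eq "Retry"}

# Or resubmit everything stuck in the Unreachable queue on a specific server
Retry-Queue -Identity "EXCH01\Unreachable" -Resubmit $true
```

The -Resubmit switch sends the stuck messages back through categorization, which is what you want after the routing destination has just come back into existence.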

The Welcome News of the Death of SBS


SBS wanders off into the black abyss

Somehow I missed this bit of news last year: The Death of the Small Business Server. Possibly because I haven’t had to deal with SBS in a serious manner for a while now. Anyhow, from the point of view of an Exchange administrator, it is welcome news that SBS 2011 is the end of the line for SBS. I have always found SBS to be a pain to work with. The migration wizards were prone to breaking in mysterious ways. The POP3 connector was horrid to troubleshoot and would choke on a seemingly normal email all too easily. The most annoying thing was being forced to use the wizards for most everything, and suffering the consequences if someone else did not. For small businesses there don’t seem to be any genuine alternatives, unfortunately. Several Linux based alternatives are presented over at The VAR Guy, but I don’t see a full Linux solution being a comfortable route, at least for most of the small businesses I’ve worked with in the past. There are a few odd ones out that embrace any alternative to Microsoft, of course. Windows Server 2012 Essentials with the Zimbra Collaboration Server virtual appliance may be a better compromise for MSPs that are reluctant to touch a full Linux alternative for an on-site Exchange substitute. As for me, I’ll just be happy that I won’t have to worry about Exchange 2013 shoehorned into an SBS.

452 4.3.1 Insufficient System Resources – Continued Telnet Training

This is a problem that crops up fairly often if you have a lot of disparate Exchange servers out there without a solid monitoring solution in place, which is very common for MSPs. Or without anybody actually paying attention to those monitoring alerts. Nobody likes paying attention to monitoring alerts; there are reams of rules dedicated to keeping them out of sight in Outlook clients around the world, but that makes for an entirely separate topic/rant. The symptoms of this problem are that you’ll be getting reports from end users that they don’t seem to be receiving any email, or at least any external email, while sending email out works just fine.

This is the point where a quick telnet test will zero you in on what is going on. Continuing with what you learned from the post on Essential Exchange Troubleshooting – Send Email via Telnet, you will want to telnet into the server from outside the organization. You may immediately get a response of:

452 4.3.1 Insufficient System Resources

But more likely you’ll receive a typical SMTP banner such as

220 myserver.contoso.com Microsoft ESMTP MAIL Service ready at Mon, 27 May 2013 08:19:44 -0700

If so, then I recommend that you continue through with sending in an email via telnet. The next likely place that you’ll encounter this error is when you issue the RCPT TO: command, to which you receive a response of

452 4.3.1 Insufficient System Resources

The fix for this is fairly simple. Check your Exchange server for low disk space on the partition where your queues reside, which will most likely be the partition with your Exchange installation. In single server Exchange 2007/2010 installations, I find that what has most often eaten all of your space is the IIS log files. When setting up your Exchange server, it is a good idea to have an archiving/recycling policy in place for the IIS logs to keep them from swallowing the entire partition over time. BES installations have the same problem, with log files swallowing the drive.

The key phrase that you’ll want to keep in mind with this is “back pressure.” In a later post I’ll delve into this term.

More to the topic at hand, here’s an extra PowerShell fix to keep those IIS log files under control. It can also be easily customized for BES logs or other logging-happy programs, or even just for keeping your temp files cleaned up regularly. You’ll want to set it to run as a scheduled task on a daily, weekly or monthly basis, depending upon your organization’s policies.

# CleanIISLogs.ps1
# Find and remove files older than $days
# Set $LogPath to where the IIS logs you want to recycle are kept
# 

$days = 31
$LogPath = "C:\inetpub\logs\LogFiles\W3SVC1"
# Find the target date
$startdate = Get-Date
$startdate = $startdate.AddDays(-$days)

# Clean the directory of log files older than the target date
Get-ChildItem -Path "$($LogPath)" -Recurse | where {$_.LastWriteTime -lt $startdate} | Remove-Item -Confirm:$false
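To actually wire that up as a scheduled task, schtasks works on anything from Server 2003 on up. A sketch, where the script path and the weekly 3 am schedule are my assumptions and should be adjusted to your policies:

```shell
schtasks /Create /TN "Clean IIS Logs" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\CleanIISLogs.ps1" /SC WEEKLY /D SUN /ST 03:00 /RU SYSTEM
```

Running it as SYSTEM avoids tying the task to a user account whose password will someday be reset, which, given the rest of this post, seems only fitting.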

Is this post helpful to you or is there something you would like me to go into greater detail on? Please let me know, thanks.