Reliance on Google

Before Google, would I have been able to do my job effectively?

I’ve been thinking about this question for the past few weeks, after I saw it raised in a reddit post. If I were no longer able to use a search engine during daily work, what kind of impact would that have on my ability to perform my responsibilities?

The most immediate impact that comes to mind is losing the ability to quickly look up syntax corrections and small code snippets/examples. I would be forced to be more organized in keeping my own examples and notes, and would likely purchase additional resources to reference. In the scope of PowerShell it wouldn’t be as bad, as the help system built into PowerShell is excellent.
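As a quick illustration of that built-in help, once the help content has been downloaded you can work entirely offline:

# Download or refresh local help content while you still have connectivity
Update-Help -Force -ErrorAction SilentlyContinue

# Full parameter documentation, usage examples, and concept topics, no search engine required
Get-Help Get-ChildItem -Full
Get-Help Get-ChildItem -Examples
Get-Help about_Splatting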

I’ve built up enough experience and understanding of “how things work” that I would be able to succeed in my role, albeit certainly less efficiently.

Overall, I think I’d do alright without Google, because I’m curious by nature. I investigate and seek to understand the concepts behind the technology I work with, not just the 1-2-3 steps to put it in place. I look for inter-dependencies in these platforms, and build up a body of knowledge about how all the pieces fit together.

 

My thoughts turned to knowledge transfer, and how I can model some of the skills I have attained for my team members. I expect that 20 years ago there was a much higher reliance on mentorship and relationship-building to acquire the experience needed to deal with technology solutions, and I think the additional benefits of those relationships are largely lost on us in the workforce now.

How can I continue to exceed my own expectations, and help uplift my team members in their quest to do the same? Here are a few small ideas that have big implications:

  • Try, and sometimes fail
  • Exhaustively read documentation
  • Learn to dig for log files and packet captures
  • Write post-mortem documents for completed projects; what went right, what went wrong

 

Server 2016 VM freeze up

I recently deployed a couple of Server 2016 virtual machines in my environment, and have been having an issue with them freezing up after periods of inactivity. The symptoms: the VM is inaccessible on the network, locked up at the Hyper-V console (i.e. unresponsive to Ctrl+Alt+Del), and not responding to any shutdown commands (from Hyper-V or the command line).

I initially thought this might be an incompatibility with the hypervisor, as it is still Server 2012 R2, or perhaps a missing hotfix/update, but after some research this doesn’t seem to be the case.

Yesterday I finally hit on a lead from this discussion thread, which indicates the problem originates from the pagefile being placed on a separate VHDX.

It turns out this is exactly how I configure my VMs, so that if I ever decide to protect the VM or set up a Hyper-V Replica, I can exclude the pagefile disk.

The resolution for this particular issue is to:

  • Right click Start Menu
  • Choose “System”
  • Click “Advanced System Settings”
  • Under Startup and Recovery, click “Settings”
  • Change the Write debugging information dropdown to “None”

The implication of this setting is that if Windows crashes due to an unexpected failure, it will not create a memory dump file. More detail can be found here.
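If you’d rather script this change than click through the GUI, the same setting maps to the CrashDumpEnabled registry value; a minimal sketch (a value of 0 corresponds to “None”, and the VM needs a reboot for it to take effect):

# "Write debugging information: None" corresponds to CrashDumpEnabled = 0
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\CrashControl' -Name 'CrashDumpEnabled' -Value 0
# Reboot the VM for the change to take effect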

UltraVNC and DirectAccess

I’ve finally got UltraVNC working properly with a computer connected through DirectAccess. Here’s a very brief description of how it’s working.

First I deploy UltraVNC 1.2.1.2 through a group policy shutdown script.

UltraVNCInstall

This script uninstalls previous versions, installs the latest version, copies in the ultravnc.ini for standard settings and an ACL file for Active Directory group integration, and then overwrites winvnc.exe with the IPv6 version provided by the creator. Finally, it deletes the desktop icons and starts the service.
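For reference, a rough PowerShell sketch of that same sequence might look like the following; the share path, installer name, ACL file name, and uninstaller location are all assumptions, not my actual script:

# Hypothetical deployment share and install paths; adjust to your environment
$share  = '\\domain.local\NETLOGON\UltraVNC'
$target = 'C:\Program Files\uvnc bvba\UltraVNC'

# Uninstall any previous version (Inno Setup uninstaller), then install the new one silently
$uninstaller = Join-Path $target 'unins000.exe'
if (Test-Path $uninstaller) { Start-Process $uninstaller -ArgumentList '/VERYSILENT' -Wait }
Start-Process (Join-Path $share 'UltraVNC-Setup.exe') -ArgumentList '/VERYSILENT /norestart' -Wait

# Copy the standard settings and the Active Directory ACL file
Copy-Item (Join-Path $share 'ultravnc.ini') $target -Force
Copy-Item (Join-Path $share 'acl.txt') $target -Force

# Replace winvnc.exe with the IPv6-capable build so traffic can reach DirectAccess clients
Stop-Service uvnc_service -ErrorAction SilentlyContinue
Copy-Item (Join-Path $share 'winvnc_ipv6\winvnc.exe') $target -Force

# Clean up desktop icons and start the service
Remove-Item 'C:\Users\Public\Desktop\UltraVNC*.lnk' -ErrorAction SilentlyContinue
Start-Service uvnc_service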

The important part is that IPv6 executable, because otherwise the traffic will not pass through to the DirectAccess client.

The other key point is that the connection needs to be a reverse VNC connection, initiated from the server (user) side.

First, the Technician needs to start a VNCViewer listening session. The easiest way is to create a shortcut with this command target:

"c:\program files\uvnc bvba\UltraVNC\vncviewer.exe" -listen -dsmplugin SecureVNCPlugin64.dsm

This will place an icon in your task tray, and prepare your computer for receiving the request.

We deploy a shortcut to the Start Menu on all workstations that runs this command:

"c:\program files\uvnc bvba\UltraVNC\winvnc.exe" -connect

When clicking the shortcut, the user is presented with a window in which they can enter the Technician’s workstation name to connect.

With this method, no authentication is needed because the connection is initiated from the server side.
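As a side note, if you want to skip the dialog entirely, winvnc also accepts the target host on the command line (standard UltraVNC syntax; the hostname below is a placeholder, and 5500 is the viewer’s default listening port):

"c:\program files\uvnc bvba\UltraVNC\winvnc.exe" -connect TECH-PC01::5500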

 

Azure Automation – Distribution Group

I’ve been wanting to expand my skills and abilities within the Azure platform recently, and this week a great opportunity arose for this very thing.

We provide dynamic distribution groups within Office 365 for staff, based upon AD attributes populated by DirSync. The primary example here is our “SubCompany All Staff” list, which is created with this query filter:

New-DynamicDistributionGroup -alias SubCompanyAllStaff -Name "SubCompany All Staff" -RecipientFilter { ((RecipientType -eq 'UserMailbox') -and (Company -like 'SubCompanyName*')) }

Now we have executive staff who are transitioning from this SubCompany to a parent company, which causes an issue: their Company attribute is no longer “SubCompanyName”, but they wish to remain on the distribution list to receive communications. However, IT does not (and will not) maintain a manual membership list for this type of group. We could create an “exceptions” group, add these staff to it, mail-enable the group, and make sure it’s part of the query filter above, but then every time someone changes company we would have to remember to add them to the exception group.

A better way is to filter on the ProxyAddress attribute; anyone who has an @subcompany.com email should be part of the distribution group. Those who switch companies maintain legacy @subcompany.com addresses as aliases. For example:

Get-Recipient -Filter {EmailAddresses -like "*@subcompany.com*"}

However, while the Get-Recipient command works here, this type of filtering is NOT supported within a dynamic group recipient filter; it returns no results.
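For illustration, this is the sort of recipient filter that looks like it should work but leaves the dynamic group empty (the group name here is hypothetical):

# Not supported: EmailAddresses wildcard matching in a dynamic group recipient filter returns no members
New-DynamicDistributionGroup -Alias SubCompanyLegacy -Name "SubCompany Legacy" -RecipientFilter { (RecipientType -eq 'UserMailbox') -and (EmailAddresses -like '*@subcompany.com*') }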

This leads me to the solution: use an Azure Automation runbook to run the Get-Recipient command to populate a static group on a schedule.

The Query

First, I needed a functioning PowerShell query. I initially arrived at this:

# Add users matching criteria to the list
$users = Get-Recipient -Filter {EmailAddresses -like "*subcompany.com*" -and RecipientTypeDetailsValue -eq 'UserMailbox'}
foreach ($user in $users) { Add-DistributionGroupMember -Identity CompanyTestAll@company.onmicrosoft.com -Member $user.WindowsLiveID }

# Remove those who no longer have an alias
$users = Get-DistributionGroupMember -Identity "Company TestAll" | Get-Recipient -Filter {EmailAddresses -notlike "*@subcompany.com*"}
foreach ($user in $users) { Remove-DistributionGroupMember -Identity "Company TestAll" -Member $user.WindowsLiveID -Confirm:$false }

While this did perform what I was after, it was messy. I wanted it to run on a schedule, so the first ‘addition’ command would effectively error out on the majority of the group members each run. When I tried to run the second command as my service account in the runbook, it threw a large number of errors about the users not being found on the domain controller.

After a lot of back-and-forth testing, I ended up with this command, which works perfectly:

Update-DistributionGroupMember "Company TestAll" -Members (Get-Recipient -Filter {EmailAddresses -like "*subcompany.com*" -and RecipientTypeDetailsValue -eq 'UserMailbox'} | Select-Object -ExpandProperty Alias) -Confirm:$false

 

Automation

Having never looked at Azure Automation before, I started reading. This blog post was a helpful start, and I went through quite a bit of the docs.microsoft.com site reading about Runbooks, PowerShell vs. PowerShell Workflow, credential access, and so on.

I created an Automation account that is intended to be specific to Office 365 scripts, along with a corresponding Resource Group. Since this isn’t going to touch any other Azure resources, I did not create a RunAs account with it.
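I created both through the Azure portal, but for reference, the equivalent with the Az PowerShell module would be roughly the following (the names and region are placeholders, not my actual resources):

# Placeholder resource group and automation account names
New-AzResourceGroup -Name 'rg-o365-automation' -Location 'canadacentral'
New-AzAutomationAccount -ResourceGroupName 'rg-o365-automation' -Name 'aa-o365-scripts' -Location 'canadacentral'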

Next I began looking at the best way to connect to Exchange Online. I quickly found this module, which looked very promising. However, after importing it as an asset in the account, the “Set-ModuleCredential” example it gives wasn’t recognized as a valid cmdlet, and I couldn’t see it in the .psm1 file either. Since there were many other examples of connecting to Exchange, I moved on.

The next problem I found was that when I tried the standard remote PSSession approach to connect, testing the runbook would fail without any meaningful error message; just three attempts and then “Suspended”. Based on that behavior, I found this on the feedback portal for Azure, which describes exactly what was happening. The method provided by Rune Myrhaug works great:

# Pull credential from Automation assets
$credObject = Get-AutomationPSCredential -Name "svc_auto_azure"
 
# Connect to Exchange Online
$ExchangeOnlineSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/PowerShell-LiveID -Credential $credObject -Authentication Basic -AllowRedirection
Import-Module (Import-PSSession -Session $ExchangeOnlineSession -AllowClobber -DisableNameChecking) -Global

I created a service account in Active Directory, synchronized it to O365 with DirSync, and then used the Exchange Admin portal to ensure it had appropriate roles to manage Distribution Groups. I added this account as a credential asset, and created a daily schedule as well.
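Again, I did this in the portal, but the same assets can be created from PowerShell if you prefer; a sketch using the placeholder names from above:

# Store the synchronized service account as a credential asset
$cred = Get-Credential   # svc_auto_azure
New-AzAutomationCredential -ResourceGroupName 'rg-o365-automation' -AutomationAccountName 'aa-o365-scripts' -Name 'svc_auto_azure' -Value $cred

# Create a daily schedule starting tomorrow morning
New-AzAutomationSchedule -ResourceGroupName 'rg-o365-automation' -AutomationAccountName 'aa-o365-scripts' -Name 'Daily-0600' -StartTime (Get-Date).Date.AddDays(1).AddHours(6) -DayInterval 1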

The final step was to publish the runbook and associate it with my schedule, and now I have an automated quasi-dynamic distribution group!
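For reference, the complete runbook body ends up being just the connection block from earlier plus the membership update, roughly:

# Pull credential from Automation assets
$credObject = Get-AutomationPSCredential -Name "svc_auto_azure"

# Connect to Exchange Online
$ExchangeOnlineSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/PowerShell-LiveID -Credential $credObject -Authentication Basic -AllowRedirection
Import-Module (Import-PSSession -Session $ExchangeOnlineSession -AllowClobber -DisableNameChecking) -Global

# Rebuild the static group membership from current ProxyAddress data
Update-DistributionGroupMember "Company TestAll" -Members (Get-Recipient -Filter {EmailAddresses -like "*subcompany.com*" -and RecipientTypeDetailsValue -eq 'UserMailbox'} | Select-Object -ExpandProperty Alias) -Confirm:$false

# Clean up the remote session
Remove-PSSession $ExchangeOnlineSession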

I can foresee this potentially being useful for groups that are not built from attribute filters. Perhaps something like a SharePoint list populated by administrative staff or department heads, which is then consumed in the runbook and applied to a distribution group.


Sage 300 PayStub Self-Service PDF export

In April we deployed an add-on for our Sage 300 payroll software, which provides a self-service web interface for employees to access their pay stubs.

We recently discovered that when an employee is set to Inactive or terminated within Sage 300, their access to the portal stops working, which is normally a good thing. In our case, however, the employee had simply transferred to the parent company and still required access.

The self-service module is self-contained code which I can’t modify, so instead I sought out a method of extracting the PDF paystub from the database for a particular user.

Most of the information below was derived from this StackOverflow question and answer.

First, configure your SQL Server to allow command-line execution through xp_cmdshell:

EXEC sp_configure 'show advanced options', 1;  
GO  
RECONFIGURE;  
GO
EXEC sp_configure 'xp_cmdshell',1  
GO  
RECONFIGURE;  
GO

Next, you need to export a format file for the specific table we’re working with. I ran this command from my own client computer, which has SQL Server Management Studio installed:

bcp dbo.PAYROLLCHECK format nul -S servername -T -d COMDat -n -f formatfile.fmt

This produced a file which looked like the following:

12.0
5
1       SQLCHAR             0       12      ""   1     EMPLOYEEID                     SQL_Latin1_General_CP1_CI_AS
2       SQLDATETIME         0       8       ""   2     PERIODENDDATE                  ""
3       SQLINT              0       4       ""   3     ENTRYSEQ                       ""
4       SQLIMAGE            4       0       ""   4     PDF                            ""
5       SQLDECIMAL          1       19      ""   5     CHECK                          ""

I trimmed this file down so that it looked exactly like this:

10.0
1
1       SQLIMAGE            0       0       ""   1     PDF                            ""

Some things to note here:

  1. The first row needs to match the SQL native client version that your SQL server is running; it was 12.0 when I ran the command on my Windows 10 machine with SQL Server 2012 Management Studio, but my actual server is running SQL Server 2008 R2, so I changed it to 10.0
  2. The column with “PDF” needs to match the column name that contains your file data; some examples I found on the Internet weren’t clear on this

The last step was to build the query to do the export. It is important to note that the BCP command in this query runs from the server, so any local paths where you’re going to store the format file or output PDFs must be accessible on the SQL Server itself.

This query fills a temporary table with all rows matching a specific employee ID. The while loop then iterates through each of those rows and exports the PDF, naming each file with the employee ID and period end date.

I declare the employee ID we’re looking for twice, once in each batch between the ‘GO’ statements, because I’m lazy and didn’t look for a better way.

IF OBJECT_ID('dbo.tmp_for_export', 'U') IS NOT NULL
    DROP TABLE tmp_for_export
GO

--First Declaration of the Employee ID to use
DECLARE @EmployeeID VARCHAR(20) = '1234';

SELECT
    ROW_NUMBER() OVER(ORDER BY [CHECK]) AS rownum
    ,p.PDF
    ,p.employeeid
    ,CONVERT(CHAR(10),[PERIODENDDATE],126) AS PeriodEnd
INTO tmp_for_export
FROM payrollcheck p
WHERE employeeid = @EmployeeID
GO

DECLARE @EmployeeID VARCHAR(20) = '1234';
DECLARE @cnt INT
DECLARE @i INT
DECLARE @filename VARCHAR(512)
DECLARE @extension VARCHAR(20)
DECLARE @SQL VARCHAR(4000)

SET @cnt = (SELECT COUNT(*) FROM Payrollcheck WHERE EmployeeID = @EmployeeID)
SET @i = 1
SET @extension = '.pdf'

WHILE @i <= @cnt
BEGIN
    SET @filename = (SELECT PeriodEnd FROM tmp_for_export WHERE rownum = @i)
    SET @SQL = 'BCP "SELECT PDF FROM dbo.tmp_for_export WHERE rownum = '+STR(@i)+'" QUERYOUT C:\Temp\'+@EmployeeID+'-'+@filename+@extension+' -f C:\Temp\formatfile.fmt -S servername -d COMDAT -T '
    EXEC master.dbo.xp_CmdShell @SQL
    SET @i = @i + 1
END
GO

 

This resulted in a bunch of PDFs in C:\Temp on my SQL Server, which I could then grab and hand over to the payroll department.