01 November 2018

Upgrading Microsoft Orchestrator from 2012 and 2016

It was time for us to upgrade Microsoft Orchestrator to the newest 1801 version. We were three versions behind, as we had been using 2012. Luckily, starting with 1801, upgrades are performed via Windows Update.

Microsoft provides a well-documented page on setting up Orchestrator located here. The problem with upgrading from Orchestrator 2012 or 2016 is that you must uninstall the old version and reinstall the new one. The SQL Server that Orchestrator was connected to had never been documented. We could find nothing in the console about what it was connected to, the registry was useless, and the Event Viewer logs did not help. We started going through the Orchestrator logs located at %ProgramData%\Microsoft System Center 2012\Orchestrator\, where each subdirectory has a Logs folder. We finally located the log containing the SQL Server and instance Orchestrator was connecting to, at %ProgramData%\Microsoft System Center 2012\Orchestrator\ManagementService.exe\Logs. You will likely have to go through each log in that directory to find the SQL Server. The line will look like this:

  • <Param>App=<Instance>;Provider=SQLOLEDB;Data Source=<SQLServer>\<Database>;Initial Catalog=Orchestrator;Integrated Security=SSPI;Persist Security Info=False;</Param>
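Rather than opening each log by hand, you can sweep the whole directory for that connection string with Select-String. A minimal sketch, assuming the log path above (the file extension may vary):

 #Search every management service log for the SQL connection string
 Get-ChildItem -Path "$env:ProgramData\Microsoft System Center 2012\Orchestrator\ManagementService.exe\Logs" -File |
      Select-String -Pattern 'Data Source=' |
      Select-Object -Unique Path, Line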
Once we found the SQL Server information, we were able to successfully upgrade the server to 1801 using the classic uninstall/reinstall, and then move on to 1809 via Windows Update.

24 October 2018

User Logon Reporting

If you have to track the logon times for a specific user, this tool will generate a report by scanning the Event Viewer logs for event ID 4624. The tool parses each event and retrieves the user name, security ID, logon type, computer name, and time stamp. It formats the output and writes it to a centralized CSV file, in case the tool is deployed to multiple machines at once. The tool is able to 'wait for its turn' to write to the file when deployed to multiple systems.

I have the script translate what each of the logon types means. If you do not want a specific logon type reported, you can comment out that type within the switch statement and it will not appear in the report.

NOTE: I originally wrote this script to have Get-WinEvent execute remotely against a machine using the -ComputerName parameter, and the time required was huge, especially on older systems with three-plus months of Event Viewer data. It took almost 30 minutes. It ended up being much quicker to deploy the script via an SCCM package.
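For reference, the event query and logon-type translation look roughly like this. This is a minimal sketch rather than the full tool from GitHub; the property index into the 4624 event is my assumption, and the type values are the documented Windows logon type codes:

 #Query the local Security log for logon events (ID 4624)
 $Events = Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4624 }
 foreach ($Event in $Events) {
      #Translate the numeric logon type; comment out any type you do not want reported
      $LogonType = switch ([int]$Event.Properties[8].Value) { #index 8 is assumed to be LogonType
           2 { 'Interactive' }
           3 { 'Network' }
           4 { 'Batch' }
           5 { 'Service' }
           7 { 'Unlock' }
           8 { 'NetworkCleartext' }
           9 { 'NewCredentials' }
           10 { 'RemoteInteractive' }
           11 { 'CachedInteractive' }
      }
 }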

You can download the script from my GitHub site located here.


 <#  
      .SYNOPSIS  
           Logon Reporting  
        
      .DESCRIPTION  
           This script will report the computername, username, IP address, and date/time to a central log file.  
        
      .PARAMETER LogFile  
           A description of the LogFile parameter.  
        
      .NOTES  
           ===========================================================================  
           Created with:     SAPIEN Technologies, Inc., PowerShell Studio 2017 v5.4.142  
           Created on:       10/22/2018 10:13 AM  
           Created by:       Mick Pletcher  
           Filename:         LogonReport.ps1  
           ===========================================================================  
 #>  
 [CmdletBinding()]  
 param  
 (  
      [ValidateNotNullOrEmpty()]  
      [string]$LogFile = 'LogonReport.csv'  
 )  
   
 $Entries = @()
 #Filter the ipconfig output for IPv4 lines and parse out the address
 $IPv4 = foreach ($ip in (ipconfig) -like '*IPv4*') {($ip -split ' : ')[-1]}
 #Use one timestamp for all entries in this run
 $DT = Get-Date
 foreach ($IP in $IPv4) {
      $object = New-Object -TypeName System.Management.Automation.PSObject
      $object | Add-Member -MemberType NoteProperty -Name ComputerName -Value $env:COMPUTERNAME
      $object | Add-Member -MemberType NoteProperty -Name UserName -Value $env:USERNAME
      $object | Add-Member -MemberType NoteProperty -Name IPAddress -Value $IP
      $object | Add-Member -MemberType NoteProperty -Name DateTime -Value $DT
      #Display the entry on screen
      $object
      $Entries += $object
 }
 foreach ($Entry in $Entries) {
      Do {
           Try {
                #-ErrorAction Stop makes a locked file throw a catchable error
                Export-Csv -InputObject $Entry -Path $LogFile -Encoding UTF8 -NoTypeInformation -NoClobber -Append -ErrorAction Stop
                $Success = $true
           } Catch {
                #Another machine has the file open; wait a second and retry
                $Success = $false
                Start-Sleep -Seconds 1
           }
      } while ($Success -eq $false)
 }
   

17 October 2018

PowerShell One-Liners to ensure Dell system is configured for UEFI when imaging

While planning and configuring the Windows 10 upgrades, we also had to include the transition from BIOS to UEFI. I wanted to make sure that when the build team builds new models, they are configured for UEFI when applicable; otherwise the build fails within seconds after it starts.

We use Dell systems, so interacting with the BIOS is simple. Dell Command | Configure allows the BIOS to be queried, which is what we need here to verify specific models are set correctly. We do have a few models that are not compatible with UEFI, so those have to be exempted. Looking at Dell Latitude models, anything newer than the E6320 is compatible with UEFI. Granted, there may be other models we never had that could be compatible.

There are four key settings in the BIOS that determine if a system is compatible with UEFI: the Boot List Option, Legacy Option ROMs, UEFI Network Stack, and Secure Boot. I have found the most reliable of the four for verifying compatibility is the UEFI Network Stack; if a system does not have this option, it is not UEFI compatible.

I set this up as four task sequence steps within a folder called Verify UEFI. The folder performs two WMI queries to make sure it is a Dell machine and that it is not one of the five models we still have in production that are not UEFI compatible. The conditions are set up as shown in the screenshot below.


The first WMI query makes sure the system is a Dell.

  • select * from Win32_ComputerSystem WHERE Manufacturer like "%Dell%"

The second WMI query makes sure the system is not one of the specified models that are not compatible with UEFI.

  • select * from Win32_ComputerSystem WHERE (model != "Latitude E6320") and (model != "Latitude E6410") and (model != "Optiplex 980") and (model != "Optiplex 990") and (model != "Optiplex 9010")
This is the setup in MDT that I have configured

Now that the folder is set up, you will need to create each of the four Run Command Line task sequences. Before doing this, you will need to have Dell Command | Configure installed and loaded into the WinPE environment. You can refer to my blog posting that details how to load this into WinPE. 

Each one of the four tests is a Run Command Line. They will look like the pic below. All you will need to do is to copy the PowerShell one-liner code below and paste it into the command line of each task sequence.

Here is the PowerShell one-liner code for each task sequence:
  • Boot List Option
    • powershell.exe -executionpolicy bypass -command "&{If ((x:\cctk\cctk.exe bootorder --activebootlist) -like '*uefi') {exit 0} else {exit 1}}"
  • Legacy Option ROMs
    • powershell.exe -executionpolicy bypass -command "&{If ((x:\cctk\cctk.exe --legacyorom) -like '*disable') {exit 0} else {exit 1}}"
  • UEFI Network Stack
    • powershell.exe -executionpolicy bypass -command "&{If ((x:\cctk\cctk.exe --uefinwstack) -like '*enable') {exit 0} else {exit 1}}"
  • Secure Boot
    • powershell.exe -executionpolicy bypass -command "&{If ((x:\cctk\cctk.exe --secureboot) -like '*enable') {exit 0} else {exit 1}}"
As you can see, if any of these checks fail, they will return error code 1 and then fail the build.
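If you want to sanity-check a machine interactively before wiring these into the task sequence, the four checks can be rolled into one short script. A sketch, assuming cctk.exe sits at the same x:\cctk path used above:

 #Run all four UEFI readiness checks with Dell Command | Configure
 $Checks = [ordered]@{
      'Boot List Option'   = ((x:\cctk\cctk.exe bootorder --activebootlist) -like '*uefi')
      'Legacy Option ROMs' = ((x:\cctk\cctk.exe --legacyorom) -like '*disable')
      'UEFI Network Stack' = ((x:\cctk\cctk.exe --uefinwstack) -like '*enable')
      'Secure Boot'        = ((x:\cctk\cctk.exe --secureboot) -like '*enable')
 }
 #-like against the cctk output returns the matching lines, so an empty result means a failed check
 $Checks.GetEnumerator() | ForEach-Object { [PSCustomObject]@{ Setting = $_.Key; Pass = [bool]$_.Value } }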

05 October 2018

Application List Report

We have started the Windows 10 upgrades, and part of this process is installing applications for users that are not included in the standard build. One option is to use the SCCM Resource Explorer for a list of installed apps. The problem with that is it is a blanket report: it shows everything, and all we wanted was a report of the additional apps installed after a build.

I wrote this PowerShell script that can be executed as a package in SCCM against machines to generate an application report. The tool is specifically designed to work with MDT. You define both the reference and production task sequences, and the script reads their XML files so it knows to exclude those applications from the listing. Specifically, the script reads tasks that are Install Application types. There will be applications installed that you do not care about, such as video driver packages that were installed automatically; they can be filtered out by populating the add/remove programs exclusions file ($ARPExclusionsFile). There is also the task sequence exclusions file, in which you can specify items to exclude from the task sequence. The final parameter to define is $OutputDIR, the UNC path where the text file listing the additional apps to install will be written.
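The core of the approach can be sketched in a few lines. This is just the idea, not the GitHub script: MDT stores each task sequence as a ts.xml whose application steps are of type BDD_InstallApplication, and the installed list comes from the uninstall registry keys. The share and task sequence ID paths here are hypothetical, and the step names stand in for the resolved application names:

 #Applications installed by the MDT task sequence
 [xml]$TS = Get-Content -Path '\\MDTServer\DeploymentShare$\Control\<TaskSequenceID>\ts.xml' -Raw
 $TSApps = $TS.SelectNodes("//step[@type='BDD_InstallApplication']") | ForEach-Object { $_.name }
 #Applications currently in add/remove programs (64-bit and 32-bit hives)
 $ARP = Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*', 'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*' -ErrorAction SilentlyContinue | Where-Object { $_.DisplayName } | Select-Object -ExpandProperty DisplayName -Unique
 #Anything in add/remove programs that the task sequence did not install is a candidate for the report
 $ARP | Where-Object { $_ -notin $TSApps }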

You can download the script from my GitHub site located here.

Here is an example of my ARPExclusions.txt file:

64 Bit HP CIO Components Installer
Active Directory Authentication Library for SQL Server
Active Directory Authentication Library for SQL Server (x86)
Administrative Templates (.admx) for Windows 10 April 2018 Update
Adobe Refresh Manager
AMD Catalyst Control Center
AMD Fuel
Apple Application Support (32-bit)
Apple Application Support (64-bit)
Apple Mobile Device Support
Apple Software Update
Bonjour
Catalyst Control Center - Branding
Catalyst Control Center InstallProxy
Catalyst Control Center Localization All

Here is an example of my TSExclusions.txt file:

.Net Framework 3.5
Activate Office and Windows
Avenir Fonts
Bitlocker System
Configure Dell Power Management Settings
Configure NIC Advanced Properties
Delete Dell Command Configure Shortcut

07 September 2018

Robocopy User Profile Contents to UNC Path

The Windows 10 upgrades required us to move profile contents off of the machines to a file share and then move them back. This was because USMT could not be used due to the architecture changing from 32-bit to 64-bit.

This script I wrote will copy all of the pertinent data from a user profile to a specified UNC path. I made two text files to hold all of the exclusions for directories and files; they need to reside in the same directory as this script. I have added examples of the file and directory exclusion lists we used. The other variable you need to define is DestinationUNC, the path to the folder the profile will be backed up to. The script also creates a 0RobocopyLogs directory at the specified UNC path containing a log of each transfer. One more thing I added is a check that the computer name is correct, which includes whether it is online, and that the username is correct. At the end of the process, it returns the robocopy error code.
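At its heart, the script builds a robocopy command like the one below. This is a sketch with hypothetical paths; /XD and /XF carry the directory and file exclusions, and /LOG writes the per-transfer log:

 robocopy.exe "C:\Users\jdoe" "\\Server\ProfileBackups\jdoe" /E /XJ /R:1 /W:1 /XD "Application*" "*temp" "*cache" /XF "ntuser.*" "*.ost" "*tmp*" /LOG:"\\Server\ProfileBackups\0RobocopyLogs\jdoe.log"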

You can download the script from my GitHub Repository.

Below is the contents of the DirectoryExclusions.txt file.

Application*
LocalService
*Games*
NetworkService
*Links*
*temp
*TEMPOR~1
*cache
Local*
cookies

Below is the contents of the FileExclusions.txt file.

ntuser.*
*.exd
*.nk2
*.srs
extend.dat
*cache*
*.oab
index.*
{*
*.ost
UsrClass.*
SharePoint*.pst
history*
*tmp*

31 August 2018

MDT Build Application Report One-Liner

While building a new reference image, I always want to make sure every application got installed before the WIM is generated. I have done this in the past by placing a pause in the build immediately after the post-application-installation Windows Update step completes. It definitely takes time for me to go through the list of apps and verify they are there.

While researching the process, I found the ZTIApplications.log file that is generated during a build. It contains the list of all applications and the return code after each install. I wrote this one-liner to query that file and generate a report of all installs, so I can quickly scan the list and make sure everything is there before continuing with the WIM file generation. One thing I did have to do was move some Run Command Line tasks to application tasks so they would appear in the above-listed log file.

This was all achievable with a PowerShell one-liner that can be executed in the build via a Run Command Line task as shown below. 


When the one-liner executes in the task sequence, it will generate a report as shown below. The report also pauses the build until you click OK, giving you time to review the report and possibly install anything that may have failed.



Below is the one-liner you can copy and place in the task sequence.

 powershell.exe -executionpolicy bypass -command "&{$Apps=@();$ZTIAppsLog=Get-Content -Path ($env:SystemDrive+'\MININT\SMSOSD\OSDLOGS\ZTIApplications.log');$AppNames=($ZTIAppsLog|Where-Object {$_ -like '*Name:*'})|ForEach-Object {(($_.split('[')[2]).split(':')[1]).split(']')[0].Trim()};Foreach ($App in $AppNames) {$obj=New-Object -TypeName PSObject;If (($ZTIAppsLog|Where-Object {$_ -like ('*Application'+[char]32+$App+'*')}|ForEach-Object {($_.split('[')[2]).split(']')[0]}|ForEach-Object {$_ -replace 'Application ',''}) -like '*installed successfully*') {$Status='Installed'} else {$Status='Failed'};$obj|Add-Member -MemberType NoteProperty -Name Application -Value $App;$obj|Add-Member -MemberType NoteProperty -Name Status -Value $Status;$Apps+=$obj};$Apps|Out-GridView -PassThru}"

22 August 2018

Deleting Previous MDT Build Logs with this PowerShell One-Liner

If you have the SLShare variable defined in MDT to write logs to a specified UNC path, and SLShareDynamicLogging defined to write to the same path including %ComputerName%, you have probably run into the issue of the logs becoming enormous. This is because each time a system is reimaged with the same computer name, the new logs are appended to the old ones, and they do get big.

This PowerShell one-liner will delete the folder containing all of the old logs if it exists. To get this to work, you will need the full UNC path to the ZTIUtility module, which is located at %DeployRoot%\Tools\Modules\ZTIUtility. You need to explicitly define the UNC path, because the PowerShell one-liner cannot read the task sequence variable %DeployRoot% until this module is loaded.

To use this in MDT, I created a Run Command Line task as shown below.



The task was placed into the Initialization phase, so the log directory is deleted near the start of the task sequence.





Here is the one-liner. You will need to set the execution policy, as it will initially be set to restricted, and you will need to update <UNC Path to the MDTDeploymentShare> to the path of the MDT deployment share.




 powershell.exe -executionpolicy bypass -command "&{import-module '<UNC Path to the MDTDeploymentShare>\Tools\Modules\ZTIUtility';$LogDir=$TSEnv:DEPLOYROOT+'\Logs\'+$TSEnv:OSDCOMPUTERNAME;If ((Test-Path $LogDir) -eq $true) {Remove-Item -Path $LogDir -ErrorAction SilentlyContinue -Recurse -Force}}"  

14 August 2018

Profile Size Reporting

While in the middle of the planning phase for the Windows 10 rollout, we wanted a report on the size of the My Documents and Desktop folders for all users, as these are the folders we have decided to back up. USMT is not possible in our environment due to the cross-architecture migration. Plus, we want users to have new profiles for the new OS.

The first thing I thought about was writing a script that would report this data back to SCCM through custom inventory, but then I realized this is a one-time task and we will probably never look at it again. I decided to write a script that gathers the sizes of the two folders for each profile on a system and then reports them to a single CSV file.

I wrote the script so that it can be used in the new SCCM Scripts section, deployed as a package, or even executed manually. There are two lines you will need to modify: the $File variable, which needs the full path to the CSV file, and the $Exclusions list of profiles to exclude from reporting. I have included the three that are present on all systems; there are two additional ones I added for the environment I work in.

You are probably wondering why I put in the write-retry loop at the end of the script, as it may seem somewhat odd. Because a lot of systems are simultaneously competing for the same CSV file, only one can write to it at a time. To handle this, I put all content to be written inside a single variable and use [char]13 (CR) for line breaks. The script then enters a loop until $Success equals $true: each time it tries to write to the CSV file and fails because the file is locked, $Success is set to $false and it tries again.

To use this with the newer SCCM, you can enter it into the scripts section as shown below.


You can download the script from my GitHub site.

I would like to thank Mike Roberts from The Ginger Ninja for the resource on how to calculate folder sizes. That helped a lot in writing this script.

 #Full path and filename of the file to write the output to  
 $File = "<Path to CSV file>\ProfileSizeReport.csv"  
 #Exclude these accounts from reporting  
 $Exclusions = ("Administrator", "Default", "Public")  
 #Get the list of profiles  
 $Profiles = Get-ChildItem -Path ($env:SystemDrive + '\Users') | Where-Object { $_.Name -notin $Exclusions }
 #Create the object array  
 $AllProfiles = @()  
 #Create the custom object  
 foreach ($Profile in $Profiles) {  
      $object = New-Object -TypeName System.Management.Automation.PSObject  
      #Get the combined size, in bytes, of Documents and Desktop, rounded with no decimal places
      $FolderSizes = [System.Math]::Round((Get-ChildItem ($Profile.FullName + '\Documents'), ($Profile.FullName + '\Desktop') -Recurse -ErrorAction SilentlyContinue | Measure-Object -Property Length -Sum).Sum)
      $object | Add-Member -MemberType NoteProperty -Name ComputerName -Value $env:COMPUTERNAME.ToUpper()  
      $object | Add-Member -MemberType NoteProperty -Name Profile -Value $Profile  
      $Object | Add-Member -MemberType NoteProperty -Name Size -Value $FolderSizes  
      $AllProfiles += $object  
 }  
 #Create the formatted entry to write to the file  
 [string]$Output = $null  
 foreach ($Entry in $AllProfiles) {  
      [string]$Output += $Entry.ComputerName + ',' + $Entry.Profile + ',' + $Entry.Size + [char]13  
 }  
 #Remove the last line break  
 $Output = $Output.Substring(0,$Output.Length-1)  
 #Write the output to the specified CSV file. If the file is opened by another machine, keep retrying until successful
 Do {
      Try {
           #-ErrorAction Stop makes a locked file throw a catchable error
           $Output | Out-File -FilePath $File -Encoding UTF8 -Append -Force -ErrorAction Stop
           $Success = $true
      } Catch {
           $Success = $false
           Start-Sleep -Seconds 1
      }
 } while ($Success -eq $false)
   

08 August 2018

Install Dell Command Configure in WinPE

Dell Command | Configure can be of great use in the WinPE environment. It allows you to configure and/or query the BIOS before an operating system is laid down. This is easy to do.

The first thing is to determine the architecture of the WinPE environment, which determines which Dell Command | Configure build to use. On a 64-bit machine, you will have two folders under %PROGRAMFILES(X86)%\Dell\Command Configure\: x86 and x86_64. Depending on the WinPE architecture, use the appropriate directory and copy its contents to a UNC path. The files in that directory are what will be used to execute CCTK.exe.

I used to have a complete PowerShell script written to do all of the steps, but I have gotten away from that and moved to task sequence steps, so I don't have to maintain a .PS1 file. Below is a screenshot of the configuration I use for installing this.


I first map a T: drive to the UNC path containing the files. I arbitrarily chose T:. All of the commands listed below are entered and executed through the Run Command Line task sequence.

Here is a screenshot of the directory. I used to have to explicitly enter credentials with the net use command, but with recent Windows updates, that no longer works. Now all I enter is net use t: \\<UNC Path>.

The next step is copying the above files and directories to the x: drive, which contains the WinPE operating system. I create a folder in the root named CCTK. To do the copy, I use the following command.

xcopy.exe "t:\*.*" "x:\CCTK\" /E /C /I /H /R /Y /V



Next comes the HAPI driver, which is necessary for CCTK to interface with the Dell system hardware. Here is the command for installing HAPI. It will be different in an x86 environment: instead of hapint64.exe, it would be hapint.exe.

x:\CCTK\HAPI\hapint64.exe -i -k C-C-T-K -p X:\CCTK\HAPI\



Finally, I unmap the mapped t: drive by using the following command line.

net use t: /delete


This is an example of using the CCTK.exe to clear the BIOS password after the utility is installed.
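The command in the screenshot is along these lines (a hedged example; --valsetuppwd supplies the existing BIOS password, and the empty --setuppwd clears it):

 x:\CCTK\cctk.exe --setuppwd= --valsetuppwd=<current password>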


NOTE: You can update the CCTK for the WinPE environment. To do so, execute the Dell Command | Update which will also update the Dell Command | Configure utility. Once that is updated, recopy the contents as described above to the designated UNC path. 

27 July 2018

Cleaning Up and Automating the Backup of Bitlocker Passwords to Active Directory

Recently, I was reviewing the BitLocker recovery password backups. We still use Active Directory to store them, and yes, we are planning on moving to MBAM; that is a ways off, as we're in the middle of the Windows 10, Exchange 2016, and Office 2016 migrations. While looking over the AD backups, I noticed some machines had multiple recovery passwords stored due to systems being reimaged, and some had duplicates.

To solve this, I was initially going to write a PowerShell one-liner to delete the AD entry during a task sequence build for a clean slate. Going through the testing phase, I ran into other issues that required more logic, so a one-liner was out of the question. In the end, this script cleaned up our Active Directory BitLocker password entries and verified all stored passwords were valid.

This script I have written does the following:

  1. Queries the BitLocker password and ID from the system
  2. Queries Active Directory for the backed-up BitLocker ID(s) and password(s)
  3. Cycles through the Active Directory entries, deleting those that do not match the locally stored ones and removing duplicates
  4. Queries Active Directory once again for the stored ID and password to see if they match the locally stored ones
  5. If there are no entries, the info is backed up to Active Directory and the backup is verified
  6. If error -2147024809 occurs during the backup, the script checks whether BitLocker is enabled and reports that the system is not BitLockered; otherwise, an unspecified error message is displayed
  7. If the info does not exist in Active Directory, it is backed up. If it exists but does not match, the key in AD is deleted and the new key is uploaded. If there are duplicates in AD matching the locally stored key, all but one are deleted. If BitLocker is not enabled on a machine, error 3 is returned; if an unspecified error occurs, error 2 is returned. These return codes allow the script to surface issues within a build or in the Scripts section of SCCM. A minimal sketch of the core steps follows this list.
  8. If the BitLocker info matches on both the local system and in AD, the info is displayed on the screen and exit code 0 is returned.
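Here is that sketch of the core query-and-backup steps, assuming the ActiveDirectory and BitLocker modules are available (the full script on GitHub does considerably more validation and cleanup):

 Import-Module ActiveDirectory
 #Get the locally stored recovery protector (ID and password)
 $Protector = (Get-BitLockerVolume -MountPoint $env:SystemDrive).KeyProtector | Where-Object { $_.KeyProtectorType -eq 'RecoveryPassword' }
 #Get the recovery entries backed up under the computer object in AD
 $ComputerDN = (Get-ADComputer $env:COMPUTERNAME).DistinguishedName
 $ADEntries = Get-ADObject -Filter 'objectClass -eq "msFVE-RecoveryInformation"' -SearchBase $ComputerDN -Properties msFVE-RecoveryPassword
 #Back up the local protector to AD if it is not already there
 If ($ADEntries.'msFVE-RecoveryPassword' -notcontains $Protector.RecoveryPassword) {
      Backup-BitLockerKeyProtector -MountPoint $env:SystemDrive -KeyProtectorId $Protector.KeyProtectorId
 }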
This script requires domain admin access to run, as it needs access to Active Directory. The Scripts section of SCCM cannot run it, because that uses the system account; the same goes for an SCCM package. The only way this can be executed through SCCM is to implement it in a task sequence, using a Run Command Line task with a domain admin account specified under Run this step as the following account, as shown below. The same thing has to be done in MDT and/or SCCM to get this to clean up Active Directory when building a new system.



You can download the script from my GitHub site.


19 July 2018

Accessing MDT and SCCM Task Sequence Variables

While rewriting a PowerShell automation script to move machines in Active Directory, I had been trying to pass the MachineObjectOU task sequence variable to the PowerShell script within the Run Command Line task, and it constantly failed. I finally put in a cmd.exe task to pause the build and allow me to interact directly with the task sequence. What I found is that MDT and SCCM task sequence variables from a build can only be accessed under the administrator account the task sequence runs as. In the screenshot below, the command prompt on the top was launched by the task sequence with no specified account credentials, while the one at the bottom was opened using domain admin credentials. As you can see, the command prompt on the top was able to access the task sequence variable, whereas the one on the bottom was not. If a task sequence step runs under any other account, task sequence variables come back null.


17 July 2018

Moving Computers to Designated OU during Build Process

It has been four years since I published the last version of the PowerShell script that moves systems from one OU to another during a build. That version required making a task sequence for each OU, which, if there are a lot of OUs, is a very daunting process.

In this new version, I have done two things to improve it. The first is making it a one-liner, so you don't have to maintain a script on some share. Second, it can now move systems to any OU using the one-liner, thereby cutting down on the number of required task sequences.

To use this method, RSAT must be installed on the system. I have RSAT as part of our reference image, so it is already present when the reference image is laid down. The next step is to create the task sequences; three are required. The first gets the OU that was selected in the initial build screen by querying the MachineObjectOU task sequence variable. Task sequence variables are only accessible to the administrator account; if you try to access them from any other user account, the output is null. This is why three task sequences are required for this process. To pass the MachineObjectOU to the next task sequence, which moves the system, I have the first one write the OU to the text file OU.txt located in the c:\ directory.

This is the one-liner for creating the file containing the OU to move the system to:

 powershell.exe -executionpolicy bypass -command "&{$TSEnv = New-Object -ComObject Microsoft.SMS.TSEnvironment;$TSEnv.Value('MachineObjectOU') | out-file c:\OU.txt}"  



The next step is moving the system. This one-liner will read the OU from the text file, check whether the system is already in the desired OU, and move it if it is not. Lastly, the one-liner checks that the system ended up in the correct OU after the move. It exits with 0 if successful and 1 if unsuccessful.

This is the one-liner for moving the system:

 powershell.exe -executionpolicy bypass -command "&{Import-Module ActiveDirectory;[string]$CurrentOU=((Get-ADComputer $env:ComputerName).DistinguishedName.substring((Get-ADComputer $env:ComputerName).DistinguishedName.IndexOf(',')+1));[string]$NewOU=Get-Content c:\OU.txt;If ((Get-WmiObject Win32_Battery) -ne $null) {$NewOU=$NewOU.Insert(0,'OU=Laptops,')};If ($CurrentOU -ne $NewOU) {Move-ADObject -identity (Get-ADComputer $env:ComputerName).DistinguishedName -TargetPath $NewOU};$CurrentOU=((Get-ADComputer $env:ComputerName).DistinguishedName.substring((Get-ADComputer $env:ComputerName).DistinguishedName.IndexOf(',')+1));If ($CurrentOU -eq $NewOU) {Exit 0} else {Exit 1};}"  



Lastly, the third task sequence deletes the OU.txt file. There are no WMI queries required in the newly updated script, as there were in the old one.

 powershell.exe -executionpolicy bypass -command "&{Remove-Item -Path c:\OU.txt -Force}"  



This is how I have it in the task sequence hierarchy:



11 July 2018

Google Chrome Installation One-Liner

Instead of keeping up with the latest Google Chrome installers, I wanted a one-liner that downloads Chrome from the Google URI and then silently installs it during the reference image build. This one-liner does just that: it downloads the latest Chrome version and installs it. You no longer have to spend time keeping up with the latest installer, and the code is compacted into a one-liner that lives only in the task sequence.

I placed the $URI variable at the beginning, as shown below, so if it ever needs to be changed, it is easily accessible within either SCCM or MDT. If the download site changes and the installer fails, the error code is returned so you are aware of the issue.

As you can see below, I used a Run Command Line task sequence to incorporate this into the build. 


NOTE: The URI in the one-liner downloads the Chrome 64-Bit public version. You will need to update the URI if you use a different version.

Here is the one-liner:


 powershell.exe -executionpolicy bypass -command "&{$URI='http://dl.google.com/chrome/install/375.126/chrome_installer.exe';$ChromeInstaller=$env:TEMP+'\ChromeInstaller.exe';Invoke-WebRequest -Uri $URI -OutFile $ChromeInstaller -ErrorAction SilentlyContinue;$ErrCode=(Start-Process -FilePath $ChromeInstaller -ArgumentList '/silent /install' -Wait -Passthru).ExitCode;Remove-Item $ChromeInstaller -ErrorAction SilentlyContinue -Force;Exit $ErrCode}"

28 June 2018

Shortcut GPO for Root Network Share

If you have tried adding a GPO to create a shortcut to a root network share, \\contoso for instance, you have probably seen that it does not work. The solution is to use a Shell Object, not a File System Object. Once you select Shell Object, click Browse for Shell Object and select Network. This will take some time while it scans the network. In my experience, I have to do this twice: the first time only shows a couple of items under Network; when I click cancel, browse again, and click on Network, the full list appears, at which point I can select the network share I need the shortcut to point to. In the example below, once I browsed for and selected the shell object on the network, the shortcut was successfully created.



14 June 2018

Office 2016 Outlook Cannot Log On Upon the First Launch

While building the new Windows 10 image with Office 2016, along with Exchange 2016 on the backend, I got the following error message every time I tried to open Outlook for the first time.


What was so frustrating about this issue was that if I went into Mail within the Control Panel and deleted the existing profile, Outlook would open and configure with no problems.

The first thing I did was check with the Exchange admins to make sure autodiscover was enabled; it was. The second was to make sure the newly imaged machine could ping the Exchange servers; it could. While working through the troubleshooting process, I learned that even though PRF file creation still exists in the Office Customization Tool, it is not valid for Office 365/2016. At that point, I went back in and removed the PRF content from the customization tool (.MSP file), but the issue persisted. Finally, I found a PRF file in the following directory: c:\Program Files (x86)\Microsoft Office\. After making changes to the contents of the file, the problem still persisted.

The final fix was to delete that PRF file (OutlookProfile.PRF), shown below. It may or may not exist in other environments. How the file was being created, I have no idea; I did not find it anywhere in the directory tree of the Office 2016 installer, and the MSP file had been completely recreated from scratch to make sure no remnants were left over from the original MSP that did contain parameters for generating a PRF file. The only other fix, which I did not want to do, was to create an entirely new installation tree from a fresh download of Office 2016. My fix was adding a line to my PowerShell installer to delete the PRF file; it may also be possible to include that deletion within the Office Customization Tool.
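The deletion line in the installer is nothing more than this (the path and file name are the ones found above):

 Remove-Item -Path "${env:ProgramFiles(x86)}\Microsoft Office\OutlookProfile.PRF" -Force -ErrorAction SilentlyContinue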


How to Remove Bulk Facebook Profile Content

Here is a video guide on how to remove bulk Facebook content from your Facebook profile. Some will just say to delete the profile and create a new one, but that can be rather troublesome, especially if you have a lot of Facebook contacts and/or are well known in certain areas of expertise. In the video, I go through the easiest way I found to delete Facebook posts and likes, including several gotchas in the process and how the Chrome add-in works. I also include valid reasons why you might want to clean up the profile.

Reasons for removing bulk Facebook content:

  • International travel where certain posts may be illegal in other countries
  • A potential employer may want to look at your Facebook profile
  • You broke up with someone and want all associated content deleted
  • You are applying to colleges and are concerned they may look at your Facebook content that may have an impact on admissions
Here are some links that pertain to some of the content of the video:


01 June 2018

Filtering out Windows Activations When Imaging from Test MDT Share or Task Sequence

The environment I work in uses MAK activation instead of KMS, which means we have a set number of MAK activations given by Microsoft. One of the issues is that you can burn through quite a few when you need to build a new image repeatedly while working through problems.

In my environment, we do have SCCM, but we use MDT for imaging. We have two MDT deployment shares, one for testing and one for production. I use a PowerShell script to activate the Office and Windows licenses. My test task sequences are exactly like my production task sequences once they are synchronized after testing; I don't like having any differences between them. To stop my test share from activating every time I image, I came up with the following solution.

In the task sequence(s) used to activate Windows and/or Office, I implemented a Task Sequence Variable condition under the Options tab. In the Variable field, enter DEPLOYROOT. The Condition field should be equals. Finally, the Value field should contain the UNC path to your production MDT share. This stops the step from executing when building from your test share.


Although I have not tried this, if you have only one deployment share, I think you can add a Task Sequence Variable condition with TaskSequenceID as the Variable, equals as the Condition, and the name of your production task sequence as the Value. This should stop that step from executing when run from the test task sequence.

21 May 2018

Fix for FAILURE (9705): Unable to find USMT file, cannot capture/restore user state

Recently, I upgraded to the new MDT 6.3.8450.1000. It was a fresh install of the new MDT product, in which I created an entirely new MDT share for our Windows 10 build. I had done an in-place upgrade on the other MDT share containing the Windows 7 builds, and it busted all of them, at which point I had to do a restore.

After creating the new task sequences, applications, drivers, OS, and packages, I went to capture my first Windows 10 image and got the error below.


The third line down was the key to the error. I went in and looked at the ZTIUserState.wsf and saw this line:
sScanStateFolder = oEnvironment.Item("DeployRoot") & "\Tools\" & sUSMTArchitecture & "\" & sUSMTVersion

I went to the MDT deployment share and under it to Tools\x64\USMT5. There was nothing in that directory. I then looked at Tools\x86\USMT5, and there was nothing there either. It was evident that this was the cause. To fix it, I needed to download the latest Windows ADK, which first required uninstalling the current version. During setup, I checked the User State Migration Tool (USMT) option. Once it completed, I copied the USMT x86 and amd64 folders (a sketch of the copy follows the two lists below) from

  • C:\Program Files (x86)\Windows Kits\10\Assessment and Deployment Kit\User State Migration Tool\x86
  • C:\Program Files (x86)\Windows Kits\10\Assessment and Deployment Kit\User State Migration Tool\amd64
to the following directories

  • %MDTDeploymentShare%\Tools\x86\USMT5
  • %MDTDeploymentShare%\Tools\x64\USMT5
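Here is that sketch of the copy, assuming the default ADK path and a hypothetical deployment share UNC:

 robocopy.exe "C:\Program Files (x86)\Windows Kits\10\Assessment and Deployment Kit\User State Migration Tool\x86" "\\MDTServer\DeploymentShare$\Tools\x86\USMT5" /E
 robocopy.exe "C:\Program Files (x86)\Windows Kits\10\Assessment and Deployment Kit\User State Migration Tool\amd64" "\\MDTServer\DeploymentShare$\Tools\x64\USMT5" /E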
Once I did this, the task sequence was able to capture an image with no problems.

17 May 2018

Explaining the %WINDIR%\Installer Folder

While recently writing an MSI uninstaller script, I needed to associate the GUID with the application's name. In the process, I was finally able to associate the MSI files inside the Installer folder with their applications.

You may wonder why you would want to know this. Most of it is probably just general knowledge, but there are a couple of instances where it comes in handy. For one, I used it as described above. You may also want to know this if you are checking whether the MSI is available to do an uninstall or a repair.

The MSI and MSP files inside the %WINDIR%\Installer folder are associated with the installed application or update in the following registry location:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData\S-1-5-18\Products\<Unique Identifier>\InstallProperties

The screenshot below shows an example of the unique MSI filename in the %WINDIR%\Installer directory associated with the installed application. The association is made under the LocalPackage value.
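A short sketch of walking that registry location and pairing each cached MSI with its product name, using the per-machine (S-1-5-18) products branch shown above:

 #Map each cached MSI in %WINDIR%\Installer to its product name
 $Products = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData\S-1-5-18\Products'
 Get-ChildItem -Path $Products | ForEach-Object {
      $Props = Get-ItemProperty -Path ($_.PSPath + '\InstallProperties') -ErrorAction SilentlyContinue
      If ($Props.LocalPackage) {
           [PSCustomObject]@{ DisplayName = $Props.DisplayName; LocalPackage = $Props.LocalPackage }
      }
 }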


07 May 2018

Blank Screen after Enabling Secure Boot in the BIOS

I am working on the Windows 10 image, and part of this project is converting to UEFI. We do have several older systems still in production, because systems used as temporary or loaner machines aren't as important to keep up to date. I use the oldest model when creating a reference image so I am sure the image will work across all models.

I first started with a Dell Optiplex 990 and quickly realized it was not compatible with Windows 10 UEFI because of missing BIOS features. I then moved up to a Dell Optiplex 9010, which included secure boot. Once I set the BIOS to UEFI and rebooted, there was no more screen; the monitor was blank. The first thing I tried was turning off the machine, unplugging the power cord, holding the power button for 15 seconds, removing the battery and holding the power button for 30 seconds, and then reinstalling the battery before turning the machine back on. This did not reset the BIOS, and the screen was still blank.

The next thing I did was remove the DVI cable and replace it with a VGA cable plugged into the built-in motherboard video port. When I powered on the system, I got the following screen.


The instructions provided do not work. The video card I experienced this with was the AMD Radeon HD 6350.

What I did to resolve this was to completely remove the video card, connect the VGA cable, and then power the system up. After that, I got the following screen.


Once I got this screen, I was able to go into the BIOS. It ended up being two settings. The first was Enable Legacy Option ROMs, which must be turned off for secure boot to be enabled. It was this option that actually caused the blank screen when using the video card.


The second option is the Secure Boot Enable. To use this, Enable Legacy Option ROMs has to be turned off.



Once I set these back to the defaults, the system was able to boot up and display on the monitor using the DVI video card. The problem is not the computer, whether it is an Optiplex 9010 or higher; it is the video card, which cannot support secure boot in Windows 8 or higher due to the lack of UEFI Option ROM drivers, as described by Dell and ZDNet. The solution would be to replace the video cards with ones that support UEFI Option ROM drivers.

04 May 2018

MDT Not assigning the correct Drive Letter to the Windows Primary Partition with UEFI

When I installed the new MDT 6.3.8450.1000 to build the deployment share for Windows 10 1709, I ran into issues with the operating system deployment. In the process of building out the new task sequence, I also decided to convert over to UEFI. The OS was laying down, but it was installed under the wrong drive letter, D: instead of C:. After trying many things, I finally decided to abandon the Format and Partition Disk task provided by Microsoft and create my own using PowerShell.

The first step is to create the text configuration file for diskpart. I could have created the file ahead of time and had the build point to it, but I would rather have PowerShell do it. After researching the failed build and other issues, I found the following drives need to be created:


  • Boot (EFI) with the drive letter W:
  • (MSR) with no designated drive letter
  • Windows (Primary) with the drive letter C:
  • Recovery (Recovery) with the drive letter E:
To achieve this, I use two task sequences in MDT, as shown below.

The first task consists of the PowerShell one-liner in a Run Command Line, shown below. The one-liner creates the DiskpartHDD.txt file on the WinPE boot drive (X:). The numbers I used for the sizes came from MDT.


This is the PowerShell one-liner within the task sequence:

powershell.exe -command "&{new-item -Path X:\WINDOWS\system32\DiskpartHDD.txt -ItemType File -Force; add-content -value 'select disk 0' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'clean' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'convert gpt' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'create partition efi size=499' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'format quick fs=fat32 label=System' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'assign letter=W' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'create partition msr size=128' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'create partition primary' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'shrink minimum=50000' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'format quick fs=ntfs label=Windows' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'assign letter=C' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'create partition primary' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'format quick fs=ntfs label=WinRE' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'set id="de94bba4-06d1-4d40-a16a-bfd50179d6ac"' -path X:\WINDOWS\system32\DiskpartHDD.txt; add-content -value 'assign letter=E' -path X:\WINDOWS\system32\DiskpartHDD.txt}"

As for the Diskpart task sequence, I used the following Run Command Line task sequence:
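The step simply feeds the answer file to diskpart; the command line is along these lines (matching the file path created above):

 diskpart.exe /s X:\WINDOWS\system32\DiskpartHDD.txt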


This is all that is required for creating the partitions. I disabled the steps provided by Microsoft, and the build now assigns drive C: to the Windows operating system.

23 April 2018

Oracle Java Runtime Installer

As often as Java must be updated, I wanted an auto installer that would make the update a breeze. The installer first determines whether the system is x86 or x64. It then uninstalls the old version and installs the x86 version if the system is 32-bit, or both the x86 and x64 versions if the system is 64-bit. The parameters are the same for both versions, so you only have to define them once.

When a new version is released, all you need to do is swap out the installer executables and update the package in SCCM. The script finds the executable associated with each architecture, as Oracle includes the architecture in the filename.
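As a rough sketch of the selection logic (hypothetical filenames; Oracle's installers embed the architecture, such as -i586 for 32-bit and -x64 for 64-bit, and accept the /s silent switch; the uninstall step is omitted here):

 #Pick the installer(s) matching the OS architecture and run them silently
 $Installers = If ([Environment]::Is64BitOperatingSystem) {
      Get-ChildItem -Path $PSScriptRoot -Filter 'jre-*.exe'
 } else {
      Get-ChildItem -Path $PSScriptRoot -Filter 'jre-*-i586.exe'
 }
 foreach ($Installer in $Installers) {
      (Start-Process -FilePath $Installer.FullName -ArgumentList '/s' -Wait -PassThru).ExitCode
 }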

You can download and view the installer code from my GitHub site.


10 April 2018

Inno Setup PowerShell Uninstaller

I recently encountered an application that uses the Inno Setup installer. Part of my process when I deploy an application is to also create an uninstaller. While creating the uninstaller, I decided to make a function for uninstalling Inno Setup installed applications.

The way I have written this function, you need to use the exact name displayed in Add/Remove Programs for the AppName parameter. The function then queries the Add/Remove Programs registry entries to get the quiet uninstall string and executes it.
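A minimal sketch of the idea, with a hypothetical function name (the real function on GitHub may differ); Inno Setup writes a QuietUninstallString value under its uninstall key:

 function Uninstall-InnoSetup {
      param ([string]$AppName)
      #Search the 64-bit and 32-bit uninstall keys for the exact display name
      $App = Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*', 'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*' -ErrorAction SilentlyContinue | Where-Object { $_.DisplayName -eq $AppName }
      If ($App.QuietUninstallString) {
           #QuietUninstallString is typically "...\unins000.exe" /SILENT
           (Start-Process -FilePath 'cmd.exe' -ArgumentList ('/c ' + $App.QuietUninstallString) -Wait -PassThru).ExitCode
      }
 }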

You can download the function from my GitHub site.


03 April 2018

One-Liner that Updates the Dell Application Component Updates in the Reference Image

While building out the Windows 10 reference image task sequence, it dawned on me that I should be making sure the latest Dell application component updates are installed. Since this is a reference image, having the system drivers up to date is not essential to me, because they will be stripped during the Sysprep process. This does require that the Dell applications already be installed before executing this one-liner.

I devised this one-liner that can be implemented as a Run Command Line task to check for the latest application component updates only. To limit it to just application component updates, you will need to open the Dell Command | Update GUI application to create an XML file for the command line to reference. Once in the GUI app, click the Settings icon, then Update Filter. Under Recommendation Level, I checked everything. Under Update Type, I checked Application Software; everything else is left unchecked. Configure every other settings tab the way you want. Now click Import/Export, then Export, and export the XML to a UNC path the one-liner below can access. You can also download the XML file I use from my GitHub site.

As for the one-liner below, update <UNC Path> to the location of the Applications.XML file. It does not need to be called Applications.XML; that was my choice.


 powershell.exe -command "&{If ((Test-Path 'C:\Program Files\Dell\CommandUpdate\dcu-cli.exe') -eq $true) {$ExitCode = (Start-Process -FilePath 'C:\Program Files\Dell\CommandUpdate\dcu-cli.exe' -ArgumentList '/policy \\<UNC Path>\Applications.xml' -Wait -PassThru).ExitCode} elseif ((Test-path 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe') -eq $true) {$ExitCode = (Start-Process -FilePath 'C:\Program Files (x86)\Dell\CommandUpdate\dcu-cli.exe' -ArgumentList '/policy \\<UNC Path>\Applications.xml' -Wait -PassThru).ExitCode};Exit ($ExitCode)}"  

Putting this into MDT or SCCM is easy. Once you have the one-liner customized and tested, copy and paste it into a Run Command Line task sequence as shown below. That is all it takes to implement this.


23 March 2018

KB4088878 Patch for Spectre and Meltdown on Windows 7 x86 and x64 systems

Recently, Dell released the BIOS updates covering systems with Intel Family 6 Model 42 and later processors. This is the first part of the patching process. The second part is to apply all Windows updates; I also included all optional updates, which was my personal preference. The third step is to apply the appropriate KB4088878 patch.

The first two systems I applied these patches to, Dell Optiplex 990s with Windows 7 64-bit, were successful. GRC's InSpectre tool was executed and returned the following.


The next two failed. These systems were Windows 7 32-bit installed on Dell Optiplex 990s with 64-bit processors. The BIOS was patched with the latest A23 version Dell had published, and all Windows updates were installed. When windows6.1-kb4088878-x86_7512ab54d6a6df9d7e3d511d84a387aaeaeef111.msu was applied, the following crash screen appeared when the OS booted back up.


One tactic I tried was configuring the registry to clear the page file when the system shuts down, by changing the value of HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management\ClearPageFileAtShutdown to 1. The next thing I did was boot the system into safe mode to execute the patch; I got the following message.
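For reference, that registry change can be made with a one-liner in the same style as the others on this page:

 powershell.exe -command "&{Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management' -Name ClearPageFileAtShutdown -Value 1}"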


In conclusion, the only solution is to have the hardware architecture match the OS architecture. If they match, applying the appropriate patch will be successful.

Here is a note on patching: applying the latest BIOS alone does not pass the GRC InSpectre test. The Microsoft OS patch must also be applied for the system to pass.