Updating Trusted Sites in a User Friendly Way

Everyone knows that when you edit trusted sites through Group Policy, you are locking down the configuration. Once that happens, it seems like the entire internet needs to become a trusted site for one reason or another.

The correct way of editing trusted sites is User Configuration > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page > Site to Zone Assignment List, as noted here – http://blogs.msdn.com/b/microsoft_press/archive/2014/04/14/from-the-mvps-setting-internet-explorer-trusted-site-settings-via-group-policy-object-in-windows-server-2012-r2.aspx

But if you don’t want to bog yourself down with every single site that any of your users plan to use, try doing it a little differently, though it may not be fully supported.

If you use a Registry Group Policy Preference (User Configuration > Preferences > Windows Settings > Registry), you can add sites to the trusted sites list and still allow users to edit the list, adding the other sites that are important to them.

Take this example of adding https://manage.windowsazure.com to Trusted Sites.

[Image: IETrustReg – Registry Group Policy Preference item]
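The screenshot is not reproduced here, but for reference, this is a sketch of the equivalent registry change the preference item makes – the ZoneMap path is the standard per-user location for zone assignments and a value of 2 maps to the Trusted Sites zone:

# Sketch of the registry values behind the preference item (2 = Trusted Sites zone)
$path = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\windowsazure.com\manage'
New-Item -Path $path -Force | Out-Null
New-ItemProperty -Path $path -Name 'https' -Value 2 -PropertyType DWord -Force | Out-Null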

Adding sites this way is a little more raw than using the other methods, but it caters better to your users’ needs. Also consider that you can remove sites from this location, which lets you make sure users are not able to add certain sites (or that those sites get removed upon refresh), while they still keep some of the usability and control that makes them more comfortable with the management you provide.

Posted in Group Policy | Tagged | Leave a comment

Active Directory Administrative Center Search Crashing Fixed

For those of us using the Active Directory Administrative Center on Windows 8.1 through the RSAT and connecting to an older domain, like Windows 2008 R2 AD DS, any time you tried to search, the ADAC (Active Directory Administrative Center) would crash hard. This also broke the ability to run Get-ADUser like this.
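The original example is not shown in the post; an ordinary filtered query along these lines (purely illustrative) was the kind of call affected:

Get-ADUser -Filter { Name -like "*smith*" } -Properties DisplayName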

Looking at the ADAC, you will now see the “Global Search” again and it will work. If your schema had been more up to date, this would not have been an issue.

[Image: dscas – Global Search example]

With Windows 8.1 Update, all of this is working for me now!

Hopefully I was not the only one experiencing this.

Thanks!

Posted in PowerShell | Leave a comment

Snover’s Puzzle Starting to Show with PowerShell V5 – SCCM

With the PowerShell V5 Preview, we get things such as OneGet and Network Switch Management. These are two things on different tracks that will meet each other as the picture becomes more obvious.

If you think about DSC and OneGet together, you have all of SCCM except one big feature: reporting. All reporting has ever been is a complex working of WMI. Look forward to, say, SCCM 2015 and here is what you will get.

SCCM Configuration Management – PowerShell DSC
SCCM Software and Updates Deployment – OneGet using SCCM Software Repo
SCCM Reporting – Cloud service endpoint to upload Get-CIM* data

This brings me to the data collection part and where I think all this could be going.

  • No-client SCCM deployments. With PowerShell on the endpoints (computers, switches, routers, storage controllers, and so on), why do you need an SCCM client? Some of these endpoints could be simple data pulls and pushes from an engine like SMA. Though a network switch does not have the software to automatically push into a web service, PowerShell combined with managed workflows can collect, test, and do whatever is needed on the endpoint.
  • For Windows, data collection could be done solely using scheduled tasks. In SCCM now, the MOF extensions to WMI are used to help format the data (from what I can tell – not an expert). Using PowerShell and the CIM cmdlets, it would be possible to create a local workflow that can capture, transform, and upload the data to a collection point, as sketched after this list.
  • Service discovery can be done at the beginning of the scheduled task using a REST call as a locator, embedded in your internal or Microsoft Azure deployment. This endpoint would provide the client with the needed collections, configuration changes, and anything else you can dream up.
  • Your OneGet repo can be secured and trusted by the client so that it would be available anywhere.
  • Customizations to the engine processes are easy enough for any knowledgeable PowerShell staff member.
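
As a rough illustration of that collection idea (the endpoint URL and the property list here are made up), a scheduled task could run something like this:

# Gather a few inventory facts with the CIM cmdlets, then push them to a
# hypothetical collection endpoint as JSON.
$inventory = [pscustomobject]@{
    ComputerName    = $env:COMPUTERNAME
    OperatingSystem = (Get-CimInstance -ClassName Win32_OperatingSystem).Caption
    Memory          = (Get-CimInstance -ClassName Win32_ComputerSystem).TotalPhysicalMemory
    Disks           = Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DriveType=3" |
                          Select-Object DeviceID, Size, FreeSpace
}
$body = $inventory | ConvertTo-Json -Depth 3
Invoke-RestMethod -Uri 'https://inventory.contoso.com/api/collect' -Method Post -Body $body -ContentType 'application/json'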

You get the idea. Thanks for reading!

Posted in PowerShell | Leave a comment

PowerShell DSC need for v5 official or v6 – Pull Test

As someone who has enjoyed PowerShell since the early days just before the first release, I think the tool has become an amazing asset in the last few increments. Features like Workflows and Desired State Configuration are such a change for the industry! Only a few of us small-time developers could reach this kind of level in the past with huge VBScripts or little VB.Net programs. Now it is so easy to fan out and in that I cannot imagine a world without it.

DSC is a crowning feature, but with it comes the need for a pull server that can test all possible configurations on a server to create a resulting configuration. All resources in the pull server’s catalog have a test that must be passed, so many of these configurations can be tested and stored as a snapshot of the new system’s configuration. Now with OneGet you can also test a server or system for installed software, which is a big thing.

Assume that I have a pull server, and on this server I have configurations stored for a few hundred or thousand systems. If a cmdlet could be created to read all of the configurations for all of the existing servers, it would be able to compare them to any running system it has access to and build a desired state for future reference. It would be entirely possible for the Local Configuration Manager on a system to generate the basics like services, WindowsFeatures, and OneGet packages without the need for a pull server to tell it how to test, other than to request the results. There might be a special switch on a Configuration block to tag it as a testing collection. With a testing configuration, the server could be intelligent and find the matching servers. This could be for AD, Exchange, Lync, Web, and many more roles that have a base configuration your server is set to identify, allowing you to baby-step your way up to a desired state with servers that are already running. Then you can take the next step to a fluid environment ready to handle anything.
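
Nothing like this ships today, but a very rough sketch of the idea – reading a running server’s installed features and services and emitting the skeleton of a DSC configuration for review – could look something like this (the function name and output format are mine):

# Rough sketch: generate a baseline DSC configuration script from a running server.
# Requires the ServerManager module for Get-WindowsFeature.
function New-BaselineConfiguration {
    param([string]$ComputerName = $env:COMPUTERNAME)

    $features = Get-WindowsFeature | Where-Object { $_.Installed }
    $services = Get-Service | Where-Object { $_.Status -eq 'Running' }

    $lines = @("Configuration Baseline_$ComputerName {", "    Node '$ComputerName' {")
    foreach ($f in $features) {
        $lines += "        WindowsFeature '$($f.Name)' { Name = '$($f.Name)'; Ensure = 'Present' }"
    }
    foreach ($s in $services) {
        $lines += "        Service 'Svc_$($s.Name)' { Name = '$($s.Name)'; State = 'Running' }"
    }
    $lines += '    }'
    $lines += '}'
    $lines -join [Environment]::NewLine
}

# Review the generated text, then save and dot-source it to compile a MOF.
New-BaselineConfiguration | Out-File C:\DSC\Baseline.ps1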

Though something like this may end up being completely a third-party add-on, it would help the onboarding process for DSC, which is already growing rapidly.

Posted in Uncategorized | Leave a comment

Windows PowerShell Best Practices by Ed Wilson, Microsoft Press

If you have been around Microsoft scripting and automation technologies for any amount of time, you will recognize Ed Wilson as the center of the community. In this book he covers PowerShell in such a way that any beginner or expert will learn something new. While letting you feel like you are just learning best practices, the sidebars from many in the PowerShell community give you some real-world understanding of PowerShell and great ideas that will help you with your own projects. I especially enjoyed the section “Handling missing WMI providers” because it never fails that while trying to automate some process for a large company, you run into computers that have some sort of WMI error that can keep you from finishing. Make sure to pay close attention to chapter 8, which covers how to design your script! If you design your script properly, it helps you focus on the problem you are trying to solve and get it fixed properly. Also, a well-written script is easier to share with others so they can be more productive as well. This is a foundational book on PowerShell for anyone who is just starting out or wanting to make better, more supportable scripts. I think this is a great book for anyone using PowerShell.

Check it out yourself at http://shop.oreilly.com/product/0790145347268.do


Posted in PowerShell | Tagged | Leave a comment

Windows Update Home Brew

Last night was an odd occurrence: many of our servers were to skip a round of Windows Updates because in a few days we have a very large go-live of a new product. Management just did not want to take any chances that it would cause something to go wrong at the last minute. As one of the primary Active Directory administrators, I let them know that the Domain Controllers needed to be patched anyway. They only run MS software, and it is critical to more than just a few applications. Other than that, I did not want to explain to a different team why our security report all of a sudden turned all shades of red. My real problem started when all of the updates in SCCM had their deadlines moved a month ahead and I could not depend on getting all that set up in time. I told them I would just handle it.

Since I had to wake up at 2AM for the maintenance window, I did a quick Bing for Windows Update and PowerShell, knowing there had to be something these days. What I found was an article by the Windows Scripting Guy about this script/module from the TechNet Repository, so I bookmarked it and went off to sleep.

When 2AM rolled around, I got the module, started looking around, and found all sorts of goodness. There are two commands that I found very useful to get my job done: Get-WUInstall, which will start an install process, and Invoke-WUInstall, which will create a scheduled task on the target system to run Get-WUInstall. If you have a good background in the Windows Update process, you will know that the actual execution must be run from the local system. Due to security, it is not possible to directly invoke the Windows Update process through a second hop. After running a few quick tests, I realized that I must have the PSWindowsUpdate module on the remote system and the ability to execute the PowerShell to get the job done. By 2:30, I had created the code below and my first Domain Controller was under way.
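
The actual code was in a screenshot, so here is just a reconstruction of its general shape – it assumes the ActiveDirectory module for the DC list, a local copy of the PSWindowsUpdate module under C:\Scripts, and reachable admin shares:

# Copy the PSWindowsUpdate module out to each DC, kick off the updates via a
# scheduled task, and stagger the runs so the DCs do not all reboot at once.
$DCs = (Get-ADDomainController -Filter *).HostName
foreach ($DC in $DCs) {
    Copy-Item -Path 'C:\Scripts\PSWindowsUpdate' -Recurse -Force `
        -Destination "\\$DC\C$\Windows\System32\WindowsPowerShell\v1.0\Modules\PSWindowsUpdate"
    Invoke-WUInstall -ComputerName $DC -Script {
        Import-Module PSWindowsUpdate
        Get-WUInstall -AcceptAll -AutoReboot | Out-File C:\PSWindowsUpdate.log
    }
    Start-Sleep -Seconds 90   # keep the Domain Controllers from rebooting together
}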

This is a pretty raw way to get it done, but you must understand that there was a little crunch time here. The Start-Sleep is just making sure that I am not rebooting all of the Domain Controllers at the same time. After running this on one system and confirming it worked, I just let it do its thing. It allowed me to update 32 Domain Controllers in 3 different states in about 45 minutes, manually, on the fly, right out of a dead sleep. And if you needed to get many more systems done the same way, it is completely possible.

After returning to bed and waking again a few hours later to start the day, I did a quick check of the PSWindowsUpdate.log file on all of the systems to confirm things were good. It was easy to delete the PSWU folder, remove the scheduled task, and move the PSWindowsUpdate.log file to an archive location with standard PowerShell.
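
The cleanup was roughly along these lines (the archive share, the PSWU folder path, and the scheduled task name shown here are placeholders):

# Archive the log, drop the scheduled task, and remove the working folder on each DC.
foreach ($DC in $DCs) {
    Move-Item -Path "\\$DC\C$\PSWindowsUpdate.log" -Destination "\\FileServer\Archive\$DC-PSWindowsUpdate.log"
    schtasks.exe /Delete /S $DC /TN 'PSWindowsUpdate' /F      # task name assumed
    Remove-Item -Path "\\$DC\C$\Windows\PSWU" -Recurse -Force -ErrorAction SilentlyContinue
}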

The module also contains Get-WURebootStatus to let me know if there was a system that needed an extra reboot or had issues rebooting. Thanks, Michal Gajda – this module is great!

One nice little thing to remember is that Invoke-WUInstall does not have the sole role of kicking off Windows Updates. With the Script parameter, you can really make it do whatever you want.
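
For example (my own illustration, not from the original post), the same remote scheduled-task mechanism can run any script block you hand it:

# Run an arbitrary script block on the target via the same scheduled-task trick.
Invoke-WUInstall -ComputerName Server01 -Script {
    Get-Process | Sort-Object WS -Descending | Select-Object -First 10 | Out-File C:\TopProcesses.log
}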

Posted in PowerShell | Comments Off

Workflow: Check the Checkpoint

I was testing some workflow processes the other day and ran into something I thought was odd; I hope I can explain it well here.

The issue is that when you create a workflow, if you do not plan to use the -PSPersist switch when calling the workflow, make sure you spend plenty of time figuring out where you are going to put your Checkpoint-Workflow activities in your workflow. Take the below script for example.
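
The script itself was an image in the original post; this is a reconstruction consistent with the behavior described below (the exact sleep lengths are a guess):

workflow Test1 {
    Start-Sleep -Seconds 30
    $i = 1
    $i                       # first output
    Start-Sleep -Seconds 30
    $i = 2
    $i                       # second output
    Checkpoint-Workflow
    Start-Sleep -Seconds 30
    $i = 3
    $i
}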

You would think that if I ran the workflow by calling Test1 -AsJob and immediately thereafter called Suspend-Job -Id .., I would get no output. That is NOT the case! When I call Suspend-Job -Id .., the job goes into a “Suspending” state until it reaches the Checkpoint-Workflow activity. This gives me an output of 1 and 2 after 60 seconds of execution.

If I were to call the workflow with Test1 -PSPersist:$True -AsJob and immediately call Suspend-Job -Id .., then, oddly, I actually get an output of 1. This is odd because the job will checkpoint to disk after every activity, allowing for a suspension, but note that it did not actually stop after the currently running activity; it stopped before the next activity, the second Start-Sleep. The $i = 1 and $i lines were allowed to execute before suspension.

The reason you need to take real care with where you put your checkpoint activities is that the data or objects gathered in one step may require completion of another step or activity. In addition, Step A may be useless if Step B does not run within a few minutes. This means that using -PSPersist could be a bad thing if the workflow were interrupted just after Step A.

When designing your workflow, just be careful and plan your activities and checkpoints at the same time; it can be difficult to retrofit once the workflow is in use and a few versions down the road.

Enjoy.

Posted in PowerShell | Comments Off

Workflow – Persist to Disk

When using workflows, it is common to rely on persistence to allow for external interaction and recovery. The PSPersist parameter on a workflow can be used by the end user to ensure recovery. As the creator, you have the ability to control the persistence of each activity with the PSPersist parameter, or by calling Checkpoint-Workflow or Suspend-Workflow at any point to force a checkpoint.

Using the PSPersist parameter is the most common way for a user to interact with the persistence process. When calling a workflow, you can specify -PSPersist like this:
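
# The original snippet is not reproduced; a minimal call (workflow name is illustrative) would be:
MyWorkflow -PSPersist $true -AsJob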

When using the PSPersist parameter, you have two options: $True or $False. Used when invoking a workflow, this is all or nothing. I cannot see why anyone would call this with $False, because that would be equal to the default action, which does not add any extra checkpoints into the workflow other than the ones already added by the creator. When you specify $True, the workflow will create a checkpoint to disk after every activity. When doing this, consider the disk resources needed to create the checkpoints. When resuming a workflow, you do not have the option to pick which checkpoint you want to use; it will resume from the last checkpoint taken. Depending on what you are doing in your workflow, the disk space used in the process can become considerable because all variables are stored.
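
The workflow referenced in the next paragraph was shown as an image in the original post; a sketch of it might look like this:

workflow Get-AllFileNames {
    # The full recursive listing ends up in $FileNames, so every checkpoint
    # taken after this activity has to persist all of it to disk.
    $FileNames = Get-ChildItem -Path C:\ -Recurse
    $FileNames.Count
}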

When you invoke this workflow with the -PSPersist $True parameter, you will be persisting the entire file and folder structure to disk after Get-ChildItem is called. Though it looks like you are just calling a simple Get-ChildItem, this cmdlet is actually an activity execution whose results are stored in $FileNames. If this is something you want to do, just make sure that there is ample storage space for the checkpoint.

If you are creating workflows yourself, the PSPersist parameter can also be used on each activity, with the same $True and $False options. Again, if you use it and specify $False, you are not going to take a checkpoint when the activity completes. Using $True forces a checkpoint to be taken when the activity is done. If a Checkpoint-Workflow activity comes just after the activity, a checkpoint is taken either way, but it can be taken twice if you also use PSPersist $True.
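
A quick sketch of the activity-level parameter (the workflow and paths are made up):

workflow Copy-BuildOutput {
    # Checkpoint as soon as the copy finishes, regardless of how the workflow was invoked.
    Copy-Item -Path 'C:\Builds\Latest' -Destination 'D:\Archive\Latest' -Recurse -PSPersist $true
    Test-Path 'D:\Archive\Latest'
}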

The reasons you would want to use workflow persistence are recovery and suspension. Up to this point I have mainly spoken about the ability to control checkpoints. If you properly take checkpoints and something goes very wrong, like a network or power outage, you can call Resume-Job on the job and the workflow will continue from where it last took a checkpoint. Suspend-Workflow is more interesting. Calling Suspend-Workflow will execute a checkpoint and stop the workflow. This is useful if you need some external intervention, like approvals. A good example would be a process that requires a lot of validation before a series of actions is taken. When you call the workflow and specify all of your parameters, you can have the workflow call Suspend-Workflow if it finds something that does not match up in the logic somewhere. This allows you to ensure that the correct information was typed in and decide whether you really want to carry through or just remove the job altogether.
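
A minimal sketch of that approval pattern (the workflow name, parameter, and validation rule are all invented for the example):

workflow Invoke-BigChange {
    param([string]$TargetOU)
    if ($TargetOU -notlike '*OU=Servers*') {
        # Something looks off - checkpoint and wait for a human decision.
        Suspend-Workflow
    }
    # ...the series of actions would continue here after a resume...
}

$job = Invoke-BigChange -TargetOU 'OU=Workstations,DC=contoso,DC=com' -AsJob
# After reviewing, either continue or throw the work away:
Resume-Job -Id $job.Id
# Remove-Job -Id $job.Id -Force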

A workflow checkpoint is stored in the user’s profile.

[Image: Checkpoint to Disk]

Feel free to reboot or do whatever you need to at this point; the workflow will remain here until you decide to resume or remove it. On my test computer, the only snag was that I needed the PSWorkflow module loaded to properly see the suspended workflow. Once a workflow is suspended, or you are recovering from a failure to a checkpoint, the workflow is a PSWorkflowJob.
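
In practice that just means something like this before looking for the job:

Import-Module PSWorkflow
Get-Job | Where-Object { $_.PSJobTypeName -eq 'PSWorkflowJob' }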

[Image: Workflow Get-Job output]

Thanks for reading!

Posted in PowerShell | Tagged , , | Comments Off

Upgrading to Windows Server 2012 GUI mode issue

Much like myself, I assume many of you are going to upgrade many of your Windows 2008 and Windows 2008 R2 systems to Windows Server 2012. There is a known issue after the upgrade that may impact you if you start to remove and add back the GUI features, noted in this URL.
If you upgrade from a Full installation of Windows Server 2008 or Windows Server 2008 R2 to Windows Server 2012 in Server with a GUI mode, and then switch Windows Server 2012 to Server Core mode, conversion back to Server with a GUI mode will fail.
To avoid this, delete these registry keys with the following commands:
reg delete HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\WINEVT\Publishers\{bc2eeeec-b77a-4a52-b6a4-dffb1b1370cb}
reg delete HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\WINEVT\Publishers\{57e0b31d-de8c-4181-bcd1-f70e880b49fc}
reg delete HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\WINEVT\Publishers\{8c9dd1ad-e6e5-4b07-b455-684a9d879900}
After running these commands, restart the upgrade.

Posted in Uncategorized | Comments Off

Cleaning your Windows Server 2012 Server of Source Files

If you install Windows Server 2012 (with a GUI) and then remove the GUI features after you have configured your server, most of the source install files are still on your system. To remove the unneeded source files, use the following command:
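
# The original command was shown as an image; a command along these lines strips
# the payload for every feature that is not installed:
Get-WindowsFeature | Where-Object { $_.InstallState -eq 'Available' } | Uninstall-WindowsFeature -Remove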

After this point, when installing new roles you will need to specify a source or allow the installer to get the files from Windows Update.

Posted in PowerShell | Comments Off