Monday, December 23, 2013

Microsoft Virtual Academy: PowerShell M05

Today I continue forward on my goal of completing Microsoft's Virtual Academy's PowerShell 3.0 training.
(http://www.microsoftvirtualacademy.com/training-courses/getting-started-with-powershell-3-0-jump-start?o=3276#?fbid=aVs9FfAH2DJ)
(rewritten based on my notes available here: https://drive.google.com/file/d/0B1fwreWrAZioUWpaeFNSTFVXUEU/edit?usp=sharing)

Module 5 "The Pipeline: Deeper" consists of 3 video segments totaling ≈ 45 minutes along with a PowerPoint of 12 slides.

The focus of this segment is on the pipeline ( | )

As briefly mentioned in past segments, the pipeline allows us to pass objects along a chain of cmdlets. There are four (4) ways to use the pipeline:

  1. ByValue
  2. ByPropertyName
  3. Customized properties
  4. Parenthetical
First and most common is ByValue. To use ByValue we need to know two things: the type of object we are sending, and whether the receiving cmdlet can accept it.  To determine the type of our object we use get-member (gm).  Then, to validate the receiving cmdlet, we run get-help [cmdlet].  There are two (2) things we are looking for in the help file: first, whether any parameters accept pipeline input, and second, whether any parameters accept our object type.

Example: Get-Service | Stop-Service
  1. Determine object type: 
    • Get-Service | gm
    • Results: TypeName: System.ServiceProcess.ServiceController
  2. Verify receive cmdlet will accept
    • Get-Help stop-service -full
      • If using v3 use -showWindow and utilize the search feature
    • Verify accept pipeline input: true (.., ByValue,..)
    • Verify input value is sending object type
      • -InputObject <ServiceController[]>
Hint: if the nouns match, it will most likely work.
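Pulling the verification steps above together, a minimal ByValue sketch (using -WhatIf so no services are actually stopped):

```powershell
# Step 1: confirm the type of object Get-Service emits
# (the first line of Get-Member output shows
#  TypeName: System.ServiceProcess.ServiceController)
Get-Service | Get-Member

# Step 2: Stop-Service's -InputObject accepts ServiceController[] ByValue,
# so the pipe binds; -WhatIf previews the action without executing it
Get-Service -Name bits | Stop-Service -WhatIf
```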


Next we move to ByPropertyName.  It works very similarly to ByValue but is more forgiving, and we follow the same verification process.  ByPropertyName works on the concept of a property of the sending object matching a parameter name in the receiving cmdlet.


Example: Get-Service | Stop-Process
  1. Determine object type: 
    • Get-Service | gm
    • Results: TypeName: System.ServiceProcess.ServiceController
  2. Verify receive cmdlet will accept
    • Get-Help stop-process -full
    • Verify accept pipeline input: true (.., ByValue,..)
    • Verify input value is sending object type
      • -InputObject <Process[]>
        • Not the same type so now we look for ByPropertyName
      • -Name <String[]>
        • Will accept input from sending object's property name and process based on that
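The binding above can be sketched as follows; -WhatIf keeps it harmless, and note the point is the property-to-parameter binding, not the result (there is normally no process actually named "bits"):

```powershell
# Get-Service objects carry a Name property; Stop-Process has a -Name
# parameter that accepts pipeline input ByPropertyName, so PowerShell
# binds the service's Name to Stop-Process -Name
Get-Service -Name bits | Stop-Process -WhatIf
```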
So what if ByValue and ByPropertyName don't work?  We can create a custom property.  This is accomplished by creating a custom column (a calculated property).  With a custom property our goal is to satisfy the ByPropertyName process.  In the example below (and in the segment) we try to send an ADComputer to Get-Service.

Command used:  @{name='[propertyname]';expression={$_.[property]}}

Example: Get-ADComputer -filter * | Get-Service -name bits

  1. Determine object type: 
    • Get-ADComputer -Filter * | gm
    • Results: TypeName: Microsoft.ActiveDirectory.Management.ADComputer
  2. Verify receive cmdlet will accept
    • Get-Help Get-Service -full
      • ByValue: -InputObject <ServiceController[]> - No Go
      • ByPropertyName: -ComputerName (for systems), -Name (for services)
        • We want to use the -ComputerName and not -Name, this is where custom property comes in
  3. Map ADComputer -Name to -ComputerName
    • @{name='ComputerName';expression={$_.name}}
  4. Verify Object Property
    • Get-ADComputer -Filter * | Select @{name='ComputerName';expression={$_.name}} | gm
      • You should now see a property called ComputerName
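Putting steps 1-4 together, the full pipeline from the segment looks roughly like this (it assumes the ActiveDirectory module and remote service access are available):

```powershell
# The calculated property renames ADComputer's Name to ComputerName,
# which then binds ByPropertyName to Get-Service's -ComputerName parameter
Get-ADComputer -Filter * |
    Select-Object @{name='ComputerName';expression={$_.Name}} |
    Get-Service -Name bits
```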

Finally, we use the parenthetical method when all else has failed us.  Commands process what is in parentheses first.

For this example we want to pull all BIOS details for all ADComputers.

Example: Get-ADComputer -filter * | Get-WMIObject -class win32_bios

  1. Determine object type: 
    • Get-ADComputer | gm
    • Results: TypeName: Microsoft.ActiveDirectory.Management.ADComputer
  2. Verify receive cmdlet will accept
    • Get-Help Get-WMIObject -full
      • Notice there is no pipeline support
As we see, Get-WMIObject does not support pipeline input.  To work around this we use -ExpandProperty, which expands a single property into plain string objects. There are two ways to accomplish this depending on the version you are running.

Command used: Get-adcomputer -filter * | select -ExpandProperty name

If we run this through gm we find the object type is now a string.

  • v2 and v3
    • Get-wmiobject -class win32_bios -ComputerName (Get-adcomputer -filter * | select -ExpandProperty name)
  • v3 only
    • Get-wmiobject -class win32_bios -ComputerName (Get-adcomputer -filter *).name
That wraps up segment 5.  I hope to have segment 6 up next week, but depending on the holidays and work it may not be up until mid-January.

I hope everyone has Happy Holidays.

Friday, December 20, 2013

Holiday IT Poem

'Twas the week before Christmas
And all through SpiceWorks
There were plenty of tickets
Put in from dumb jerks.
Their computers weren't working
Or so they all said
But the IT folks knew
Users are brain-dead
Still every ticket was logged
Every problem looked at
Encountering every user
And pictures of their cats
There was a printer jam
And one with no ink
One had no paper
And someone clogged the sink
That shouldn't be IT
That we all know
But the ticket still came in
So a plumbing I will go
As the weekend draws near
I just impatiently wait
It's almost Saturday
And won't that be great!
My presents are bought
My shopping is done
Everything's even wrapped
So now I just get to have fun
A weekend of Xbox
Playstation and Wii
So now I might be at work
But soon I'll be free!
And then next week comes
And everyone will be merry
And they'll all call off work
So my tickets won't be scary
Plus a day off on Wednesday
Oh what a delight
So Merry Christmas to all!
Especially SpiceRex

SharePoint 2010: Content Query and Announcement's Body

So recently I was asked to make our homepage announcements more informational.  As you can see from below, there's not much detail.
Currently the items above are pulled from each department's site using the Content Query Web Part and SharePoint 2010: Aggregating Announcements (http://cleeit.blogspot.com/2013/08/sharepoint-2010-aggregating-annoucements.html).

So the first thing I tried was just displaying the Body of the announcement.  Well, as you can see below, that didn't work.

Content Query does not understand HTML.  So how do you escape HTML in Content Query?  Well, off to Google I went. It did not take long to locate the Kappa Solutions Technology Blog (at the time rated number 3).

The exact article can be found at:  http://blog.kappasolutions.ca/blog/post/2010/09/12/How-to-Display-HTML-in-Content-Query-Web-Part.aspx

Spiceworks How-To write up: http://community.spiceworks.com/how_to/show/62085-sharepoint-2010-content-query-and-announcement-s-body

As their write-up is quite nice I will only link to them for now.

End Results

Monday, December 2, 2013

Microsoft Virtual Academy: PowerShell M04

Today I continue forward on my goal of completing Microsoft's Virtual Academy's PowerShell 3.0 training.
(http://www.microsoftvirtualacademy.com/training-courses/getting-started-with-powershell-3-0-jump-start?o=3276#?fbid=aVs9FfAH2DJ)
(rewritten based on my notes available here:
https://drive.google.com/file/d/0B1fwreWrAZioVFozYmprdHJGT3c/edit?usp=sharing)

Module 4 "Objects for the Admin" consists of 3 video segments totaling ≈ 50 minutes along with a PowerPoint of 13 slides.

The focus of this module is objects (just as the title states, huh, weird).

First off we learn what an object is and how it makes our lives easier in PowerShell.

  • Object
    • has properties
      • things we can view about the object
    • has methods
      • things we can do to the object
Now that we know what an object is, how can we see what properties, methods, etc. it has?   Well, this is where they give us a powerful cmdlet: get-member (gm).  By using the pipeline to send an object through Get-Member, PowerShell spits out data providing all the details.  Some key areas to pay attention to are:
  • TypeName
    • Tells us what type of object we are viewing; important in later modules
  • MemberType
    • Displays all the properties, methods, etc
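A quick sketch of the discovery workflow with Get-Member:

```powershell
# Pipe any cmdlet's output to Get-Member to inspect it;
# the TypeName line identifies the object type, and the table below it
# lists Properties (things to view) and Methods (things to do)
Get-Service | Get-Member

# Narrow the listing to just the methods
Get-Service | Get-Member -MemberType Method
```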
An interesting segment of the module was importing a third-party XML file and showing how easily PowerShell can manipulate and pull data from it.
  • Demo of Romeo and Juliet XML play, truly a powerful example of PowerShell
Introduction to filtering and limiting within PowerShell with the use of Select and Where.
  • Select-Object (Select)
    • By piping a cmdlet to Select you can limit what is displayed
      • get-process | Select -Property name, ID
  • Where-Object (where)
    • Two versions
      • The most powerful and versatile uses a filterscript ({ })
        • assigns the current object to a variable ($_ or $PSItem)
        • evaluates the code (comparison operators)
          • For more details on comparison operators, run get-help about_operators
        • acts on the results
          • true: passed forward
          • false: thrown away
      • The simpler version is just where
        • where property operator value
        • where name -like wmi*
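The two Where-Object styles side by side (filtering is read-only, so this is safe to run):

```powershell
# Filterscript style (v2 and v3): $_ (or $PSItem in v3) is the current
# object; objects where the block evaluates to $true pass through
Get-Service | Where-Object { $_.Name -like 'wmi*' }

# Simplified syntax (v3): property operator value, no braces or $_
Get-Service | Where Name -like 'wmi*'
```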

During their responses to Q&A we learn that PowerShell v3 is, for the most part, case insensitive; the exceptions are usually third-party modules.


References mentioned in this module:

  • PowerShell.org
    • Forums, FREE eBOOKs, articles, podcasts, script repository
  • Learn Windows PowerShell 3 in a Month of Lunches
    • Don Jones
    • Chapter 9


Monday, November 25, 2013

Microsoft Virtual Academy: PowerShell M03

Today I continue forward on my goal of completing Microsoft's Virtual Academy's PowerShell 3.0 training.
(http://www.microsoftvirtualacademy.com/training-courses/getting-started-with-powershell-3-0-jump-start?o=3276#?fbid=aVs9FfAH2DJ)
(rewritten based on my notes available here:
https://drive.google.com/file/d/0B1fwreWrAZioQ0twZXB3b0J2eXM/edit?usp=sharing)

Module 3 "The Pipeline" consists of 3 video segments totaling ≈ 30 minutes along with two PowerPoints of 12 and 10 slides.

This module was short and sweet, teasing you with the power of PowerShell while also showing you how to protect yourself from destroying your OS.

To start off the module we learn what the pipeline (|) is, which, if you are wondering, is the key above Enter on the keyboard (Shift + \).   Now what is the benefit of the pipeline in PowerShell?  Well, it allows the connecting of cmdlets to accomplish a larger task. In its simplest form it passes the results of one cmdlet to another to be processed.

  • get-service -name bits | stop-service   is the same as    stop-service -name bits
Now this is the simplest form and it may be hard to see the benefit in the above case, but it is just for reference. As you may notice with the above example, couldn't I just send all services to Stop-Service?  Yes you could, though you would in essence be crippling/bricking your system, as PowerShell by default acts on all commands without questioning you.  This is where the last part covered is important.  PowerShell provides ways to safeguard yourself with -whatif and -confirm.
  • -whatif can be added to almost every cmdlet statement to see output results of the cmdlet without actually executing it
  • -confirm will prompt you if you wish to execute (take note of Yes, Yes to All, No, No to All)
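A quick illustration of both safeguards:

```powershell
# -WhatIf describes what the cmdlet would do without doing it
Stop-Service -Name bits -WhatIf

# -Confirm prompts before acting (Yes / Yes to All / No / No to All)
Stop-Service -Name bits -Confirm
```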
One of the huge benefits brought to light in this module is the auto-loading of modules, which was not possible pre-v3.  In addition to auto-loading modules for you, PowerShell v3 also has a complete understanding of each module's help (the requirement is that the module must be installed on the system).  Pre-v3 you had to mount a snap-in before you could use the cmdlets or even have help understand them.  This is a huge time saver and will reduce frustration.

Lastly they touched on features of exporting, importing and some comparing. They didn't go into too much detail as they will be covering this later in the series.
  • Export to many popular formats
    • csv, xml
  • Import files back in for processing
  • Ability to compare file to running system
    • In the module they demonstrated making a known good xml file of system process and then comparing to another system
    • Great way to see what has changed from baseline of systems
    • I will be creating baselines for all system types when I re-image next time for future troubleshooting 
      • Get-process | export-clixml -path C:\good.xml
        • creates the xml
      •  Compare-Object -ReferenceObject (Import-Clixml C:\good.xml) -DifferenceObject (Get-Process) -Property name
        • compares the baseline created before to current running process names
Until next week.

Friday, November 22, 2013

Microsoft Virtual Academy: PowerShell M02

Today I continue forward on my goal of completing Microsoft's Virtual Academy's PowerShell 3.0 training.
(http://www.microsoftvirtualacademy.com/training-courses/getting-started-with-powershell-3-0-jump-start?o=3276#?fbid=aVs9FfAH2DJ)
(rewritten based on my notes available here:
https://drive.google.com/file/d/0B1fwreWrAZioNW5OZzFLTmFHTzA/edit?usp=sharing)

Module 2 "The Help System" consists of 3 video segments totaling ≈ 50 minutes along with a PowerPoint of 10 slides.

The first 13 minutes are more of an overview of help in PowerShell v3 and older versions.  During these 13 minutes they cover how to update help, briefly explain the difference between get-help, help, and man, and make their biggest point: learn to discover.

One of the biggest improvements in PowerShell v3 is the ability to update help outside of product releases.  By using the cmdlet Update-Help, PowerShell v3 will download the current help files from the internet.  If your system is not internet connected you can use Save-Help from an internet-connected system to save the help updates for offline updating (http://technet.microsoft.com/en-us/library/hh849724.aspx). Now, v2 doesn't support this feature but does have a way for you to ensure you view the most recent help information.  Simply add the parameter -online to your PowerShell v2 get-help statement:

  • get-help get-service -online
    • will open a browser window to the most current help for that cmdlet
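The update and offline workflows above, sketched (the UNC share path is a hypothetical example):

```powershell
# v3, internet-connected: download the latest help files
Update-Help

# v3, disconnected systems: save help from a connected machine,
# then point the offline machine at that copy
Save-Help -DestinationPath \\server\share\help
Update-Help -SourcePath \\server\share\help

# v2 (and v3): open the newest help in a browser instead
Get-Help Get-Service -Online
```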


Next, they explain the difference between get-help, help and man (well help and man are the same just aliases).

  • Get-help
    • partial help file displayed
    • focus is at the end of the file, requiring you to scroll up to see everything
  • help / man
    • displays full help file
    • allows paging through the file one screen at a time
Lastly, the biggest point to take away from Module 2 is to learn to discover, not memorize.  By learning how to use the help cmdlet you set yourself up to discover functions easier and quicker than trying to remember the thousands of cmdlets. There are over 96 verbs alone (get-verb | measure).

The remaining time is spent going over PowerShell syntax, including wildcards, how to navigate the CLI, reading get-help output, and more.  One of the features they don't mention until near the end is tab completion.  I feel this should be mentioned towards the beginning as it is a huge time saver.  As you type cmdlets you can press Tab to cycle forward through matching cmdlets or Shift-Tab to cycle backwards. In addition to tab completion they mention using the semicolon (;) as a cmdlet separator, allowing more than one cmdlet statement to be written at a time.  By now you may have already noticed some of the ways to navigate the CLI, such as moving left and right with the arrows, but did you know you can recall past cmdlets by pressing Up and cancel the current cmdlet by pressing Esc?

Now one of PowerShell's greatest features is the use of wildcards (*).  Basically, the asterisk (*) matches anything before or after its location.

  • get-help *service*
    • returns all cmdlets with service anywhere in them
  • get-help g*service
    • returns any cmdlets that start with g and end with service
To further break down the help system they explain the segments/sections of get-help: Name, Synopsis, Syntax, Description, Related Links, and Remarks.  I will let you check out the video for their descriptions of these.  Next they covered some of the parameters that can be used with get-help:

  • -detailed
    • more detail than plain get-help
    • includes listing of cmdlet parameters and examples
  • -examples
    • displays just the cmdlets examples
  • -full
    • you guessed it this displays it all
  • -showwindow
    • new parameter in v3
    • opens a window displaying the cmdlet help information
    • can be configured to display only certain sections via the settings
    • great for copying and pasting syntax
    • offers a find/search function for the currently displayed help file

Syntax explained?!?!
By now you have seen a few cmdlets using the get-help/help cmdlet.  What does all the information after the cmdlet mean/do?  Well, they are called parameters, and they allow the refining/altering of the cmdlet.   Simple enough, right?  Well, there is a certain format/order in which these parameters need to be applied.

  • First there is [ ], which has two meanings depending on its placement
    1. At its top-most level it means an optional parameter.  This can be seen in most cmdlets, as most parameters are enclosed in [ ].  But on further review you will find some cmdlets that offer optional content within a parameter
      • [[-name] <string[]>]
        • This whole parameter is optional, but if you choose to use it you can do so without first calling -name, instead providing the string value directly
    2. Deeper is the defining of multiple values separated by commas
      • [[-name] <string[]>]
        • for the string value we could define: bob, rob, *ob
          • (Yes you can include wildcards)
  • Next there is the use of < >.  As you may have already noticed, this denotes the format of the arguments/values the parameter expects
    • string = alphanumeric
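Reading a real syntax line in practice, using [[-Name] <string[]>] from Get-Service:

```powershell
# All equivalent: -Name is optional (outer brackets) and positional
# (inner brackets), and <string[]> means multiple comma-separated
# values are accepted, wildcards included
Get-Service -Name bits
Get-Service bits
Get-Service bits, spooler, wmi*
```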
That concludes module 2. If you stop here you will have already placed yourself ahead of many other users.  Just remember: learn to discover.

Thursday, October 31, 2013

Tuesday, October 29, 2013

Spiceworld

Unable to attend Spiceworld 2013? Check out the uStream feed:

  • http://www.ustream.tv/channel/spiceworld-2013

or follow the directions to watch via VLC:


  • Install Livestreamer (https://github.com/chrippa/livestreamer/releases/tag/v1.6.1)
  • Open CMD, browse to the Livestreamer folder, and execute: livestreamer.exe http://www.ustream.tv/channel/spiceworld-2013 best
  • The stream should then open in VLC
Check out the Agenda:

Monday, October 28, 2013

Bamboo SharePoint Solutions

****Warning****

Something I just learned after I tried to remove a Bamboo SP Calendar solution from our farm: it changes web.config to reference a new/updated Telerik component/extension.

The removal of their solution cripples your SharePoint, rendering all sites outside of Central Admin unreachable.

The solution is to re-install their core WSP from the trial solution.

If you wish to remove it completely without crippling your SP, check out their write-ups:

http://community.bamboosolutions.com/blogs/sharepoint-2010/archive/2012/08/23/how-to-remove-a-web-config-modification-using-powershell.aspx

http://store.bamboosolutions.com/KB/article.aspx?id=12486&query=Telerik.Web.UI

New Vulnerability Source

We recently had a visit from the FBI to discuss our system security as we are becoming more of an international player.

One of the sources they provided me is US-CERT.  There are several newsletters you can register for.

I received my first issue of US-CERT Cyber Security Bulletin (https://www.us-cert.gov/ncas/bulletins/SB13-301) today.

A quick glance down the list showed some surprising issues I had to verify on our network.

WatchGuard and VMware: Glad to report I was already patched and no issues there.

Highly recommend signing up for at least this newsletter.

Microsoft Virtual Academy: PowerShell M01

Looking to learn more about PowerShell, I stumbled across a link on SpiceWorks that led me to Microsoft Virtual Academy. http://www.microsoftvirtualacademy.com/training-courses/getting-started-with-powershell-3-0-jump-start?o=3276#?fbid=aVs9FfAH2DJ

I have decided to complete and write a review of each module over the next 9 wks (1 module a week).

Today starts out Module 1.

  • Contains two videos:
    1. Intro - Don't fear the shell (6:41)
    2. Getting familiar with the shell (53:00)
  • Presentations
    1. 01-Dont fear the shell 1.pptx (15 slides)





Starting off we get a brief introduction from Jeffrey Snover, the inventor of PowerShell, and Jason Helmick, Senior Technologist at Concentrated Technology.

Then we start going over understanding and becoming comfortable with the shell.

They discuss the history of PowerShell and warn against running out to install 3.0 without checking the release notes first. There are systems that do not support v3 (SharePoint 2010 is one of them).

The first introduction to PowerShell is how to tell if you are in Administrator mode or not.  In the program's title bar you should see "Administrator: Windows PowerShell" and not just "Windows PowerShell". If you are not in Administrator mode you will have limited access and rights.

Second is a quick lesson on customizing the window to work better for coding.

  • Accessing preferences
    • Right click the title bar to get drop down, select preferences
  • Changing the font to make it easier to differentiate ` vs '
    • recommend Lucida Console
    • set to bold if presenting
  • Setup window size
    • Layout tab
      • Adjust the window to fit on the desktop (black screen)
        • ensure the blue window does not fall off the screen
      • Ensure Screen buffer and Window size width match
      • Ensure Screen buffer Height is 3000+
        • allows you to see more past commands
        • many PowerShell consoles default to 300
  • Colors
    • adjust to desired scheme
Third, they go over basic commands:
  • cmdlets: Verb - Noun
    • set-location
      • change directory
    • Clear-host
      • clear screen
    • get-childitem
      • list out directory
  • Native commands work
    • ping
    • ipconfig
    • calc
    • notepad
  • Aliases (DOS/Unix)
    • dir / ls
      • actually runs get-childitem
    • cd
      • actually runs set-location
    • cls
      • actually runs clear-host
    • get-alias / gal
      • list of aliases
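Resolving aliases back to their real cmdlets is a good discovery habit:

```powershell
# What does a single alias actually run?
Get-Alias cd          # Definition: Set-Location

# Going the other way: list every alias for a cmdlet
Get-Alias -Definition Get-ChildItem
```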
Fourth, they discussed and demonstrated help/alias searching:
  • gal g*
    • All aliases starting with G
  • gal *sv
    • all aliases ending in sv


Friday, October 25, 2013

Microsoft Virtual Academy: PowerShell

Looking to learn more about PowerShell, I stumbled across a link on SpiceWorks that led me to Microsoft Virtual Academy. http://www.microsoftvirtualacademy.com/training-courses/getting-started-with-powershell-3-0-jump-start?o=3276#?fbid=aVs9FfAH2DJ

I have decided to complete and write a review of each module over the next 9 wks (1 module a week).

Starting Monday with Module 01 check back or head on over to MS site to sign up and complete your own.

Tuesday, October 22, 2013

SharePoint 2010: Setup Document Library to accept emails

To better utilize our SharePoint setup and reduce strain on end users, I needed to set up processing of incoming emails for a couple of document libraries (Quotes and Shipping Labels).  Below are the steps I used to complete the setup.

Check out my How-to with screen shots on SpiceWorks at:
http://community.spiceworks.com/how_to/show/54771-sharepoint-2010-setup-document-library-to-accept-emails


Ensure your SharePoint Server has SMTP feature installed and configured.



First we need to configure some farm level features:

  1. Access Central Administration > System Settings
  2. Configure Incoming email settings
  3. Enable Incoming E-Mail
    1. Enable: Yes
    2. Mode: Automatic
      • Automatic for email
      • Advanced for drop folder (need to validate permission's on folder)
  4. Directory Management Service
    1. Directory Management Service: Yes
      • No - will have to manually create any needed groups and contacts
      • Yes - SharePoint will create groups and contacts as needed
      • Remote - A separate server configured to manage the creation of groups and contacts
    2. Container: Enter container info
      • We set up a separate OU for all our SharePoint accounts and groups during setup, so we just added an OU within it called Contacts
    3. SMTP mail server: SharePoint Server with SMTP service running
    4. Accept messages from authenticated users only
      • Yes for internal use only
      • No to allow outside emails to be processed
    5. Distribution group creation
      • Yes: allow creation of distribution groups
      • No: deny creation of distribution groups
    6. Distribution group settings - limit what SharePoint is allowed to do if groups can be created
      • Create new distribution group
      • Change distribution group e-mail address 
      • Change distribution group title and description
      • Delete distribution group
  5. Incoming E-Mail Server Display Address
    • Set to something user friendly / your domain
  6. Safe E-Mail Servers
    • You can limit what email servers to process email from or leave it open to all.


Next we need to setup the feature on desired Document Library.

  1. Navigate to desired Document Library you wish to add feature to.
  2. On the ribbon click "Library"
  3. Click "Library Settings"
  4. Click "Incoming e-mail settings" under Communications
  5. Complete the following:
    1. Incoming E-Mail
      1. Allow: Yes
      2. E-mail address: [Enter email address to use]
    2. E-Mail Attachments
      1. Group: Select process that works for you
      2. Overwrite: Decide if overwrite is allowed or not
    3. E-Mail Message
      1. Decide if the email message needs to be saved also
    4. E-Mail Meeting Invitations
      1. Decide if the email message needs to be saved also
    5. E-mail Security
      1. Decide how you want to control who can add to library
        1. If allowing outside emails in you may see SPAM


Tuesday, October 15, 2013

SharePoint 2010 People Search Filtering

Working on our internal phone book in SharePoint using the Enterprise Search: People Search feature, I had to ensure none of our service accounts or other non-company users were being displayed.



First I setup Connection filters.

  • Central Administration > Application Management > Service Applications > 
  • Manage Service Applications > User Profile Service > Synchronization > 
  • Configure Synchronization Connections > [Your AD Connection] > Edit Connection Filters


Used the Any apply (OR) for the following rules:

Attribute             Operator         Filter
sAMAccountName        Contains         Calendar
sAMAccountName        Contains         Privileged
sAMAccountName        Contains         _
userAccountControl    Bit on equals    17

This takes care of the accounts we have for department calendars [deptcalendar], any elevated management accounts [username Privileged], service accounts [System_Service] and any account that has a non-expiring password.

Now even with all of this I was still seeing a service account sneaking in and creating its own profile, and for the life of me I could not stop it.  I don't care if it has a MySite; I just don't want it to show up in our phone book.

After about a day and a half of searching I finally came across the process to resolve this issue.  I needed to set up an exclusion in the People Search Core Results web part under Results Query Options > Append Text To Query:
-preferredname:"[enter format here]". In our case I set up -preferredname:"SP_*".  Ensure you do this on any pages that contain the People Search Core Results web part; in our case we have two pages: People-Directory and peopleresults.




Friday, October 11, 2013

SharePoint 2010: Adding back deleted User Profile Property

So recently I was playing with SharePoint User Profiles and noticed we were getting duplicate Office values.  This was being caused by our sync with AD: SP would pull the AD Office field while SP also has an Office Location property (which for us was the same).  I attempted to reuse the SP Office Location but could not make it accept the values I wanted, so I just deleted it.  Oops, now profile pages wouldn't load.  Did some searching and found this article to be the most helpful.


Adding back deleted User Profile Property
http://social.technet.microsoft.com/Forums/sharepoint/en-US/47f6e479-fad0-456f-ba93-f24c6f1212f0/adding-back-deleted-user-profile-property

Most specifically, I found Modulacht's response to be the best:

------------------Begin Quote------------------

Create a new Property with name like "SPSLocation" (leave the '-' after 'SPS'). In this way the propertyname will be accepted. Now just start your SQL-Server Management Studio, select and edit the appropriate record in you profile database.

In my case I used the following SQL: Using this SQL to select the appropriate record (Ensure that you only get 1 result-line!!!):

SELECT *
FROM PropertyList
WHERE PropertyName='SPSLocation'

Using this SQL to edit the appropriate record:

UPDATE PropertyList
SET PropertyName='SPS-Location'
WHERE PropertyName='SPSLocation'

------------------End Quote------------------

Followed above steps and within 10 minutes had my Profiles back working.

Tuesday, October 8, 2013

Windows Scheduled Tasks Service Account Report (PowerShell v2)

Whipped this one together to verify none of our scheduled tasks are using our domain administrator account prior to a password change.

First step is setting up the source file.
On my servers I have the following structure: %root%\_scripts\source files

  • Create a text file in your source location, ensure it is a txt file
  • one server name per line
Copy the code below into a .ps1 file, or you can download it from my Google Drive here.
Update the bracketed placeholder fields (e.g. [PATHTOYOURSERVERLISTTEXTDOCUMENT]) to meet your requirements.

Hope this is helpful.
(If you find it helpful please head over to Spiceworks and Spice up the code to help other IT members find it: http://community.spiceworks.com/scripts/show/2213-scheduled-tasks-service-accounts)

# +-----------------------------------------------------------------------------------
# | File : Scheduled Tasks Service Accounts.ps1                                        
# | Version : 1.01                                        
# | Purpose : Pulls Scheduled Tasks from list of servers
# |           Saves to individual CSV files
# |           Can Email reports
# |           Can remove reports before script exits
# | Based on: Ryan Schlagel's Scripts
# |           http://ryanschlagel.wordpress.com/2012/07/09/managing-scheduled-tasks-with-powershell/
# +-----------------------------------------------------------------------------------
# | Maintenance History                                          
# | -------------------                                          
# | Name            Date        Version  C/R  Description      
# | ----------------------------------------------------------------------------------
# | Chris Lee     2013-10-08     1.01         Initial script build
# +-----------------------------------------------------------------------------------


###SETUP START###
#-------DO NOT MODIFY-------#
#Add Exchange 2007 commandlets (if not added)
if(!(Get-PSSnapin | Where-Object {$_.name -eq "Microsoft.Exchange.Management.PowerShell.Admin"})) {ADD-PSSnapin Microsoft.Exchange.Management.PowerShell.Admin}
#Add Quest commandlets (if not added)
if(!(Get-PSSnapin | Where-Object {$_.name -eq "Quest.Activeroles.ADManagement"})) {ADD-PSSnapin Quest.Activeroles.ADManagement}
#Defines Time/Date Stamp used
$CreateStamp = Get-Date -UFormat %d_%m_%Y
###SETUP END###

###USER VARIABLES START###
#-------MODIFY AS NEEDED-------#
#Define path to server list to be used
    $path = "[PATHTOYOURSERVERLISTTEXTDOCUMENT]"
#Define path to temp/report folder include trailing \
    $temppath = "[PATHTOYOURTEMPFOLDER]"
#Email reports to admin "True" | "False"
    $email = "True"
#Delete files after script runs "True" | "False"
    $delete = "True"
#Enter System Admin
    $AdminName="[YOURADMINNAME]"
#Enter Admin Email Address
    $to="[YOURADMIN]@[YOURDOMAIN].com"
# SMTP Server to be used
    $smtp = "[YOURSMTP]"
# "From" address of the email
    $from = "ServerReports@[YOURDOMAIN].com"
#Enter Path to reports
    $file="C:\_temp\"
# Define font and font size
# ` is the escape character in PowerShell
    $font = "<font size=`"3`" face=`"Calibri`">"
###USER VARIABLES END###

###PROGRAM VARIABLES START###
#-------DO NOT MODIFY-------#
# Get today's day, date and time
$today = (Get-date)
# Newline character
$newline = "<br>"
#Pull Domain information for email
$Domain = ([adsi]'').distinguishedname -replace "DC=","" -replace ",","."
#Enter Subject line required for ticketing system ($Domain must be defined first)
$subject = "Service Accounts Report for " + $Domain + " servers."
#Section break
$secbreak = "`r`n---------------------------------------------------------------------------------------------------------------------------`r`n"
###PROGRAM VARIABLES END###

###PROGRAM START###
#Loads Server list into variable
$a = Get-Content $path
#Declare string for report structure
$attachment = @()
#Generates CSV file with scheduled service account information for each server
foreach ($i in $a)
    {
      $schedule = new-object -com("Schedule.Service")
      $schedule.connect("$i")
      $tasks = $schedule.getfolder("\").gettasks(0)
      # Use Select-Object (not Format-Table) so Export-Csv receives real objects, not formatting data
      $tasks | Select-Object Name, @{Name="RunAs";Expression={[xml]$xml = $_.xml ; $xml.Task.Principals.Principal.UserID}}, LastRunTime, NextRunTime | Export-Csv "$temppath$i.csv" -NoTypeInformation
      IF ($tasks.count -eq 0) {Write-Host "Schedule is Empty"}
      $attachment += "$temppath$i.csv"
    }

#Check if email of files is desired
IF ($email -eq "True")
    {
        # Message body is in HTML font        
        $body = $font
        $body += "Dear " + $AdminName + ","+ $newline + $newline
        $body += "Attached are report(s) for scheduled tasks service accounts on " + $domain + " servers." + $newline

        # Put a timestamp on the email
        $body += $newline + $newline + $newline + $newline
        $body += "<h5>Message generated on: " + $today + ".</h5>"
        $body += "</font>"

        # Invokes the Send-MailMessage function to send notification email
        Send-MailMessage -smtpServer $smtp -from $from -to $to -subject $subject -BodyAsHtml $body -Attachments $attachment
     }

#Check if removal of files is desired
IF ($delete -eq "True")
  {
    #Removes created file
    foreach ($i in $a)
      {
        Remove-Item "$temppath$i.csv"
      }
  }

###PROGRAM END###

Windows Service Account Report (PowerShell v2)

Well, it has been busy.  After my two weeks away we let a group of users go, which has led to a whole new task of automating onboarding and offboarding users.  I am still tweaking those scripts and will post them once I feel they are solid.  But here is a script that I pieced together in response to a post on SpiceWorks (see post here).

The request was to pull service accounts from servers to see which account is running what.

Some quick searching had this resolved; then I wanted to export to file and offer the option of emailing the reports.
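Before the full script, the core query can be tried on its own. A minimal sketch, assuming WMI access to the target machine ("Server01" is a placeholder name):

```powershell
# List the account each service runs under on a single server
# "Server01" is a placeholder; substitute a real server name
Get-WmiObject Win32_Service -ComputerName "Server01" |
    Select-Object Name, StartName, StartMode |
    Sort-Object StartName |
    Format-Table -AutoSize
```

If this returns data, the full script below should work against every name in your server list.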

First step is setting up the source file.
On my servers I have the following structure: %root%\_scripts\source files

  • Create a text file in your source location; ensure it is a .txt file
  • One server name per line
Copy the code below into a .ps1 file, or download it from my Google Drive here.
Update the fields highlighted in blue to meet your requirements.

Hope this is helpful.
(If you find it helpful please head over to Spiceworks and Spice up the code to help other IT members find it: http://community.spiceworks.com/scripts/show/2212-service-account-report)

# +-----------------------------------------------------------------------------------
# | File : Service Account Report.ps1                                          
# | Version : 1.01                                          
# | Purpose : Pulls Service Accounts from list of servers
# |           Saves to individual CSV files
# |           Can Email reports
# |           Can remove reports before script exits
# +-----------------------------------------------------------------------------------
# | Maintenance History                                            
# | -------------------                                            
# | Name            Date        Version  C/R  Description        
# | ----------------------------------------------------------------------------------
# | Chris Lee     2013-10-08     1.01         Initial script build
# +-----------------------------------------------------------------------------------


###SETUP START###
#-------DO NOT MODIFY-------#
#Add Exchange 2007 cmdlets (if not added)
if(!(Get-PSSnapin | Where-Object {$_.name -eq "Microsoft.Exchange.Management.PowerShell.Admin"})) {Add-PSSnapin Microsoft.Exchange.Management.PowerShell.Admin}
#Add Quest cmdlets (if not added)
if(!(Get-PSSnapin | Where-Object {$_.name -eq "Quest.Activeroles.ADManagement"})) {Add-PSSnapin Quest.Activeroles.ADManagement}
#Defines Time/Date Stamp used
$CreateStamp = Get-Date -UFormat %d_%m_%Y
###SETUP END###

###USER VARIABLES START###
#-------MODIFY AS NEEDED-------#
#Define path to server list to be used
    $path = "[PATHTOYOURSERVERLISTTEXTDOCUMENT]"
#Define path to temp/report folder include trailing \
    $temppath = "[PATHTOYOURTEMPFOLDER]"
#Email reports to admin "True" | "False"
    $email = "True"
#Delete files after script runs "True" | "False"
    $delete = "True"
#Enter System Admin
    $AdminName="[YOURADMINNAME]"
#Enter Admin Email Address
    $to="[YOURADMIN]@[YOURDOMAIN].com"
# SMTP Server to be used
    $smtp = "[YOURSMTP]"
# "From" address of the email
    $from = "ServerReports@[YOURDOMAIN].com"
#Enter Path to reports
    $file="C:\_temp\"
# Define font and font size
# ` is the escape character in PowerShell
    $font = "<font size=`"3`" face=`"Calibri`">"
###USER VARIABLES END###

###PROGRAM VARIABLES START###
#-------DO NOT MODIFY-------#
# Get today's day, date and time
$today = (Get-date)
# Newline character
$newline = "<br>"
#Pull Domain information for email
$Domain = ([adsi]'').distinguishedname -replace "DC=","" -replace ",","."
#Enter Subject line required for ticketing system ($Domain must be defined first)
$subject = "Service Accounts Report for " + $Domain + " servers."
#Section break
$secbreak = "`r`n---------------------------------------------------------------------------------------------------------------------------`r`n"
###PROGRAM VARIABLES END###

###PROGRAM START###
#Loads Server list into variable
$a = Get-Content $path
#Declare string for report structure
$attachment = @()
#Generates CSV file with service account information for each server
foreach ($i in $a)
  {
    Get-WmiObject win32_service -computer $i | Select-Object name, startname, startmode | Export-Csv "$temppath$i.csv" -NoTypeInformation
    $attachment += "$temppath$i.csv"
  }

#Check if email of files is desired
IF ($email -eq "True")
    {
        # Message body is in HTML font          
        $body = $font
        $body += "Dear " + $AdminName + ","+ $newline + $newline
        $body += "Attached are report(s) for service accounts on " + $domain + " servers." + $newline

        # Put a timestamp on the email
        $body += $newline + $newline + $newline + $newline
        $body += "<h5>Message generated on: " + $today + ".</h5>"
        $body += "</font>"

        # Invokes the Send-MailMessage function to send notification email
        Send-MailMessage -smtpServer $smtp -from $from -to $to -subject $subject -BodyAsHtml $body -Attachments $attachment
     }

#Check if removal of files is desired
IF ($delete -eq "True")
  {
    #Removes created file
    foreach ($i in $a) 
      {
        Remove-Item "$temppath$i.csv"
      }
  }

###PROGRAM END###

Monday, September 9, 2013

Server 2008: Rename Your Active Directory Domain

Server 2008: Rename Your Active Directory NetBios Name

Recently I began setting up my own production and test networks on my personal VM host (VMware ESXi 5.1).  I had purchased a domain from GoDaddy and planned on using "internal.domain.com".

Shortly after setting up the first domain controller, I realized I needed to change my NetBIOS name to allow login as domain\user instead of internal\user.

Some quick Google searching provided the following resource:
http://www.trainsignal.com/blog/rename-active-directory-domain

Step 1: Access Domain Controller

Step 2: Open a command prompt

  • Start > Run > cmd
Step 3: Generate Domainlist.xml
  • From command prompt enter "rendom /list"
    • 2008R2 directory: C:\users\[logged in user]
Step 4: View list.xml file
  • From command prompt enter "type domainlist.xml"
Step 5: Edit Domainlist to correct NetBios name
  • Browse to file location
    • 2008R2 directory: C:\users\[logged in user]
  • Right-click and select edit (use favorite text editor if prompted)
  • Locate <!--ForestRoot -->
  • Change NetBiosName value from current (internal) to desired (domain)
    • Note: I only wanted to change the login from internal\user to domain\user. If you wish to change the domain completely, update all instances of the old value to the new desired value.
Step 6: Verify desired changes
  • From command prompt enter "rendom /showforest"
  • Verify results
    • For domain login ensure FlatName is what you want
    • For entire rename ensure correct values for all fields
Step 7: Upload changes

  • From command prompt enter "rendom /upload"
Step 8: Prepare domain controller for update
  • From command prompt enter "rendom /prepare"
    • Best practice is to ensure all domain controllers have the firewall off for the remainder of the operation
Step 9: Execute domain update
  • From command prompt enter "rendom /execute"
    • Verify no errors in the results; if there are, resolve them before continuing (the issue I ran into was firewall related)
    • Note: domain controllers may begin restarting.
      • (Al-Dabbas stated his did; in my experience they did not)
At this point I was able to log off and back on as domain\user with no issues.  As I had just created the domain, I did not need to continue past Step 6 of the TrainSignal blog; their remaining steps walk you through updating any previously created GPOs.
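For reference, the whole rendom sequence from the steps above condenses to a handful of commands, run from an elevated prompt on the domain controller (edit Domainlist.xml between /list and /showforest):

```powershell
rendom /list          # generates Domainlist.xml in the current directory
# ...edit the NetBiosName value in Domainlist.xml before continuing...
rendom /showforest    # verify the pending changes
rendom /upload        # upload the rename instructions to the forest
rendom /prepare       # confirm all domain controllers are ready
rendom /execute       # apply the change (DCs may restart)
```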


Monday, September 2, 2013

SharePoint 2010: Hide Recently Modified

For our internal SharePoint, each of our departments has a page.  Since we are using the Team Site template the pages are wikis, which display a Recently Modified menu.  As this does not look professional, I had to find a way to hide it on several pages (10-12 pages).

I did some searching and found you could do this two ways.

The first way is per page and requires a Content Editor web part; the following site provided the directions below: http://blog.drisgill.com/2010/09/sp2010-branding-tip-12-hiding-quick.html
This process works well for a few sites (1-3).

  1. Navigate to desired site
  2. Click Page > Site Actions > Edit Page
  3. Click Insert > Web Part
  4. Under Media and Content > Content Editor > Add
  5. Select the new webpart
  6. From ribbon click Format Text > HTML > Edit HTML Source
  7. Enter the following and Click Ok:
                  <style type="text/css">          
                         body #s4-leftpanel { display: none; }            
                        .s4-ca { margin-left: 0px; }          
                  </style>
  8. Hide the Content Editor web part
    • Webpart Tools Options > Web Part Properties
    • Expand Appearance
    • Modify Chrome Type to None
    • Click OK
    • Save and Close page
The second way is to modify the site collection's master page.  The following site provided directions:
  1. Navigate to Site with pages
  2. Click Page > Site Actions > Edit in SharePoint Designer
  3. Select Master Pages from Site Objects
  4. Select your master page to edit
    • Default is v4.master
  5. Click Edit file
    • Check out if prompted
  6. In the code window, scroll up to the <head></head> section
  7. Enter following code somewhere between the <head></head> tags
    • <style type="text/css">.s4-recentchanges{display:none;}</style>
  8. Save the file
  9. Navigate back to Master pages
  10. Right click on page just edited and select Check In
    • Select Publish a major version
    • If prompted for content approval click yes
      • Browser will open
        • Select file Pending Approval
        • From drop down click Approve/Reject
        • Select approved and add comments if desired, click OK
          • Until approved, changes will not take effect
Refresh the page and verify changes applied.

Monday, August 26, 2013

Spiceworks: Generate Ticket for upgrade notice

This is a quick write-up of my experience implementing How-To: How to automatically create a ticket on Spiceworks upgrade by Vasily Ignatov.

Objective: Generate a ticket in Spiceworks when an upgrade is released.  This provides an upgrade history and ensures upgrades are completed in a timely manner.

  1. Navigate to the following link and save the script:
  2. Create a storage directory 
    • Create a directory for downloaded Spiceworks installers and define the path_to_store_exe variable in the script. Grant write access to this folder for the account that will run the script.
    • We used \\server\shared\IT\Software\Spiceworks\Installers
  3. Fill in other variables in the script
    • All settings to fill in are bounded by the lines "=== Define your parameters ==="
      • $smtp_server = "smtp.domain.com"
      • $smtp_server_port = "25"
      • $SSL_is_used = $false
      • $ticket_creator_email = "admin@domain.com"
      • $ticket_creator_email_pass = ""
      • $helpdesk_email = "helpdesk@domain.com"
      • $helpdesk_url = "http://helpdesk.domain.com:port"
      • $path_to_store_exe = "D:\SW-Install\"
      • $LANG = "EN"
  4. Schedule a task to run the script every day at 9 AM or whenever you want.
    • Start > Run > Taskschd.msc
    • Task Scheduler Library > Action > Create Task
    • General
      • Name: Spiceworks Ticket Upgrade
      • Description: Run PS script to check for Spiceworks updates.  Generate ticket as needed.
      • Security Options: Run whether user is logged in or not
    • Triggers
      • New > On a schedule
      • Settings 
        • Define when you want to run
      • Advanced Settings
        • Ensure Enabled is checked
        • I like to set tasks to stop if they run more than an hour
    • Actions
      • New > Start a program
      • Settings:
        • Program/script: powershell
        • Add Arguments: -file [path to script; enclose in "" if there are spaces in the path]
    • Conditions
      • Adjust as needed
    • Settings
      • Adjust as needed
    • History
      • Log to verify running
    • Upon saving you will be prompted for the user password
That is it.  Now installers will be downloaded into the folder you defined, named according to version.  The folder will also contain the log file (check_sw_ver.log) and the md5sums.
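The scheduled task above can also be created from the command line. A sketch using schtasks.exe, where the script path and account name are placeholders:

```powershell
# Creates the daily 9 AM task; path and account are placeholders
schtasks /create /tn "Spiceworks Ticket Upgrade" `
    /tr "powershell -file C:\Scripts\check_sw_ver.ps1" `
    /sc daily /st 09:00 `
    /ru DOMAIN\svc_account /rp *
```

The /rp * switch makes schtasks prompt for the account password, matching the prompt you get when saving in the GUI.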

Download script, readme and task scheduler template at:
https://docs.google.com/file/d/0B1fwreWrAZiobmlXT1Y1ZGpCSEE/edit?usp=sharing


Friday, August 23, 2013

SharePoint 2010: Create new Site Collection

Quick run through on how to create a Site Collection on SharePoint 2010 for future reference.

For more details check out:
  1. Access Central Administration site
    • Typically server name on port 9999 but will vary with installation.  On server shortcut located at: Start > All Programs > Microsoft SharePoint 2010 Products > SharePoint 2010 Central Administration
  2. Application Management > Create site collections
  3. Complete following:
    1. Web Application
      1. Select Web Application from drop down
      2. Ex: Test
    2. Title and Description
      1. Title: What you want displayed above Ribbon
      2. Description: Describe the site for future reference 
        • **DO NOT LEAVE BLANK** future admins will HATE you
    3. Web Site Address
      1. Select if Root (/) or Sites (/sites/) address
    4. Template Selection
      1. Select desired Template: 
    5. Primary Site Collection Administrator
      1. Enter User Name:
        • I define Farm Admin as primary
    6. Secondary Site Collection Administrator
      1. Enter User Name
        • Define my privileged account as secondary
        • Can add more SC admins later if needed
    7. Quota Template
      1. Define quota if required
    8. Click OK; Wait for processing
  4. Completed screen provides hyperlink to new site collection 
Check out the How-To on Spiceworks: 

Wednesday, August 21, 2013

SharePoint 2010: Creating New Web Application

A little write-up on the process of creating web applications, for future reference in later posts and to avoid duplicating data over and over.

I will not be explaining all the options; for those details check out:




  1. Access Central Administration site
    • Typically server name on port 9999 but will vary with installation.  On server shortcut located at: Start > All Programs > Microsoft SharePoint 2010 Products > SharePoint 2010 Central Administration
  2. Application Management > Manage web applications
  3. Click New
  4. Complete following:
    1. Authentication
      • Classic Mode Authentication
    2. IIS Web Site
      • Create a new IIS web site: SharePoint - Test
        • Scheme I use is SharePoint - [SiteSubject]
          • SharePoint - HR / SharePoint - My Sites / etc
        • Port: Typically 80 unless known issues
        • Host Header: what you want the URL to be
          • http://[hostheader]:port
        • Path: Leave as default
    3. Security Configuration
      • Leave all as default unless you know you are using Kerberos/SSL or want to allow Anonymous (Public Sites)
    4. Public URL
      • URL: Leave default unless want different from Host Header
      • Zone: Can't change
    5. Application Pool
      • Create new: I just double check to make sure scheme is being applied: SharePoint - [hostheader][port]
      • Security Account: Leave as default (typically the Farm Account)
    6. Database Name and Authentication
      • Database server: [Enter Database Server DNS Name]
      • Database Name: Modify to following scheme
        • WSS_Content_[IIS Web site Name]
          • Ex: WSS_Content_Test (makes it easier to find in the SQL database)
      • Database Authentication: Leave default unless your network requires it
    7. Failover Database Server: Enter Server name if you have mirrored SQL servers
    8. Search Server: Select desired Server if available
    9. Service Application Connections
      • Leave as default unless new web application does not require Service Application
    10. Customer Experience Improvement Program: No
  5. Click OK; wait for processing to complete
  6. The new web application should appear in the list

Check out How-To on SpiceWorks:

Monday, August 19, 2013

SharePoint 2010: My Site Root Deletion/Restoration

So I was playing around in our My Sites and discovered a My Site for SP_Farm.  I went in and deleted it as a typical site collection.  BIG MISTAKE.  Apparently when browsing other My Sites you are viewing a layered root site (/).  By deleting this I broke My Sites for everyone.  To resolve it I had to recreate the site collection (following the post at Jerry Orman's blog: SharePoint My Site link stops redirecting users to their Personal Site).


  1. Browse Central Administration > Application Management 
    1. Click Create Site Collection 
    2. Select the My Site Web Application 
    3. Set the Title to My Site
    4. For URL, select the "/" option for the root 
    5. Select the My Site Host site template on the Enterprise tab 
    6. Set the SharePoint System account as the owner. 
    7. Click OK
That fixed it (whew, I thought I would have to recreate all the My Sites from last week's backup).

If you want to remove those service account My Sites there are two ways (GUI / PowerShell):
  1. GUI (Central Administration)
    1.  Start > All Programs > Microsoft SharePoint 2010 Products > SharePoint 2010 Central Administration
    2. Application Management
    3. Delete Site Collection
    4. Select Site from Drop Down
    5. Delete
  2. PS (PowerShell)
    1. Start > All Programs > Microsoft SharePoint 2010 Products > SharePoint 2010 Management Shell
    2. STSAdm.exe command-line (stsadm -o deletesite -url http://mysite/personal/bobsmith)
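The stsadm command above still works, but SharePoint 2010 also ships a native cmdlet for this. A sketch from the Management Shell (the URL is a placeholder):

```powershell
# Deletes the service-account My Site; URL is a placeholder
Remove-SPSite -Identity "http://mysite/personal/bobsmith" -Confirm:$false
```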

Windchill 10.0 Debug Logging

We have been having some issues with our Windchill/PTC Creo system and have had to reach out to their tech support.  Through this issue I have learned how to enable debug logging and which log files they typically need to resolve cases.  Knowing this, you can speed up resolution by including debug logs with the initial case-opening message instead of waiting for the tech to request them.


  1. Access server running Windchill
  2. Start > All Programs > Windchill_10.0 > Windchill Shell
  3. Enter: xconfmanager -s wt.inf.team.verbose=true -t codebase/wt.properties -p
    • Set to false to disable
Now reproduce whatever caused your error and you will have a nice debug log to send.

The log location may vary. Ours are located in [Drive]:\PTC\Windchill_10.0\Windchill\logs


Friday, August 16, 2013

NeverFail: Disaster Recovery shouldn’t run on late nights and coffee alone (Webinar)

Summary of a webinar presented by Neverfail; speaker: Josh Mazgelis

This webinar covered the following three topics plus a preview of a new Neverfail product:

  1. Dependency Mapping
  2. DR planning
  3. Flavors of protection
  4. Neverfail product

  1. Mapping is important
    1. Business view
      1. Managers only see the Business needs (Services)
      2. There is little to no connection between business needs and infrastructure support
      3. They don't understand that infrastructure support is like insurance; they only see money vanishing
    2. How IT sees it
      1. Inventory of servers, hosts, storage
      2. Don't understand the business needs, only the need to keep everything running
      3. May/May not understand what services are important/critical
      4. Very difficult to know if Backup Continuity / Disaster Recovery (BC/DR) is sufficient
      5. Difficult to justify money for better BC/DR
    3. Results
      1. IT staff spends more time determining how things work
        1. late nights figuring out how things work
      2. IT staff spends more time reacting/repairing when services break
      3. Business wants to know why it took so long to restore/recover
      4. Business wants to know why data is missing from restore
  2. DR Planning
    1. Planning originates from Business needs
      1. Business owners need to identify key services
      2. Business owners need to define target SLA
        1. Not IT
    2. How to do this 
      1. Start with what services are important/critical
      2. Identify components that keep these service running
      3. Identify the dependencies that support components/services
    3. Build DR to support Business services
      1. Easier to justify spending for application dependencies
      2. Reduce spending on extraneous infrastructure
        • By determining what levels of SLA exist (typically 2-3 for a company), you can break services down and adjust resources to provide the needed protection
  3. Understanding challenges
    1. Off-target Recovery Plans
      1. With virtualization became easy to spread basic protection across Virtual Infrastructure
        1. May not meet Recovery Plan Objective (RPO) / Recovery Time Objective (RTO)
        2. May overprotect smaller systems, thus wasting resources
        3. Hard to balance funds to protection
      2. Spot solutions
        1. Don't provide complete protection
          • May protect an Oracle database extremely well but not the SharePoint application that allows retrieval and input
            • Database availability means nothing if no application to use it
    2. Keeping up with changes
      1. Virtualization accelerated the pace of change
        1. New servers come online within minutes to hours versus days to weeks
        2. Old servers become abandoned and forgotten more easily since they are not physical; without monitoring the virtual environment it is easy to keep feeding "zombie" systems vital resources
        3. VM's easily move between hosts and protection schemes
          • One host may be connected to a SAN where the other is not
      2. Rogue IT ushers in undocumented changes
        1. Business units create "Ghost IT" infrastructure
          • Mac mini with server applications
        2. Cloud services compliment or replace internal resources
          1. Dropbox
      3. There's just a whole lot going on
        1. It is difficult enough keeping up with projects and issue without worrying about BC/DR updates
    3. Knowing where you stand
      1. DR plan last thing considered
        1. Doing more with less truly means there is more that doesn't get done
        2. BC/DR plans are rarely updated as changes are made
      2. Even if you have a plan hard to know status
        1. BC/DR consultants could be used but are not always thorough enough to catch everything
        2. Testing plans (not when there is an emergency)
          • Monthly/Quarterly/Semi-Annual/something
    4. Recovery Tool Taste Testing
      1. Basic backup & recovery
      2. Replication of VM images/stores
      3. Traditional server cluster
      4. Replication with stand-by & failover
    5. Every blend has its own characteristics
      1. Different RTO/RPO
      2. Protection from different kinds of failures
      3. Widely varying cost-to-protection ratio
    6. Delicate Balance
      1. SLA 
        1. Business wants to meet certain SLAs
        2. Increasing # of threats to business continuity
        3. Everyone's stuff is important; hard to prioritize
      2. Budgetary constraints
        1. Face it: protection costs money, time, and resources
        2. Hard to justify expense
        3. Not everything is going to get unlimited protection
    7. Building a better Coffee Maker
      1. Understand business needs
        1. Reference actual business needs and requirements
        2. Estimate application cost of downtime
          • Easier to justify funding when you know the cost
        3. Fully map out service dependencies
          • Ensure a small server/service running on another server is not missed, as that breaks the entire service
      2. Apply protection appropriately
        1. Good, Fast, Cheap (Pick any two)
          • Never going to find a perfect solution
      3. Monitor results
        1. Regular testing
        2. Develop automation
Neverfail IT Continuity Architect
  1. Dashboard
    1. Progress and Summary reports
      1. Automatically inventories and analyzes IT infrastructure
      2. Summarizes availability and likelihood to meet SLAs
    2. Multiple Heatmap views of Inventory
      1. color coded by analysis state, protection, or tier ranking
      2. Bigger boxes indicate more dependent entities
    3. Create and define Biz services (SLA)
      1. combine dependent entities
    4. Dependency Graphs
      1. view inbound and outbound dependencies for any entity
  2. Learn More
    1. IT Continuity Architect tech preview
    2. IT Continuity Architect introduction video
    3. IT Continuity Architect - Discovery and Dependencies

ShoreTel Agent Login/Logoff via phones

We have a small number of users that are part of some groups within our ShoreTel System (Sales/Customer Service).

**Note Following only works if you have ShoreTel Workgroup Agent/Supervisor Access Licenses: else ShoreTel Administrator must manually make changes**

Recently there have been more questions on logging in and out as an agent during scheduled times. Below is a quick write up covering major models.

IP 212k:

  • Press Options > [Voice Mail Password] > #
  • Select "Agent State" via Custom Button
  • Select State [Logged In / Logged Out / Wrap-up] via Custom Button
  • Press Menu button to exit
IP 230/230g/265/560/560g/565g/:

  • Press Options > [Voice Mail Password] > #
  • Select "Agent State" via Up/Down Arrow control or press 5
  • Select State via Up/Down Arrow control
    • Logged In [1]
    • Logged Out [2]
    • Wrap-Up [3]
  • Press Done Softkey

Feature not available on following phones:
IP 110, IP 115, IP 210, IP 655 , IP 8000

For users on non-supported phones Agent status will need to be updated by ShoreTel Administrator.

Wednesday, August 14, 2013

Truth In IT: BackUp Seminar Notes to come

Morning everyone,

I have a lot going on, but I just want to let you all know I attended a great (free) seminar put on by Truth in IT yesterday.  The guest speaker was Curtis Preston (Mr. Backup).  I learned a lot and will have several posts on more in-depth topics; keep an eye out over the next week for a nice overview of the event.

If possible I highly recommend attending any Truth in IT events that come to your area. I have attended three in the past year and each one was very informative.

SharePoint 2010: Aggregating Announcements

As part of our SharePoint setup we decided we wanted the ability to aggregate / bubble up / consolidate announcements from multiple sites to our main page.  The following is how we accomplished this task while providing a simple way for users to control what is displayed.

To start, we added custom site columns to the top-level site that control what is displayed on the main page, when, and for how long.
  1. Launch browser to top level site (in our case http://intranet/)
  2. Site Actions > Site Settings
  3. Galleries > Site content types
  4. List Content Types > Announcement
  5. Columns > Add from new site column
    1. Name and Type
      1. Name: Display on Home Page (Y/N)
      2. Type: Yes/No
    2. Group
      1. Group: Custom Columns or create a new one
    3. Additional Column Settings
      1. Description: Option to have announcement display on Main Page
      2. Default Value: No
        • This requires users to select Yes for announcement to be displayed on Home Page otherwise it only displays on their site
    4. Update List and Site Content Types
      1. Update All content types inheriting: Yes
        • This is how we populate this to all child sites
    5. Click Okay
  6. Columns > Add from new site column
    1. Name and Type
      1. Name: Display Start Date
      2. Type: Date and Time
    2. Group
      1. Group: Custom Columns or create a new one
    3. Additional Column Settings
      • Description: Date to Start Display on Main Page
      • Required: No
      • Enforce Unique: No
      • Format: Date Only
      • Default value: (None)
    4. Update All content types inheriting: Yes
      • This is how we populate this to all child sites
    5. Click Okay
  7. Columns > Add from new site column
    1. Name and Type
      1. Name: Display End Date
      2. Type: Date and Time
    2. Group
      1. Group: Custom Columns or create a new one
    3. Additional Column Settings
      • Description: Date to End Display on Main Page
      • Required: No
      • Enforce Unique: No
      • Format: Date Only
      • Default value: (None)
    4. Update All content types inheriting: Yes
      • This is how we populate this to all child sites
    5. Click Okay
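The site columns above can also be added with the server object model instead of clicking through each one. A rough sketch from the SharePoint 2010 Management Shell, untested against your farm (the URL is a placeholder):

```powershell
# Adds the Yes/No column and attaches it to the Announcement content type
$web = Get-SPWeb "http://intranet/"
$web.Fields.Add("Display on Home Page (Y/N)",
    [Microsoft.SharePoint.SPFieldType]::Boolean, $false)
$field = $web.Fields["Display on Home Page (Y/N)"]
$ct = $web.ContentTypes["Announcement"]
$ct.FieldLinks.Add((New-Object Microsoft.SharePoint.SPFieldLink($field)))
$ct.Update($true)   # $true pushes the change to inheriting content types
$web.Dispose()
```

Repeat with SPFieldType DateTime for the two date columns.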
Now we will set up the home page to display items from other sites.  To accomplish this we will use the Content Query Web Part.
  1. Move back to top level site
  2. Sites Actions > Edit Page
  3. Click Add Web Part in the desired location (for ours we chose the left space)
  4. Categories > Content Rollup > Web Parts > Content Query > Add
  5. Click open the tool pane or Drop Down > Edit Web Part
  6. Expand Query and set the following:
    1. Source: Show items from all sites in this site collection
    2. List Type: Announcements
    3. Content Type: 
      1. Group: List Content Types
      2. Item: Announcements
    4. Audience Targeting: Apply as needed
    5. Click Apply
  7. Set filters (if desired)
    1. Query > Additional Filters:
      1. Display on Home Page (Y/N)
        • is equal to
        • [Yes]
      2. Display Start Date
        • is less than or equal to
        • [Today]
      3. Display End Date
        • is greater than or equal to
        • [Today]
    2. Click Apply
  8. Expand Appearance
    1. Set Title to desired name (we used Current Announcements)
    2. Click Apply > Click OK
That is it.  Now when users create an announcement they will have the option to flag it for homepage display.
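The three filters in step 7 combine with AND: an announcement appears on the home page only when the Yes/No flag is set and [Today] falls inside the start/end window. The same logic as a plain PowerShell sketch (the function and parameter names are my own, not part of SharePoint):

```powershell
# Sketch of the Content Query filter logic. An item displays only when the
# homepage flag is Yes and today falls between the start and end dates.
# Function and parameter names are illustrative only.
function Test-ShouldDisplay {
    param(
        [bool]$DisplayOnHomePage,
        [datetime]$DisplayStartDate,
        [datetime]$DisplayEndDate,
        [datetime]$Today = (Get-Date).Date
    )
    $DisplayOnHomePage -and
        ($DisplayStartDate -le $Today) -and
        ($DisplayEndDate   -ge $Today)
}

# An announcement flagged Yes and inside its window is displayed:
Test-ShouldDisplay -DisplayOnHomePage $true `
    -DisplayStartDate (Get-Date).Date.AddDays(-1) `
    -DisplayEndDate   (Get-Date).Date.AddDays(7)
```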
For a how-to write-up with steps and images, visit the post on SpiceWorks: http://community.spiceworks.com/how_to/show/46775-sharepoint-2010-aggregating-annoucements

SharePoint 2010: Install Updates

A quick how-to for updating SharePoint 2010.

**Note: it is highly recommended to test all updates in a test environment before deploying to your production network**

**Note: plan downtime for each update; the longer you wait between updates, the longer your downtime. SP2 took roughly 20-25 minutes, followed by a restart**

  1. Determine the current version and build with the following SharePoint 2010 Management Shell command:
    • (Get-SPFarm).BuildVersion
  2. Review which updates need to be applied via the following link:
  3. Download the update(s) for your version of SharePoint 2010
  4. Move file to location accessible from SharePoint Server
  5. Login to SharePoint Server to be updated
  6. Browse to location of downloaded file
  7. Double-Click to start installation
  8. Read and accept the MS Software License Terms > Continue
  9. Sit back and wait for installation
  10. Review output
    1. Some updates fail to start the ProfileSynchronizationServiceInstance, which will need to be restarted manually.
    2. Restart if needed
  11. Run the following SharePoint 2010 Management Shell command as the Farm Admin account:
    • Start > All Programs > Microsoft SharePoint 2010 Products > SharePoint 2010 Management Shell
      • Press Shift and Right Click > Select Run as different user
      • Enter Farm Admin credentials
    • psconfig -cmd upgrade -inplace b2b -wait
    • Wait for it to complete, then restart
    • Review output
      1. If the account is not correctly permissioned, the ProfileSynchronizationServiceInstance will fail to start and will need to be restarted manually.
  12. Verify update by repeating step 1.
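Step 12 can be verified programmatically by comparing the build number from step 1 with the build number after patching. Capturing `(Get-SPFarm).BuildVersion` has to happen on the farm, but the comparison itself is plain PowerShell; the build numbers below are illustrative placeholders, not exact SharePoint builds.

```powershell
# Sketch: confirm the update raised the farm build number.
# On the farm, capture these with (Get-SPFarm).BuildVersion before and
# after patching. The values below are hypothetical examples only.
$before = [version]'14.0.6029.1000'   # illustrative pre-patch build
$after  = [version]'14.0.7015.1000'   # illustrative post-patch build

if ($after -gt $before) {
    "Update applied: $before -> $after"
} else {
    "Build unchanged - the update may not have installed"
}
```

[version] casts give a proper numeric comparison, so a four-part build like 14.0.6029.1000 compares correctly rather than as a string.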
Steps to start the ProfileSynchronizationServiceInstance
  1. Access Central Administration
  2. System Settings > Services on Server
  3. User Profile Synchronization Service > Start
    1. Select User Profile Application: [User Profile Service]
    2. Enter password for Service Account: [We save them in KeePass]
    3. Click Okay
That is all there is.  The biggest thing is to remember to plan for downtime and alert your users to when the system will be down.