Adventure in SPWonderland

Take apart and put back together


The Darfur Wall

You might have heard of the Million Dollar Homepage, where Alex Tew sold pixels on a 1000x1000 grid on his website, making him over a million dollars. It's an idea that falls into both the 'why didn't I think of that' and 'there's one born every minute' categories. However, there is a site out there that loosely takes the idea and turns it towards a much better cause.

From the Darfur Wall website itself:

The numbers 1 to 400,000 cover the 40 panels of The Darfur Wall. Each number represents a person killed in Darfur. By donating $1 or more you can light a number, turning it from dark gray to brilliant white. As we light the wall, we acknowledge the importance of each life lost, we cast light upon a tragedy too many have ignored, and we overcome one barrier to peace.


It's sobering to think that each number represents a human life lost. Please promote this site where and when you can; it would be nice to see that wall brightly lit.

Strangely, although I came to the site via Digg, there is a SharePoint connection: the site has been set up by Jonah Burke, formerly of the MOSS BDC team, who I happened to see speak at TechEd in Boston.

And what number did I pick? In true geek fashion, 443 of course.


MOSS Personal Sites - Sizing up


As a follow-on from my previous post, we can do some neat calculations on the amount of data stored in MOSS personal sites using PowerShell.

Using the measure-object cmdlet against the StorageUsedMB property we can calculate the number of personal sites (plus one, as it includes the root site), the average size of each site, the smallest, the largest, and the total storage used by all sites.

Again, assuming http://sps:20488 is where your personal sites are hosted:

PoSH C:\demo> $output=stsadm -o enumsites -url "http://sps:20488"
PoSH C:\demo> $xml=[XML]$output
PoSH C:\demo> $xml.sites.site | measure-object storageusedmb -min -max -sum -average | format-table

Count Average Sum Maximum Minimum Property
----- ------- --- ------- ------- --------
5      0.62   3.1 0.7     0.4     StorageUs...
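
If you want to exclude the root site from those numbers, a where-object filter will do it. A quick sketch, assuming the root URL from above:

PoSH C:\demo> $xml.sites.site | where-object { $_.Url -ne "http://sps:20488" } | measure-object storageusedmb -min -max -sum -average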

To get a list of the sites in question with the largest at the top, use -descending on the sort cmdlet:

PoSH C:\demo> $xml.sites.site | sort storageusedmb -descending | select url, owner, storageusedMB | format-table

Url                         Owner                    StorageUsedMB
---                         -----                    -------------
http://sps:20488/person... CONTOSO\mike               0.7
http://sps:20488/person... CONTOSO\jeff               0.7
http://sps:20488/person... CONTOSO\administrator      0.7
http://sps:20488           CONTOSO\administrator      0.6
http://sps:20488/person... CONTOSO\brianb             0.4

And if you just want the top two offenders, use the -first option on the select-object cmdlet:

PoSH C:\demo> $xml.sites.site | sort storageusedmb -descending | select url, owner, storageusedMB -first 2 | format-table

Url                         Owner                    StorageUsedMB
---                         -----                    -------------
http://sps:20488/person... CONTOSO\mike               0.7
http://sps:20488/person... CONTOSO\jeff               0.7


Obviously in real life those figures would be a lot larger!

MOSS Personal Sites – Still Top Level

A point came up at the UK SharePoint User Group Christmas drink about personal sites and whether they are still top-level sites, i.e. they exist as records in the Sites table and are the top-level container for all their subwebs. I hadn't checked this in v3, so a quick check with PowerShell and stsadm confirms that this is still the case, just as in v2.

Assuming http://sps:20488 is where your personal sites are hosted, calling stsadm -o enumsites and processing the returned XML in PowerShell gives us:

PoSH C:\demo> $output=stsadm -o enumsites -url "http://sps:20488"
PoSH C:\demo> $xml=[XML]$output
PoSH C:\demo> $xml.sites.site

Url : http://sps:20488
Owner : CONTOSO\administrator
ContentDatabase : WSS_Content_11fba01f-f0c4-4a05-a3c3-868499fd31ce
StorageUsedMB : 0.6
StorageWarningMB : 0
StorageMaxMB : 0

Url : http://sps:20488/personal/administrator
Owner : CONTOSO\administrator
ContentDatabase : WSS_Content_11fba01f-f0c4-4a05-a3c3-868499fd31ce
StorageUsedMB : 0.7
StorageWarningMB : 80
StorageMaxMB : 100
Etc…

Tidying up the output a little:

PoSH C:\demo> $xml.sites.site | select url, owner

Url                                        Owner
---                                        -----
http://sps:20488                           CONTOSO\administrator
http://sps:20488/personal/administrator    CONTOSO\administrator
http://sps:20488/personal/brianb           CONTOSO\brianb
http://sps:20488/personal/jeff             CONTOSO\jeff
http://sps:20488/personal/mike             CONTOSO\mike
….
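
You can double-check from the object model side too: each personal site should be the root web of its own site collection. A quick sketch, using one of the URLs above:

PoSH C:\demo> $mysite = new-object Microsoft.SharePoint.SPSite("http://sps:20488/personal/mike")
PoSH C:\demo> $mysite.RootWeb.Url
http://sps:20488/personal/mike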


SharePoint/PowerShell 6: Approving all Publishing Pages

Now that I have added some content to our portal, I want to approve and publish all pages in each publishing web.

To do this I've created a function that takes a MOSS PublishingPage object and a comment that will be added when we check in, approve and publish.

 

# Function:    Approve-PublishingPage
# Description: Approve a single page in a Publishing Web
# Parameters:  publishingPage  PublishingPage object
#              comment         Comment to accompany the check-in/approve/publish
#
function Approve-PublishingPage ([Microsoft.SharePoint.Publishing.PublishingPage]$publishingPage, [string]$comment)
{
    " Publishing Page: " + $publishingPage.Name

    $listitemfile = $publishingPage.ListItem.File

    # Check the item in if it is checked out
    if ($listitemfile.Level -eq [Microsoft.SharePoint.SPFileLevel]::Checkout)
    {
        " Checking in page"
        $listitemfile.CheckIn($comment, [Microsoft.SharePoint.SPCheckinType]::MajorCheckIn)
    }

    # If moderation is being used then handle the approval and publishing
    if ($publishingPage.ListItem.ParentList.EnableModeration)
    {
        $modInformation = $publishingPage.ListItem.ModerationInformation

        " Moderation Enabled"

        # Check for pending approval
        if ($modInformation.Status -eq [Microsoft.SharePoint.SPModerationStatusType]::Pending)
        {
            " Approving"
            $listitemfile.Approve($comment)
        }

        # Publish if the item is still a draft
        if ($modInformation.Status -eq [Microsoft.SharePoint.SPModerationStatusType]::Draft)
        {
            " Publishing"
            $listitemfile.Publish($comment)
        }
    }
}

This function will be called from Approve-AllPagesInSPWeb:

# Function:    Approve-AllPagesInSPWeb
# Description: Loop through all the pages in a Publishing Web and check in and approve them
# Parameters:  web      SPWeb object
#              comment  Comment to accompany the check-in/approve/publish
#
function Approve-AllPagesInSPWeb([Microsoft.SharePoint.SPWeb]$web, [string]$comment)
{
    # Check this is a publishing web
    if ([Microsoft.SharePoint.Publishing.PublishingWeb]::IsPublishingWeb($web) -eq $true)
    {
        $pubweb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)

        "Checking $($pubweb.Url)"

        $pubcollection = $pubweb.GetPublishingPages()

        for ($i = 0; $i -lt $pubcollection.Count; $i++)
        {
            Approve-PublishingPage $pubcollection[$i] $comment
        }
    }
}

Now, I'd like to use a foreach loop around the GetPublishingPages collection, but that's not possible due to the lack of generics support in PowerShell at the moment, so an index loop does the job.

To approve all pages in all webs, we can pipe the SPWebs to a foreach loop and pass each SPWeb object to Approve-AllPagesInSPWeb:

$site = spsite "http://yourmossserver"
$site.allwebs | foreach-object {Approve-AllPagesInSPweb $_ "System Approval"}

Added as a function, approveall, it prints its progress as it works through each web.
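
A minimal sketch of what that wrapper might look like (the function body is reconstructed from the pipeline above; the server URL is a placeholder):

# approveall (sketch): approve and publish every page in every web of a site collection
function approveall([string]$SiteUrl, [string]$comment)
{
    $site = new-object Microsoft.SharePoint.SPSite($SiteUrl)
    $site.AllWebs | foreach-object { Approve-AllPagesInSPWeb $_ $comment }
}

approveall "http://yourmossserver" "System Approval"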


In the next step I'll add a Contact WebPart to every page.


Fitz is blogging but Maurice is leaving

Yes... Fitz is blogging again. At last, Mike Fitzmaurice, one of the primary links between the SharePoint team and the outside world, is blogging again.

No... Maurice Prather, one of the few hardcore SharePoint bloggers around, has posted that he is leaving the SharePoint team. That's a real loss. Compared to, say, the PowerShell team, the state of blogging from the SharePoint team has been really poor over the last year, but as blogs come second to shipping a product out the door, perhaps that's understandable. Hopefully things will improve once the RTM celebrations have finished.


List the fields in a SharePoint List

In a previous post I added content to the publishing pages in the legal and finance divisions, and it's useful to know what fields are in a list when working with the SharePoint API or, say, the Content Query Web Part. Yes, you can do this using the UI, but it's a pain.

First, get the SPWeb object for our site:

$site=spsite "http://sps:2828/divisions/finance"
$web=$site.OpenWeb()

Now show the lists on the site:

$web.lists | select title,contenttypes

Title                                   ContentTypes                          
-----                                   ------------                          
Documents                               {Document, Folder}                    
Images                                  {Document, Folder}                    
Master Page Gallery                     {Master Page, Folder}                 
Pages                                   {Page, Article Page, Welcome Page, F...
Workflow History                        {Workflow History}                    
Workflow Tasks                          {Task, Folder}


The Pages list holds the publishing pages, so let's list its fields:


$web.lists["Pages"].Fields | select title, internalname, typedisplayname | sort title

Title                      InternalName               TypeDisplayName         
-----                      ------------               ---------------         
Approval                   Approval                   Workflow Status         
Approval Status            _ModerationStatus          Moderation Status       
Approver Comments          _ModerationComments        Multiple lines of text  
Article Date               ArticleStartDate           Date and Time           
Byline                     ArticleByLine              Single line of text     
Check In Comment           _CheckinComment            Lookup                  
Checked Out To             CheckedOutTitle            Lookup                  
Checked Out To             CheckoutUser               Person or Group         
Checked Out To             LinkCheckedOutTitle        Computed                
Client Limit               Client_x0020_Limit         Number                  
Collect Feedback           CollectF                   Workflow Status         
Collect Signatures         CollectS                   Workflow Status         
Contact                    PublishingContact          Person or Group         
Contact E-Mail Address     PublishingContactEmail     Single line of text     
Contact Name               PublishingContactName      Single line of text     
Contact Picture            PublishingContactPicture   Hyperlink or Picture    
Content Type               ContentType                Choice                  
Content Type ID            ContentTypeId              Content Type Id

etc...

There are a lot of fields; to get the count, use:

$web.lists["Pages"].Fields | measure-object

That gives 92 fields for the Pages library.

To see the publishing pages themselves, use:

$web.lists["Pages"].Items | select name, file, level

Name                       File                                           Level
----                       ----                                           -----
default.aspx               Pages/default.aspx                             Draft
Client1.aspx               Pages/Client1.aspx                             Published


And finally, two functions for our toolbox to make calling these easier:

function get-SPListFields([string]$URL, [string]$ListName)
{
    # get-spweb returns an already-opened SPWeb
    $web = get-spweb $URL
    $web.Lists[$ListName].Fields
}

function get-SPListItems([string]$URL, [string]$ListName)
{
    $web = get-spweb $URL
    $web.Lists[$ListName].Items
}
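
For example, to repeat the field listing from above using the new helper:

get-SPListFields "http://sps:2828/divisions/finance" "Pages" | select title, internalname, typedisplayname | sort title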

These functions will come in useful later when we approve pages in a site with PowerShell.


What's in a Name

One of the things that the PowerShell team has always pointed out is the importance of naming conventions for functions.
Most of the built-in PowerShell cmdlets have a verb-noun format, e.g. get-process, get-service.
Keeping things consistent across the different types of providers helps with the learning process, as you can almost guess what a cmdlet will be called once you know the noun.

Up to now I thought this was just syntactic sugar, but this blog entry from the main PowerShell guy, Jeffrey Snover, points out an important benefit of naming your functions correctly.
When you type a command name and PowerShell fails to find a function with the exact name, it will automagically prepend get- and try to match on that instead.

Here's an example; take the simple get-SPSite function:

function get-SPSite([string]$url)
{
   new-object Microsoft.SharePoint.SPSite($url)
}

I've been using this as $a=get-spsite "http://server", but I could have used $a=spsite "http://server". It's a small thing, but if you're typing on the command line those extra four characters add up, plus it almost looks like C# code, without having to use a lengthy namespace prefix.

Likewise for get-SPWeb:

function get-SPWeb([string]$url)
{
   $site=new-object Microsoft.SharePoint.SPSite($url)

   $site.OpenWeb()
}
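
And the same shorthand applies here: $web=spweb "http://server" resolves to get-SPWeb.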


Windows PowerShell 1.0 Released!

Everyone and their dog will have this link, but I'll post it anyway.

Windows PowerShell 1.0 has been released. This page has a list of the download links for XP, Windows Server 2003 and Vista:

http://www.microsoft.com/windowsserver2003/technologies/management/powershell/download.mspx

You might have to be patient, as the links were a bit flaky when I tried them.

A point to note is that the Vista version is at RC2 status. Jeffrey Snover explains the Vista release delay here: http://blogs.msdn.com/powershell/archive/2006/11/15/windows-powershell-windows-vista.aspx. Checking the comments, it seems some people are getting hot under the collar about the delay, but as I see it the PowerShell team have done a great job, and by all accounts the only difference between RC2 and RTM is the installer. Personally, given my bad beta experiences of Vista, I'm happier installing the RC2 of PowerShell than the RTM of Vista.

Also worth a download is the Documentation Pack, as it contains a Getting Started Guide and a User Guide.


SharePoint/PowerShell 5: Importing the Areas' content

OK, now we have the main structure of our Portal with the new Areas we wanted, and we need to start adding some content.

As the Areas we have created are Publishing sites, we're going to use the new Publishing APIs in WSS to add content.

The first function to help us with this is called Add-Content. It takes a Site Collection URL, an Area URL, a content title, content text in HTML format and a check-in comment, and it adds the given text and title to the default content page in the Publishing site.
With a few changes you could modify it to add to a specific content page; see the sketch after the notes below.

# Function:    Add-Content
# Description: Add the given text and title to the default publishing page in the publishing web
# Parameters:  SiteCollectionURL  URL for the root of the Site Collection
#              Area               Relative URL to the site/subweb/area
#              Title              Title string for the page
#              Text               Content to publish
#              Comment            Check-in comment
function Add-Content($SiteCollectionURL, $Area, $Title, $Text, $Comment)
{
    $url = $SiteCollectionURL + "/" + $Area

    write-host "Adding content to " $url

    $site = new-object Microsoft.SharePoint.SPSite($url)
    $web = $site.OpenWeb()
    $pubweb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)

    # Check out the default page for editing
    $pp = $pubweb.DefaultPage
    $pp.CheckOut()

    # Set the properties
    $item = $pp.Item
    $item.set_Item("Title", $Title)
    $item.set_Item("PublishingPageContent", $Text)
    $item.Update()

    # Check in, approve and publish
    $pp.CheckIn($Comment)
    $pp.Approve($Comment)
    $pp.Publish($Comment)
}

Note the set_Item syntax; the SharePoint Item property seems to conflict with PowerShell's own Item handling, so calling the setter directly avoids the clash.
This code assumes moderation is turned on for the Publishing site.
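
As mentioned above, a few changes let the function target a specific page instead of the default one. A minimal sketch, assuming the page name you pass (e.g. "Client1.aspx") exists in the web's Pages library:

# Add-ContentToPage (sketch): like Add-Content, but targets a named page
function Add-ContentToPage($SiteCollectionURL, $Area, $PageName, $Title, $Text, $Comment)
{
    $site = new-object Microsoft.SharePoint.SPSite($SiteCollectionURL + "/" + $Area)
    $web = $site.OpenWeb()
    $pubweb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)

    # Index loop again, as foreach over the generic collection isn't possible yet
    $pages = $pubweb.GetPublishingPages()
    for ($i = 0; $i -lt $pages.Count; $i++)
    {
        if ($pages[$i].Name -eq $PageName)
        {
            # Work with the underlying SPFile, as in Add-Content
            $file = $pages[$i].ListItem.File
            $file.CheckOut()
            $item = $file.Item
            $item.set_Item("Title", $Title)
            $item.set_Item("PublishingPageContent", $Text)
            $item.Update()
            $file.CheckIn($Comment)
            $file.Approve($Comment)
            $file.Publish($Comment)
        }
    }
}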

So we have a handy function to add content, but again I want to source that content from somewhere external.
In this case it's an XML file called content.xml, the start of which looks like this:
<contents>
<content>
<area>Divisions/Finance</area>
<title>
Finance Division
</title>
<text>&lt;b&gt;Lorem ipsum dolor sit amet, consectetuer adipiscing elit.&lt;/b&gt;&lt;hr/&gt;&lt;br/&gt; Nullam hendrerit lacinia purus. Proin vulputate porta nisl. Aliquam commodo lobortis lacus. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Sed quis leo imperdiet nisl ultrices fringilla. Praesent enim est, commodo sed, pharetra mattis, condimentum ut, arcu. Morbi aliquet lacus vel elit. In quis nibh. Vestibulum tincidunt. Sed quis sem.
Maecenas lobortis convallis dolor. Nunc rutrum, nunc ac elementum tempor, velit est vehicula libero, eu fermentum lacus lacus ac diam. Curabitur risus quam, dignissim ut, mollis vel, nonummy at, metus. Mauris elit libero, interdum sit amet, sollicitudin nec, eleifend vel, magna. Aliquam at ipsum. Ut rutrum convallis turpis. Quisque urna quam, tincidunt id, pretium a, dignissim ac, neque. Donec ipsum. Sed ornare pretium diam. Phasellus massa. Morbi porttitor purus eget turpis. Sed vel lectus. Etiam egestas nibh vitae augue. Ut felis arcu, fermentum sed, venenatis eget, tincidunt et, lorem. Etiam commodo nisi ut nisl. Morbi augue enim, accumsan eu, dignissim a, scelerisque non, nunc.
Praesent mauris ante, pretium eget, volutpat quis, congue eget, odio.&lt;br/&gt; Praesent sed orci nec sapien molestie viverra. Aenean pulvinar dictum mauris. Quisque ligula est, vestibulum id, consequat vel, ullamcorper id, libero. In tristique. Duis turpis augue, egestas sed, semper eget, dignissim sed, nisl. Class aptent taciti sociosqu ad litora torquent per conubia nostra, per inceptos hymenaeos. Curabitur feugiat dolor cursus lacus. Curabitur dui augue, tempus id, sollicitudin vel, pharetra a, augue. Donec vestibulum dictum ligula. Ut velit dui, dignissim at, interdum quis, convallis a, arcu. Vivamus neque lorem, sollicitudin et, malesuada sed, rhoncus eu, pede. Aliquam hendrerit imperdiet lacus. Duis iaculis fringilla risus. Aliquam pretium ante quis arcu. Nam erat est, porttitor non, venenatis at, auctor sed, libero. Sed condimentum, ante vitae dignissim egestas, dolor nisl rutrum elit, nec ultrices lacus ipsum et pede. Duis dolor turpis, lacinia eu, facilisis ut, fermentum sit amet, nisl. Proin imperdiet mauris a lacus. Nullam suscipit imperdiet tellus.
</text>
........

It's a simple format: the XML nodes hold the relative URL of the destination area, the content title and the content itself as encoded HTML. I've encoded the HTML because support for CDATA sections is not as easy as it should be in PowerShell at the moment (the XML parser decodes the entities back to real HTML when the node is read).

Now, I was going to use the import-xml cmdlet that was available in Monad, but that seems to have disappeared in recent builds. So, to import this file, I'm going to use the get-content cmdlet, wrap it in a sub-expression (the $() syntax) and cast the result to XML:

$xml=[XML]$(get-content Content.xml)

The [XML] cast is a shorthand way to load the XML into an XmlDocument, so

$xml | get-member


shows the type name as XmlDocument. If you're used to using this class in C# or VB.Net then you have access to it as normal, but the cool thing about PowerShell is that the $xml variable now has extra properties that match the nodes in the XML file:

$xml.contents.content


will list the content nodes and

$xml.contents.content[0].area


will show Divisions/Finance.


The import-content function does all the hard work: it just pipes the list of content nodes into the add-content function, referencing the node names as properties.

# Function:    import-content
# Description: Use the XML import file to import some sample content into the Publishing Web
# Parameters:  SiteCollectionURL  URL for the root of the Site Collection
#              ImportFile         XML file containing the sample data
function import-content($SiteCollectionUrl, $ImportFile)
{
    $xml = [XML]$(get-content $ImportFile)

    # Loop through the XML items; Add-Content also needs a check-in comment, so pass a fixed one
    $xml.Contents.Content | foreach-object { Add-Content $SiteCollectionUrl $_.Area $_.Title $_.Text "Imported content" }
}
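
Calling it is then a one-liner (using the portal root from the earlier posts):

import-content "http://sps:2828" "Content.xml"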


So, with a few lines of code, this function loads the contents of the XML file and pipes the values of the nodes into our add-content function.

Let's have a look at what the Portal looks like so far.


All this and we have not had to use the MOSS UI at all!

Now all those imported content pages will need to be approved. Do it by hand, maybe? Nope, that's our next PowerShell-powered step.


SharePoint/PowerShell 4: Adding the Portal Areas

In this step I want to start filling out the Portal with some business areas from a list that has been defined for us.

We're going to use a CSV file in the same way as we did for the users. The start of the file looks like this:

AreaURL, AreaName, AreaTitle, AreaDescription, AreaTemplate
"Divisions","Divisions","Divisions Home Page", "This area includes links to content based on divisions in the company.", "BLANKINTERNET#2"
"Divisions/Sales","Sales","Sales Home Page", "This area includes information related to sales.", "BLANKINTERNET#2"
"Divisions/Support","Support","Support Home Page", "This area includes information related to support.", "BLANKINTERNET#2"
"Divisions/HumanResources","Human Resources","Human Resources Home Page", "This area includes information related to human resources.", "BLANKINTERNET#2"
"Divisions/Marketing","Marketing","Marketing Home Page", "This area includes information related to marketing.", "BLANKINTERNET#2"


We have an AreaURL column, which is the URL relative to the site collection root, and we also have the Area Name, Title and Description.
The AreaTemplate field in this example is BLANKINTERNET#2 (the format is WebTemplate#Configuration).
Now, I know this is the Publishing Site template under the Publishing tab, but we could just as easily create a team site, or subsites of blogs, wikis etc.
The easiest way to find out which template and configuration to use is to create a site of the type you want through the UI and use PowerShell to find out what template it uses.

Run these commands to get a list of the templates in use on the portal:

$sp=new-object microsoft.sharepoint.spsite("http://sps:2828")
$sp.allwebs | select serverrelativeurl, webtemplate, configuration


Also check out Dan Winter's list of the MOSS 2007 templates on his blog: http://blogs.msdn.com/dwinter/archive/2006/07/07/659613.aspx


Now to the PowerShell functions.
First, a simple wrapper to add a new site to the Site Collection: add-spweb.

# Function:    add-spweb
# Description: Create a new Web
# Parameters:  SiteCollectionURL  URL for the root of the Site Collection
#              WebUrl             Relative URL of the subsite
#              Title              Title string
#              Description        Description string
#              Template           Template to use
#
function add-spweb([string]$SiteCollectionUrl, [string]$WebUrl, [string]$Title, [string]$Description, [string]$Template)
{
    # Create our SPSite object
    $spsite = new-object Microsoft.SharePoint.SPSite $SiteCollectionUrl

    # Add a site; the new SPWeb will be returned from this call
    $spsite.AllWebs.Add($WebUrl, $Title, $Description, [int]1033, $Template, $false, $false)
}

And then a function that will import the CSV file, create an object for each Area line in the file, and pipe the list to add-spweb to create each new site.


# Function:    Import-Sites
# Description: Create a set of subwebs as listed in the import CSV file
# Parameters:  CSVFile            Location of the CSV file containing the list of webs
#              SiteCollectionURL  URL for the root of the Site Collection
#
function Import-Sites([string]$CSVFile, [string]$SiteCollectionURL)
{
    Import-Csv $CSVFile | foreach-object { add-spweb $SiteCollectionURL $_.AreaURL $_.AreaName $_.AreaDescription $_.AreaTemplate } | foreach-object { $_.Navigation.UseShared = $true; $_.Update() }
}

I'm also setting UseShared to true, which tells the subareas to use the main portal navigation elements.

Use it like this:

import-sites "ContosoAreas.CSV" http://sps:2828

The next step will be to import some content into our Publishing Areas...