SharePoint Users Missing from SharePoint Groups After Upgrade

The Problem

During our upgrade from SharePoint 2007 to SharePoint 2010, we ran into an issue in a single site collection: when certain users are added to a SharePoint group, they do not appear in the group's user list in the browser. If the same users are added outside of a SharePoint group (i.e. given a permission directly), they appear normally.

The users are definitely added: they can access the site, and I can see them listed in the permissions in SharePoint Designer. Within the browser, however, they are not listed. There are no errors, messages, or warnings at any point. This only happens in one site collection, even though many other site collections use the same User Profile Synchronization Service.

The only similar situation I could find through Google is "Cannot add user to SP group, SP 2010". We recently upgraded from 2007 to 2010, but unlike the aforementioned thread we did not use a third-party tool such as AvePoint; we used the database detach/attach upgrade option.

After some digging, we found that the affected users are not listed in the "User Information List" (sitecollectionURL/_catalogs/users/simple.aspx). However, it is possible to view an affected user's entry in this list if you manually change the ID URL parameter to their tp_id value from the UserInfo table in SQL (e.g. http://sitecollection/_layouts/userdisp.aspx?ID=123).

Using PowerShell, we found some commonalities in the users experiencing the problem (a rough inspection sketch follows this list):

  1. ForwardLinks and BackwardLinks contain empty arrays ({}) for working users. For affected users they contain nothing.
  2. FirstUniqueAncestorSecurableObject is "User Information List" for working users. For affected users it is empty.
  3. FirstUniqueAncestor is "User Information List" for working users. For affected users it is empty.
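
For anyone wanting to do a similar comparison, here's a minimal sketch of the kind of inspection we did. The URL and login name are placeholders, and depending on your environment some of these properties may surface on the user's entry in the User Information List rather than on the SPUser object itself:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

#Placeholder site collection URL and login name; substitute your own
$web  = Get-SPWeb "http://sitecollection"
$user = $web.SiteUsers["DOMAIN\someuser"]

#Dump every property of the SPUser and of its entry in the User Information List,
#then diff the output for a working account against an affected one
$user | Select-Object * | Format-List
$web.SiteUserInfoList.GetItemById($user.ID) | Select-Object * | Format-List

$web.Dispose()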

I’ve tried the following to resolve the issue:

  1. Full profile synchronization in Central Admin. (No effect.)
  2. stsadm -o sync -deleteolddatabases 0, followed by a full User Profile sync in Central Admin. (No effect.)
  3. Manually updating the missing properties in PowerShell to match a working account. (Didn't work; the list is read-only.)
  4. Deleting an affected user from the site collection and then re-adding their permissions (a rough sketch of this step follows the list). The issue persists.
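
For reference, the delete/re-add in step 4 was along these lines (a sketch only; the URL and account are placeholders, and group memberships and permissions still have to be re-granted afterwards):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

#Placeholder URL and login name; substitute your own
$url   = "http://sitecollection"
$login = "DOMAIN\affecteduser"

#Remove the user from the site collection entirely...
Remove-SPUser -Identity $login -Web $url -Confirm:$false

#...then add them back and re-grant their groups/permissions
New-SPUser -UserAlias $login -Web $url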

The Solution

  1. Perform a full content deployment to a new content database.
  2. Detach the old content database.
  3. Attach the new content database (a PowerShell sketch of the detach/attach steps follows this list).
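
Step 1 is configured through Central Administration (or the content deployment cmdlets). The detach/attach in steps 2 and 3 can be done with PowerShell roughly as follows; this is only a sketch, and the web application, database, and SQL Server names are placeholders:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

#Placeholder names; substitute your own web application, databases, and SQL server
$webApp = "http://sitecollection"

#Step 2: detach the old content database from the web application
Dismount-SPContentDatabase "WSS_Content_Old" -Confirm:$false

#Step 3: attach the new content database that received the content deployment
Mount-SPContentDatabase "WSS_Content_New" -DatabaseServer "SQLSERVER" -WebApplication $webApp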

Alternative

Stuart Pegg also offers an easier PowerShell-based solution on SharePoint StackExchange; however, we had already applied the fix above before Stuart's response.

SharePoint PowerShell Script to Monitor SharePoint Availability

A little while ago I read a post on the SharePoint subreddit about monitoring a SharePoint 2010 environment. The top comment was from a user mentioning that, since his fellow administrators hadn't set up System Center Operations Manager (SCOM), he'd created a bunch of PowerShell scripts to monitor various aspects of SharePoint.

Since we also don't use SCOM, I decided to write my own small script to monitor SharePoint availability at a basic level: it checks whether each site collection's home page can be downloaded via HTTP, and whether the page contains text from typical error screens ("Troubleshoot issues with Microsoft SharePoint Foundation", for example). If the HTTP download fails, or common error text is found, it shoots off an email to the admin(s). I've configured this script to run every 10 minutes on our farm; you may want to adjust that based on your SLAs, governance, and experience.

Now, keep in mind that this is no-frills, canary-in-the-coal-mine monitoring. It won't tell you what is wrong, but it will tell you that your clients are most probably having connection issues with SharePoint and that, ahem, you may want to look into that before the inevitable flood of "hey, is SP down?" tickets and emails comes your way.
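
I haven't detailed our scheduling setup here, but a plain Windows scheduled task does the job. As a rough example (assuming the script is saved to a hypothetical path like C:\Scripts\Check-SharePointStatus.ps1, and using a placeholder service account), a task that fires every 10 minutes can be created with schtasks:

#Example only: register a task that runs the monitoring script every 10 minutes.
#Use an account that can reach the monitored sites, since the script relies on
#default credentials for its HTTP requests. /RP * prompts for the password.
schtasks /Create /TN "Check SharePoint Status" /SC MINUTE /MO 10 `
    /RU "DOMAIN\svc_spmonitor" /RP * `
    /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Check-SharePointStatus.ps1"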

#*=============================================
#* Script Name: Check SharePoint Status
#* Author: Nik Craik
#*=============================================
#* Purpose: Monitor SharePoint 2010 for errors
# and availability. If an error or problem
# is detected on a site, send an email to
# administrators.
#*=============================================
 
#Send an email if an error is encountered
function sendMail($site){
    #SMTP server name
    $smtpServer = "mail.yourserver.com"

    #Creating a mail message object
    $msg = new-object Net.Mail.MailMessage

    #Creating the SMTP client object
    $smtp = new-object Net.Mail.SmtpClient($smtpServer)

    #Email structure
    $msg.From = "sharepoint@yourserver.com"
    $msg.ReplyTo = "sharepoint@yourserver.com"
    $msg.To.Add("sharepointadmin@yourserver.com")
    $msg.To.Add("sharepointadmin2@yourserver.com")
    $msg.Priority = [System.Net.Mail.MailPriority]::High
    $msg.Subject = "ALERT: "+$site+" IS UNAVAILABLE"
    $msg.Body = "The SharePoint site collection $site is unavailable. Please have an administrator review the problem."

    #Sending the email
    $smtp.Send($msg)
}

#Try to download the home page of a given site collection; if it fails, send an email
function checkIfSiteUp($url){
    #Create a new web client object with the current credentials
    $webClient = new-object System.Net.WebClient
    $webClient.UseDefaultCredentials = $true

    #Attempt to download the site's home page; if it fails or contains error text, send an email
    try {
        $page = $webClient.DownloadString($url)
        $errorPage = $page.Contains("Troubleshoot issues with Microsoft SharePoint Foundation.")
        $serverError = $page.Contains("Server Error")
        $configDBOffline = $page.Contains("Cannot connect to the configuration database.")
        if ($errorPage -or $serverError -or $configDBOffline) {
            sendMail($url)
        }
    }
    catch {
        sendMail($url)
    }
}

#Array of SharePoint sites to check
$shptWebs = @("http://intranet",
    "http://othersite",
    "http://onemoresite")

#Execute for all SharePoint sites
foreach ($web in $shptWebs)
{
    checkIfSiteUp($web)
}

SharePoint PowerShell Script to Extract All Documents and Their Versions

Hey! Listen: This script doesn't extract documents that suffer from Longurlitis (a URL longer than the SharePoint maximum of 260 characters). So you may also want to run the PowerShell Script To Find and Extract Files From SharePoint That Have A URL Longer Than 260 Characters.

Recently a client asked us to extract all content from a SharePoint site for archival. A CMP file was out of the question, because this had to be a SharePoint-independent solution. PowerShell to the rescue! The script below will extract all documents and their versions, and will export all document library metadata and list data to CSV files.

The DownloadSite function will download all the documents and their versions into folders named after their respective document libraries. Versions will be named [filename]_v[version#].[extension].

The DownloadMetadata function will export each document library's metadata, as well as the site's list data, to CSV files. If you don't need the metadata/lists, just comment out the call to that function at the bottom of the script.

There's also ample commenting in case someone wants to modify/expand upon the script!

# This script will extract all of the documents and their versions from a site. It will also
# download all of the list data and document library metadata as a CSV file.
 
Add-PSSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue
# 
# $destination: Where the files will be downloaded to
# $webUrl: The URL of the website containing the document library for download
# $listUrl: The URL of the document library to download
 
#Where to Download the files to. Sub-folders will be created for the documents and lists, respectively.
$destination = "C:\Export"
 
#The site to extract from. Make sure there is no trailing slash.
$site = "http://yoursitecollection/yoursite"
 
# Function: HTTPDownloadFile
# Description: Downloads a file using webclient
# Variables
# $ServerFileLocation: Where the source file is located on the web
# $DownloadPath: The destination to download to
 
function HTTPDownloadFile($ServerFileLocation, $DownloadPath)
{
	$webclient = New-Object System.Net.WebClient
	$webClient.UseDefaultCredentials = $true
	$webclient.DownloadFile($ServerFileLocation,$DownloadPath)
}
 
function DownloadMetadata($sourceweb, $metadatadestination)
{
	Write-Host "Creating Lists and Metadata"
	$sourceSPweb = Get-SPWeb -Identity $sourceweb
	$metadataFolder = $destination+"\"+$sourceSPweb.Title+" Lists and Metadata"
	$createMetaDataFolder = New-Item $metadataFolder -type directory 
	$metadatadestination = $metadataFolder
 
	foreach($list in $sourceSPweb.Lists)
	{
		Write-Host "Exporting List MetaData: " $list.Title
		$ListItems = $list.Items 
		$Listlocation = $metadatadestination+"\"+$list.Title+".csv"
		$ListItems | Select * | Export-Csv $Listlocation  -Force
	}
}
 
# Function: GetFileVersions
# Description: Downloads all versions of every file in a document library
# Variables
# $WebURL: The URL of the website that contains the document library
# $DocLibURL: The location of the document Library in the site
# $DownloadLocation: The path to download the files to
 
function GetFileVersions($file)
{
	foreach($version in $file.Versions)
	{
		#Add version label to file in format: [Filename]_v[version#].[extension]
		#Split on the last "." so file names containing extra dots keep their full name
		$dotIndex = $file.Name.LastIndexOf(".")
		if ($dotIndex -ge 0)
		{
			$fullname = $file.Name.Substring(0, $dotIndex)
			$fileext = "."+$file.Name.Substring($dotIndex + 1)
		}
		else
		{
			$fullname = $file.Name
			$fileext = ""
		}
		$FullFileName = $fullname+"_v"+$version.VersionLabel+$fileext
 
		#Can't create an SPFile object from historical versions, but CAN download via HTTP
		#Create the full File URL using the Website URL and version's URL
		$fileURL = $webUrl+"/"+$version.Url
 
		#Full Download path including filename
		$DownloadPath = $destinationfolder+"\"+$FullFileName
 
		#Download the file from the version's URL, download to the $DownloadPath location
		HTTPDownloadFile "$fileURL" "$DownloadPath"
	}
}
 
# Function: DownloadDocLib
# Description: Downloads a document library's files; calls GetFileVersions to download their versions.
# Credit 
# Used Varun Malhotra's script to download a document library
# as a starting point: http://blogs.msdn.com/b/varun_malhotra/archive/2012/02/13/10265370.aspx
# Variables
# $folderUrl: The Document Library to Download
# $DownloadPath: The destination to download to
function DownloadDocLib($folderUrl)
{
    $folder = $web.GetFolder($folderUrl)
    foreach ($file in $folder.Files) 
	{
        #Ensure destination directory
		$destinationfolder = $destination + "\" + $folder.Url 
        if (!(Test-Path -path $destinationfolder))
        {
            $dest = New-Item $destinationfolder -type directory 
        }
 
        #Download file
        $binary = $file.OpenBinary()
        $stream = New-Object System.IO.FileStream(($destinationfolder + "\" + $file.Name), [System.IO.FileMode]::Create)
        $writer = New-Object System.IO.BinaryWriter($stream)
        $writer.write($binary)
        $writer.Close()
 
		#Download file versions. If you don't need versions, comment the line below.
		GetFileVersions $file
	}
}
 
# Function: DownloadSite
# Description: Calls DownloadDocLib for each document library (and folder) in a site.
# Variables
# $webUrl: The URL of the site to download all document libraries
function DownloadSite($webUrl)
{
	$web = Get-SPWeb -Identity $webUrl
 
	#Create a folder using the site's name
	$siteFolder = $destination + "\" +$web.Title+" Documents"
	$createSiteFolder = New-Item $siteFolder -type directory 
	$destination = $siteFolder
 
	foreach($list in $web.Lists)
	{
		if($list.BaseType -eq "DocumentLibrary")
		{
			Write-Host "Downloading Document Library: " $list.Title
			$listUrl = $web.Url +"/"+ $list.RootFolder.Url
			#Download root files
			DownloadDocLib $list.RootFolder.Url
			#Download files in folders
			foreach ($folder in $list.Folders) 
			{
    			DownloadDocLib $folder.Url
			}
		}
	}
}
 
#Download Site Documents + Versions
DownloadSite "$site"
 
#Download Site Lists and Document Library Metadata
DownloadMetadata $site $destination