Archive | SharePoint configuration

UPDATED: Gigantic Config DB… why… and how to get it back to normal

It has been pretty quiet around here but I’m trying to get some content up every now and then…

This was an issue I encountered recently with one of my clients. Their configuration database had grown to 43 GB and they couldn’t get it back to a normal size, which according to this TechNet article is defined as small (~1 GB).

The first thing I did to drill down into this issue was to open the Disk Usage by Table report for the database in SQL Server Management Studio. This report showed at first glance that the timer job history table was huge. Usually the timer job history gets cleaned up by a timer job called “Delete Job History”, which by default runs every week.

The next thing was to check the timer job history in Central Administration and look at the last runs of this particular timer job. Here I found the job had been failing for multiple weeks. The error message made it pretty clear: the timer job failed because the transaction log of the config database was full!

The next step was to check the transaction log settings in SQL Server Management Studio, where I found the size of the transaction log was limited. After setting the transaction log growth to unrestricted and running the timer job again, the timer job history table in the Disk Usage by Table report shrank back to a normal size, and the database could be shrunk back down to less than 1 GB.
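If you want to check and kick off that cleanup job from PowerShell rather than from Central Administration, here is a minimal sketch; the display name “Delete Job History” is my assumption, so adjust it if your farm shows the job under a different name.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Assumed display name "Delete Job History" - adjust if your farm uses a different one.
$job = Get-SPTimerJob | Where-Object { $_.DisplayName -eq "Delete Job History" }
$job | Select-Object DisplayName, LastRunTime, Schedule   # when did it last run, and how often?
Start-SPTimerJob -Identity $job                           # run the cleanup on demand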

UPDATE:

A good friend of mine, Paul Grimley, contacted me about this blog post and suggested it would be helpful to update this article with some general guidance on database maintenance for SharePoint, especially on how we got the database size in the scenario described above back to normal.

In general I would recommend this TechNet article on SharePoint database maintenance. It was written for SharePoint 2010 but is still valid for 2013. It also gives specific guidance on how to shrink data files.
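To give an idea of what that shrink step can look like, here is a minimal sketch that runs DBCC SHRINKFILE from PowerShell; the instance name, database name, logical file name and target size are all placeholders, and shrinking should stay a one-off cleanup after the history purge, not a scheduled routine.

Import-Module SQLPS -DisableNameChecking

# Placeholder instance, database and logical file names - adjust to your environment.
# The target size (in MB) is only an example.
Invoke-Sqlcmd -ServerInstance "SQLSERVER\INSTANCE" -Database "SharePoint_Config" `
    -Query "DBCC SHRINKFILE (N'SharePoint_Config', 1024);"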

I would also like to share some articles Paul pointed me to, which look at this topic from a supportability perspective:

Extra kudos to Paul for checking the Delete Job History timer job in his environment, which led us to identify that the schedule of this timer job has changed from weekly in SharePoint 2010 to daily in SharePoint 2013.

Hope this helps…


“Database is up to date, but some sites are not completely upgraded” – and the journey to straighten things out

I faced an interesting issue in the SharePoint environment of one of my customers the other day… obviously the cool stuff never happens in your local environments. :-)

This customer had recently patched their environment; everything was running smoothly and nobody was complaining about anything. Then, by chance, I checked the databasestatus.aspx page in Central Administration and found some databases marked with:

“Database is up to date, but some sites are not completely upgraded”.


So my first idea, because they had an awful lot of databases in this state, was to loop through all the databases and run Upgrade-SPContentDatabase on the ones containing sites that were not fully upgraded.
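The loop itself is short; here is a minimal sketch of the idea (not the exact script I ran), assuming the NeedsUpgradeIncludeChildren property is the right filter for databases with not-fully-upgraded sites.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Upgrade every content database that still reports children needing an upgrade.
Get-SPContentDatabase | Where-Object { $_.NeedsUpgradeIncludeChildren } | ForEach-Object {
    Write-Host "Upgrading" $_.Name
    Upgrade-SPContentDatabase -Identity $_ -Confirm:$false
}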

After running the script I found nothing had changed in the UI (it turns out that after an IIS reset or an app pool recycle this changes quite quickly). So after doing an IIS reset on the Central Administration box I found some databases in the desired “No action required” state, while others were still stuck in the same state.

So I checked upgradestatus.aspx in Central Administration and, in the referenced upgrade logs, found warnings and errors for the databases containing sites that were not upgraded. Most of the warnings stated:

“Feature could not be upgraded. Exception: Unable to access web scoped feature (id: FeatureID) because it references a non-existent or broken web (id: webID) on site “siteurl”. Exception: System.ArgumentException: Value does not fall within the expected range.”

When I checked the mentioned site and searched for the specific web, I found that this web existed in the recycle bin of the site.

So my next step was to create a PowerShell script to loop through all the sites and figure out which ones contained web objects in their recycle bin.

Once I had done that, I created another PowerShell script to remove those webs from the recycle bins of the affected sites (a combined sketch follows below).
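Here is a combined minimal sketch of both steps; it assumes the recycle bin items expose an ItemType of “Web” (SharePoint 2013) and is not the exact script I used.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find deleted webs in each site collection's recycle bin and purge them.
Get-SPSite -Limit All | ForEach-Object {
    $site = $_
    $deletedWebs = @($site.RecycleBin | Where-Object { $_.ItemType -eq "Web" })
    if ($deletedWebs.Count -gt 0) {
        Write-Host $site.Url "-" $deletedWebs.Count "deleted web(s) in the recycle bin"
        $site.RecycleBin.Delete([guid[]]($deletedWebs | ForEach-Object { $_.ID }))
    }
    $site.Dispose()
}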

After that I ran my upgrade script again and found all databases in the “No action required” state except for two.

The first one was easy: this database had been taken offline and therefore couldn’t be upgraded.

For the second one I found errors in the upgrade logs telling me:

Microsoft.SharePoint.Portal.Upgrade.MossSiteSequence failed.

Exception: Attempted to perform an unauthorized operation.

   at Microsoft.SharePoint.SPSite.set_AllowDesigner(Boolean value)
   at Microsoft.SharePoint.Portal.Upgrade.AllowMasterPageEditingAction.Upgrade(SPSite site)
   at Microsoft.Office.Server.Upgrade.SiteAction.Upgrade()
   at Microsoft.SharePoint.Upgrade.SPActionSequence.Upgrade()

This confused me because I knew that my customer had SharePoint Designer disabled for this web application.

After some asking around it turned out this wasn’t always the case; they had it enabled for a short timeframe… So my assumption is that one of their site collection administrators must have enabled SharePoint Designer on his or her site during that window and maybe even modified something with it… scary, huh?

With SharePoint Designer now disabled at the web application level, SharePoint seems to have a little problem performing upgrade operations on such a site.

To overcome this issue I enabled SharePoint Designer for the web application, re-ran my upgrade script, and afterwards disabled SharePoint Designer again at the web application level. After that I had all databases in every SharePoint admin’s favorite state: “No action required”.
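The toggle itself can be scripted as well; a minimal sketch with a placeholder web application URL, using the AllowDesigner property:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$wa = Get-SPWebApplication "http://webapp"   # placeholder URL
$wa.AllowDesigner = $true                    # temporarily allow SharePoint Designer
$wa.Update()

# ... re-run the content database upgrade script here ...

$wa.AllowDesigner = $false                   # lock it down again afterwards
$wa.Update()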

“Upgrade required” and missing SQL aliases – a funny little fact

I encountered an interesting issue today while scaling out an existing SharePoint 2010 farm. After installing SharePoint, several language packs, the service pack and CUs, I successfully joined the server to the farm.

No issues so far.

When I checked the “Servers in Farm” page in Central Administration, I found a big red “Upgrade required” message right next to my newly joined server. But it wasn’t highlighting which component it wanted upgraded; normally that component would also be highlighted in red.

So I double-checked my installation and made sure that I had all components installed properly; I even ran the configuration wizard one more time. Still the same result.

So my next step was to force SharePoint to write some upgrade logs and give me more details on what it was actually missing.

For that I used psconfig from the command line:

psconfig.exe -cmd upgrade -inplace b2b -force -cmd applicationcontent -install -cmd installfeatures

Once it was done I checked Central Administration for the upgrade status and found that the upgrade had failed as expected, but now SharePoint had written a nice upgrade log I could dig into.

After checking the log I found a few errors telling me that SharePoint was trying to upgrade databases located under a certain SQL alias but could not find them.

So I fired up cliconfg and found that the SQL alias did not exist. After adding it (and also adding it in the x64 version of cliconfg) I re-ran the SharePoint configuration wizard, and after checking the “Servers in Farm” section in Central Administration – tataaaaaa – the server was listed with a nice grey “No action required” right next to it.

And what is the key takeaway from this? If you use SQL aliases (which I am a pretty big fan of), make sure they are configured consistently across all your servers, and when you see a warning telling you that an “upgrade is required”, don’t immediately assume you have to install software – also check your configuration.
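A quick way to compare the aliases across servers is to look at the registry keys cliconfg writes to; a minimal sketch checking both the 64-bit and the 32-bit locations:

# SQL aliases created with cliconfg live in these two registry keys (64-bit and 32-bit).
$paths = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo",
         "HKLM:\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo"

foreach ($path in $paths) {
    if (Test-Path $path) {
        Write-Host "`n$path"
        Get-ItemProperty $path   # lists alias name -> protocol,server[,port]
    } else {
        Write-Host "`n$path - no aliases defined"
    }
}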

List Threshold and Search crawl error–The Site cannot be downloaded

After doing a migration from SharePoint 2007 to SharePoint 2010 for one of my clients, I experienced a funny search crawl error. My client had a web part in their SharePoint 2007 environment which displayed the recently changed documents on the site. Everything was working just fine in SharePoint 2007, and the migration to SharePoint 2010 ran successfully as well.

Once we configured a search crawl to index the content, we encountered several issues in the crawl logs telling us that the site couldn’t be downloaded and therefore couldn’t be crawled.

After we ensured that the crawl account had appropriate permissions on the web application, I used a little move I pull from time to time to troubleshoot search crawl issues: I logged in as the crawl account. We then found that when the crawl account accessed the site, a threshold error was thrown, telling us that the web part exceeded the allowed threshold of 5,000 items (the default setting for the web application).

You can adjust the threshold in the web application’s general settings in Central Administration. I would not recommend it, though, because you open the door to severe performance issues in your environment. Rather, I would recommend rethinking the way you query your data from SharePoint to make it more efficient.
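If you just want to see what the current limits are (without changing them), here is a minimal sketch with a placeholder web application URL:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$wa = Get-SPWebApplication "http://webapp"
$wa.MaxItemsPerThrottledOperation            # the list view threshold (default 5000)
$wa.MaxItemsPerThrottledOperationOverride    # the higher limit for auditors and administrators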

My client had been using an account with administrative privileges to verify that everything was working as expected after the migration, but the threshold is not applied to that account. The crawl account only has read permissions on the content and therefore couldn’t crawl it because of the threshold.

After removing the web part from the site(s), crawling worked just fine. The web part will need to be replaced by one that is a little more aware of the threshold in SharePoint 2010. :-)

Stuff2Read: Sandboxed solutions service

This time I was digging a little deeper into the sandboxed solutions topic, especially the configuration of resource points and service tiers.

Here are some interesting reads on those topics:

Resource Usage Limits on Sandboxed Solutions in SharePoint 2010
Configure resource points for sandboxed solutions (SharePoint Server 2010)
Nice overview on sandboxed solutions service tiers
Configure sandboxed solutions service tiers (SharePoint Server 2010)
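For context, the per-site-collection resource quota those articles describe can also be set via PowerShell; a minimal sketch with a placeholder URL and example point values:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# -UserCodeMaximumLevel = daily resource points before sandboxed solutions are blocked
# -UserCodeWarningLevel = resource points at which a warning e-mail is sent
Set-SPSite -Identity "http://webapp/sites/test" -UserCodeMaximumLevel 300 -UserCodeWarningLevel 100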

PowerPoint / Word WebApps–PowerPoint / Word WebApp cannot open this presentation / document

When trying to open a PowerPoint presentation or a Word document stored in SharePoint using the PowerPoint / Word Web Apps, you might face the error from the title: the PowerPoint / Word Web App cannot open the presentation / document.


When I saw this error I felt like there was an error in the Matrix (déjà vu), not just the one in front of me.

Looking over my shoulder for Neo, I checked the event log and found Event ID 3760.


I had an issue like this before (described in this blog article: Excel calculation services – The Workbook cannot be opened).

After applying the script mentioned in that article and performing an IIS reset, the issue was resolved.

The concerns I highlighted in the above-mentioned article are still the same (and, to be honest, have even grown).

So in the future I will have to include all service accounts running my Office Web Apps in the script I use to create new content databases.
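The script itself isn’t reproduced here, but the usual building block for this kind of fix is granting the service account access to the web application’s content databases; a minimal sketch with a placeholder URL and account name, not the exact script from the referenced article:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Placeholder URL and service account - grants the account access to all content
# databases of the web application (a typical fix for this kind of access issue).
$wa = Get-SPWebApplication "http://webapp"
$wa.GrantAccessToProcessIdentity("DOMAIN\svc-officewebapps")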

Good2Know: Merge Terms…What does it do?

Today I played around with the SharePoint 2010 managed metadata term store, especially with the “merge terms” functionality.

What I did:

1. I created a term set “TestTermSet” (I know, I am so creative when it comes to names).

2. I created terms (Term1 – Term4) within the term set.

3. I created a list “TermsTest” and added two managed metadata columns to it.

4. I created two entries in that list (Test1 – Term2, Term3 / Test3 – Term4).

I ran the following PowerShell to get the IDs of the existing terms:

# Requires the SharePoint snap-in (run from the SharePoint Management Shell).
$taxsession = Get-SPTaxonomySession -Site http://yoururl
$termstore = $taxsession.TermStores["SharePoint – MMS"]
$testgroup = $termstore.Groups["Tests"]
$termset = $testgroup.TermSets["TestTermSet"]
foreach ($term in $termset.Terms) {
    Write-Host $term.Name "-" $term.Id
}

 

The output was:

Term1 – 1280d6f8-3db8-42ec-87be-2ab9389e8d13
Term2 – c3cb39a5-5fcb-4040-90c7-005eb34b53f1
Term3 – 01bb9ed6-08e5-43dd-be03-9bbd17a45fd2
Term4 – 8fe913ef-6879-4003-b0e5-82ad9a358dfb

Then I merged Term1 and Term4 (using the UI in Central Administration):

After running the script again I get this result:

Term2 – c3cb39a5-5fcb-4040-90c7-005eb34b53f1
Term3 – 01bb9ed6-08e5-43dd-be03-9bbd17a45fd2
Term4 – 8fe913ef-6879-4003-b0e5-82ad9a358dfb

So Term1 disappeared, Term4 kept its ID, and when you look in the UI you can see that Term1 has been added as a “Label” (synonym) to Term4.
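You can verify the same thing from PowerShell; a minimal sketch reusing the $termset object from the script above:

# Show the labels of the surviving term to confirm that "Term1" is now just a synonym.
$term4 = $termset.Terms["Term4"]
foreach ($label in $term4.Labels) {
    Write-Host $label.Value "- IsDefault:" $label.IsDefaultForLanguage
}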


When you now check the list with the entries made earlier, you will still find Term1 as the value in one of the columns.

When you now create a new entry in your list and type “Term1”, SharePoint will auto-suggest “Term4”.


So from what I can see, the “merge terms” functionality simply adds the merged term’s name as a label to the target term and removes the merged term. And, as so often in SharePoint, values set prior to the merge are not updated afterwards… Interesting.