How to copy SharePoint list items in PowerShell

I had a requirement to split out a SharePoint list that had grown to over 200k items into smaller clone lists. Easy, I thought; there must be hundreds of blogs online giving step-by-step instructions for exactly this, so a quick internet search should tell me how to do it.

So I searched, and I didn't find any blogs that did quite what I wanted. Here is how I achieved it, which I hope helps anybody in the situation I was in;

First I created an empty list template from the master list that had grown so big.

I then made several (in my case 28) lists with an obvious naming convention. (ProductList-Jan13, ProductList-Feb13, etc…)

I made these within the same site; however, they could be as far apart as different site collections, as long as they are within the same farm and can be accessed by the same front-end server (where we will later run the PowerShell to move the list items).

With the new framework created, all that was left was to move the list items. In my case I also created new input forms and site pages for each list, but I won't go into detail on that in this blog post.

To move data between lists there are five main ways: Content and Structure move, Datasheet copy/paste, PowerShell, workflow and import from Excel.

I decided that the best way to move my data was PowerShell;

  • Content and Structure limits you to 1000 items at a time, so for large numbers of items it can be a frustrating and time-consuming method.
  • Datasheet copy and paste is great for small numbers of items, but anything over a few hundred and Internet Explorer usually crashes, and then there's the task of manually working out which items copied and which didn't.
  • Workflows copy data well but you either need to manually kick these off or use PowerShell to start a workflow on all items. I’ve tried running a workflow on all items in a large list before (70k items) and it took 15+ hours before freezing the remaining items on “starting” so I wouldn’t recommend kicking off a workflow on all items in a large list.
  • Excel import can work well, but like some of the other methods I found it doesn't handle large amounts of data well. The reason I didn't use it in this instance was that my lists had lookup, choice and calculated columns; when importing from Excel, SharePoint makes its best guess at what type each column should be but usually falls short, so there's usually a lot of configuration to get the columns right. With 28 lists to create this would have driven me mad, so I ruled out an Excel import.

I'm OK at PowerShell but I'm not great, hence my first port of call is Google when I need to use it. Usually I find a script close enough to what I want that I can make some small changes to get exactly what I need. In this case nothing I found was close enough, so I had to write a script from scratch;

Before I get into the script I want to recommend using "Windows PowerShell ISE"; it makes scripting PowerShell a lot clearer and shows any errors before you run the script. I now run all of my PowerShell through the ISE, even if it's a script I know works, as it also gives start/stop buttons and a window to view what's going on. If you don't have PowerShell ISE on your Programs menu there are plenty of blogs that will tell you the PowerShell commands to run to import and install it.

So on to what you're most likely here for, the script;

The first thing I had to do in PowerShell was load the SharePoint PowerShell snap-in (if I were using the SharePoint Management Shell I wouldn't need to do this, but as I want to use the ISE I have to load the snap-in explicitly). The code to load the SharePoint snap-in is:

if((Get-PSSnapin | Where {$_.Name -eq "Microsoft.SharePoint.PowerShell"}) -eq $null) {
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}

This checks to see if it is already loaded and if not it will load it.

Next I had to load the SharePoint site where the source list is located;

$web = Get-SPWeb -Identity https://www.sharepoint.com/site

Next, to load the source list located in this site;

$lista = $web.Lists["Products"]

Next I need to load the destination list, which is located in the same site (if I were moving between sites or site collections I would have to specify a second SPWeb here, changing the original $web to $weba and making a second $webb);

$listb = $web.Lists["ProductList-Dec13"]

(If I were using a $webb I would need to update the reference in this line to that too)

Next I need to tell the script I want to use the items within the Source list I’ve just referenced;

$items = $lista.Items

Here in the script is where I struggled to find any examples on the internet. In my situation I needed to split the data up into multiple lists, so I needed to specify a condition within my data and only copy the items where that condition is true. This way I can change the destination list and the condition, and keep re-running this script to split my main list into the 28 I created.

To create the condition, as a C# programmer my first thought was that I wanted an "if" statement. In PowerShell we use a "Where-Object" instead, specifying the condition the object (item) needs to satisfy.

Remember to pipe this after the $items = $lista.Items command;

| Where-Object {$_["Product Month"] -eq "December"}
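Put together, that line and the $items line above read like this (the "Product Month"/"December" filter is just my example; swap in whatever column and value you are splitting on):

$items = $lista.Items | Where-Object {$_["Product Month"] -eq "December"}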

Next we need to run through each item in the list;

foreach ($item in $items) {

Now we need to tell the script that each item it runs through will be added as a new item in the destination list;

$newitem = $listb.Items.Add()

Now that the script knows the goal is to add items from the source list to the destination list, we need to tell it which columns we want to include in the item;

$newitem["Column1"] = $item["Column1"]

$newitem["Column2"] = $item["Column2"]

$newitem["Column3"] = $item["Column3"]

In my example the source and destination lists use the same column names, however they do not have to, as long as the column titles on the left are the names in the destination, the column titles on the right are the names in the source, both are spelt correctly with the correct capitalisation, and the column types match (i.e. a number field to a number field will work, but obviously a date and time field to a number field will not). You can add as many lines of this code as you need for columns; mine had 15 columns in total.
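As an example, if the destination column had a different name from the source column (these names are made up purely for illustration), the mapping would simply be:

$newitem["DestinationColumnName"] = $item["SourceColumnName"]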

Next we need to tell the script to run the add item now it has all of the information that we’ve specified;

$newitem.Update() }

Finally, to stick to best practice, we need to dispose of the web that we referenced at the top;

$web.Dispose()

And that's it; running this in PowerShell on your SharePoint front-end server will copy the items, with the chosen columns, from the source list to the destination list. Depending on how many items you are moving and your server spec, this could take a while. If you are using PowerShell ISE all you have to do is paste the code into the window, press the green Run arrow and watch the output window, which will show the dispose line at the end.

I hope this will help others doing the same task or just learning PowerShell; rather than just give the script, I have tried to explain what is happening line by line. The script can be extended to move the items rather than copy them by including a delete statement after the update (see the sketch below). Personally I like to check it has all worked fine before deleting anything, so I deleted in datasheet view manually after running this script. What this script doesn't do, which maybe I will try to add in future, is copy the Created By and Created fields or copy versions. If you know how to include these, please add them to the comments.
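If you do want to turn the copy into a move, a minimal sketch of that change (only run it once you have confirmed the copy works, as it deletes from the source list) is to delete the source item straight after the update inside the loop:

foreach ($item in $items) {
    $newitem = $listb.Items.Add()
    # ... the column assignments from above ...
    $newitem.Update()
    $item.Delete()   # remove the source item once the copy has been written
}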

Here’s the full script;

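(Assembled from the steps above; the site URL, list names, column names and the "Product Month" filter are just this post's examples, so change them to suit your lists.)

if((Get-PSSnapin | Where {$_.Name -eq "Microsoft.SharePoint.PowerShell"}) -eq $null) {
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}

$web = Get-SPWeb -Identity https://www.sharepoint.com/site
$lista = $web.Lists["Products"]
$listb = $web.Lists["ProductList-Dec13"]
$items = $lista.Items | Where-Object {$_["Product Month"] -eq "December"}

foreach ($item in $items) {
    $newitem = $listb.Items.Add()
    $newitem["Column1"] = $item["Column1"]
    $newitem["Column2"] = $item["Column2"]
    $newitem["Column3"] = $item["Column3"]
    $newitem.Update()
}

$web.Dispose()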


Exporting and Importing SharePoint sites.

So I had a task to move multiple SharePoint site collections/farms to subsites within one site collection, so that the company could condense its hardware into one farm/site collection, among other reasons.

Although this is fairly simple, I learnt a few tips from this exercise, which is why I will cover it in full in this blog;

Let's assume that the farm you are exporting from has no custom solutions installed, as things get trickier then; all solutions must be the same on both farms. This is a good reason to always keep copies of anything you install on a SharePoint farm. I have seen companies that purchased third-party solutions and then, a few years later, the vendor goes out of business or is bought out and it's almost impossible to get the original install/WSP files; then, when they want to upgrade or migrate, they face custom solution issues.

From a front-end server on the farm you are exporting from, you have two options: export using either STSADM or PowerShell. If you are running 2007 then you only have one option, and that is STSADM.

As I have been using SharePoint for a long time I will usually go down the STSADM route without thinking, but we all know that PowerShell is a lot more powerful for most commands, so it's usually best to go down the PowerShell route. However, in this case importing and exporting in STSADM and PowerShell run almost the same processes, so it's really personal choice.

If you are going down the STSADM route then open CMD as an administrator and cd to the hive (12/14) bin folder. Once there, build your STSADM command: set the operation as export, specify the URL of the site collection or subsite you want to export, and set the filename and location where you want the exported file to go (remember to make sure that the drive has enough space). I have just described the basic switches required to make an export run, but there are more to cover any constraints you may have. For example, by default the export compresses all of the files into a single compressed file; unless space is an issue I do not like to do this, so I add the switch -nofilecompression on the end, which exports the files to a directory instead. I won't go over all of the switches as that's what the TechNet articles are for.

Here is a sample of how it should look: "STSADM -o export -url http://mysite/subsite -filename c:\backupdir\backup.cmp -nofilecompression".

If you wanted to use PowerShell instead, you would open the SharePoint Management Shell and use "Export-SPWeb http://mysite/subsite -Path c:\backupdir\backup.cmp -NoFileCompression". Notice that in PowerShell we do not have to specify the -URL switch, and the -Filename switch becomes -Path. PowerShell isn't actually case sensitive, but it's good practice to match the documented casing; again, TechNet will give you the correct parameters.

Here is a good place to mention the difference between SPSite and SPWeb: in SharePoint a "Site" is a site collection and a "Web" is a subsite. In PowerShell, if you want to import or export a "Site" then you need to specify it by its ID, not its URL.
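A quick way to see the difference, and to grab that ID, is with Get-SPSite and Get-SPWeb (a small sketch, using the example URLs from above):

$site = Get-SPSite http://mysite            # the site collection
$web = Get-SPWeb http://mysite/subsite      # a subsite (web) within it
$site.ID                                    # the GUID to pass when a cmdlet wants the site collection rather than a URL
$web.Dispose()
$site.Dispose()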

Once you have typed your script into CMD or PowerShell, press Enter and it will create the export file/directory. This could take some time if there is a lot of data on the site.

Once you have your file, move it to your destination farm, onto a front-end server if possible.

*If you have exported from 2007 you cannot then import into 2010; exporting and migration are two completely different things. If you need to upgrade the data then do a farm upgrade before exporting, or if that isn't possible build a temp farm, import into it, upgrade the farm, do an export and then you have a 2010 export file.*

On the front-end server of your destination farm, choose your import method, again either STSADM or PowerShell; as a general rule I would stick with whatever I used to export. 2010 does support STSADM, but it is deprecated and may be discontinued in future versions of SharePoint.

If you are using PowerShell then you first need to go to the site under which you wish to import the file and manually create a subsite, using the same site template that was used for the original.
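If you'd rather not click through the UI, the empty subsite can also be created in PowerShell; a minimal sketch, assuming the original used the standard Team Site template (STS#0):

New-SPWeb http://mysite/newsubsite -Template "STS#0"

You can then run the Import-SPWeb command shown below against that URL.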

If you are using STSADM you can simply specify the URL where you want the site to be created and STSADM will create the site for you; this means that you do not need to work out which site template was used and pre-create the site. This is why I personally prefer to import with STSADM.

Now to type the commands to import the file;

STSADM: "STSADM -o import -url http://mysite/newsubsite -filename c:\backup.cmp -nofilecompression". If you used -nofilecompression on the export you must use it on the import too.

PowerShell: "Import-SPWeb http://mysite/newsubsite -Path c:\backup.cmp -NoFileCompression".

Again there are a few other switches you can use to run the import which you will find in TechNet.

Once you have typed in the command and pressed Enter it will begin importing, and soon you will have your data moved from one farm to another.

SOAP message could not be parsed

Over the last few days I have had a problem publishing an InfoPath form to a SharePoint 2010 list; this took me almost two days' solid work to resolve, so I hope this post saves somebody else the time…

I have a SharePoint list which uses a custom InfoPath form. The list has been used in a production environment for a few months so it has grown to over 30k items.

I had been asked to make a change to the InfoPath form, so I made the change and clicked publish, and got the error "The SOAP message could not be parsed". After some googling I found out this was because of the high number of items in the list. So I created an identical list and, using Content and Structure, moved all of the items out of the first list (this took ages as it only does 1000 at a time). I then made the changes to the InfoPath form and published it, tested it worked (which it did), then moved all of the items back again using Content and Structure (I moved the data this way as I needed to retain the created/modified fields).

After moving all of the data back, I tested the form and it had reverted back to how it was before I started this process! grrr

So I decided to check the logs (I know this should have been my first port of call). I could see in the logs that the SOAP message was really just InfoPath's way of masking a SharePoint error, and that it was actually a timeout. So my next step was to edit the web.config file to increase the HTTP timeout; you can do this by going to the drive where IIS stores its data > Inetpub > wwwroot > wss > VirtualDirectories > "your web app". Edit the web.config file in a text editor like Notepad and look for this line:

<httpRuntime maxRequestLength="2097151" executionTimeout="3600" />

Then increase the executionTimeout. In my case it was already at 3600 (I had already increased this when I built the server), which should have been more than enough, but just in case I added a few zeros, saved the file and reset IIS.

I then clicked in my SharePoint list, clicked customize InfoPath form, made my changes and clicked publish……. I got the SOAP message again.

Back to the drawing board again! I then changed the executionTimeout in the web.config file back to how I found it, as it was clear this had not made a difference.

So I knew it was timing out but I wasn’t sure why, after all, the default timeout setting was not causing the issue.

At this point I went through all of the InfoPath service settings which I won’t go into detail on as they were not the problem.

As this particular list had several calculated fields I decided to delete these and try again; after all, as the InfoPath form publishes it updates every item in the list, so it was logical that the calculated fields would then try to update too and cause a long session. When I went to delete these columns I received an error: "This page has been modified since you last opened it". From previous experience I knew this was a custom branding issue, so I reset my master page back to the out-of-the-box v4 and tried to delete the calculated columns again; this time they deleted fine.

I then clicked on customise form, made the changes to the InfoPath form, clicked publish …. It worked!!

So after all my work it turned out that it was the calculated fields, which in turn had problems due to the custom branding. After testing that the InfoPath input form was working as I wanted, I recreated the calculated fields, tested the form again, then set my custom master page as default and tested the form once more. It all worked!

InfoPath: calculate each row of a repeating table

This was something that bugged me for ages, then it suddenly clicked and I couldn't believe how simple it is. As I didn't find anything on the net to help me work this out, I hope this will help someone.

The scenario is;

I have a field in an InfoPath form which holds a value, and I have a repeating table in the same form which comes from a SharePoint list. This repeating table has a number column which I want to calculate against the value field row by row.

My first mistake was to think that this is possible using a single text field; it is only possible with a calculated value field, as you cannot store unbound data in a repeating table.

Click on the right-hand column in the repeating table and choose "Insert" > "Column to the Right", give it a meaningful header, then click in the new cell and insert a calculated value field.

Right-click the calculated value field and click Properties, then click the "fx" XPath button and insert a field: choose the data source the repeating table comes from and pick the field you wish to calculate on, then do the same to insert the in-form field to calculate with. Once both fields are in the box, put the calculation symbol between them, i.e. * / - +. Click OK to get back to the form and preview it. Now all rows should have a row-by-row calculation against them.
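As an example, assuming the repeating table's number column is called Price and the in-form field is called Quantity (hypothetical names; yours will differ), the formula box ends up reading something like:

Price * Quantity

Because the calculated value sits inside the repeating table, the repeating-table field is evaluated relative to the current row, which is what gives you the row-by-row result.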

 

Export site users to Excel

SharePoint comes with a great feature: its lists and libraries can be exported to Excel (if you then change the data connection settings within Excel to refresh on file open, you can save the workbook and every time you open it, it will be up to date with the list, saving you from repeatedly navigating to the list and pressing Export to Excel).

Unfortunately there is no Export to Excel option on the site People and Groups list, so to recreate this feature you have to create a query file yourself (this is the file that is created when you click Export to Excel in a regular list).

To create a query file, first create a new .txt file, then paste the following code into it, changing the placeholder text (SITE COLLECTION URL, LIST ID, LIST VIEW ID and LIST NAME) to your details;

WEB
1
http://SITE COLLECTION URL/_vti_bin/owssvr.dll?XMLDATA=1&List={LIST ID}&View={LIST VIEW ID}&RowLimit=0&RootFolder=%2fLists%2fLIST NAME
Selection={LIST ID}-{LIST VIEW ID}
EditWebPage=
Formatting=None
PreFormattedTextToColumns=True
ConsecutiveDelimitersAsOne=True
SingleBlockTextImport=False
DisableDateRecognition=False
DisableRedirections=False
SharePointApplication=http://SITE COLLECTION URL/_vti_bin
SharePointListView={LIST VIEW ID}
SharePointListName={LIST ID}
RootFolder=/Lists/LIST NAME

Save the file, then change the file extension to .iqy.

Double-click it and open it with Excel, and it will export your site collection People and Groups list into the Excel workbook.

Here is an example of the code;

WEB
1
https://www.sitecollection.com/_vti_bin/owssvr.dll?XMLDATA=1&List={37fd0da1-ce48-405b-b8ec-d8b364dcaddb}&View={F86875B9-C5A8-4AAE-94E4-9E7D1BB9551F}&RowLimit=0&RootFolder=%2fLists%2fList1
Selection={37fd0da1-ce48-405b-b8ec-d8b364dcaddb}-{F86875B9-C5A8-4AAE-94E4-9E7D1BB9551F}
EditWebPage=
Formatting=None
PreFormattedTextToColumns=True
ConsecutiveDelimitersAsOne=True
SingleBlockTextImport=False
DisableDateRecognition=False
DisableRedirections=False
SharePointApplication=https://www.sharepointcollection.com/_vti_bin
SharePointListView={F86875B9-C5A8-4AAE-94E4-9E7D1BB9551F}
SharePointListName={37fd0da1-ce48-405b-b8ec-d8b364dcaddb}
RootFolder=/Lists/List1

See all accounts in a site collection

A quick and easy one: sometimes you just want to see every user account recognised in a site collection (this doesn't mean they have any permissions at all, just that they are in your Active Directory and have been synced by the User Profile Synchronization service).

All you have to do is add the following to the end of the site collection URL;

/_layouts/people.aspx?MembershipGroupId=0&FilterField1=ContentType&FilterValue1=Person

Here's an example:

http://mysharepointsitecollection.com/_layouts/people.aspx?MembershipGroupId=0&FilterField1=ContentType&FilterValue1=Person

How to find a SharePoint list ID

Navigate to the SharePoint list in your browser. On the ribbon select List Settings, then on the new page copy the URL from the browser address bar into a text editor such as Notepad.

Example:
http://sharepoint2010/_layouts/listedit.aspx?List=%7B3A0740A0%2D56AB%2D4FD3%2DAFD0%2D5631C7565208%7D

• Delete everything before and including "List=".
• Change "%7B" to "{".
• Change all "%2D" to "-".
• Change "%7D" to "}".

Now you have the List (Guid) ID:
{3A0740A0-56AB-4FD3-AFD0-5631C7565208}
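If you have access to a front-end server, a quicker alternative is to read the ID straight out of PowerShell; a small sketch, assuming a list called Products on the example site above:

$web = Get-SPWeb http://sharepoint2010
$web.Lists["Products"].ID
$web.Dispose()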