Upload SharePoint data to Office 365

This is a step-by-step guide to using the SharePoint Online Migration PowerShell cmdlets to migrate content from an on-premises file share to Office 365.

The SharePoint Online Migration PowerShell cmdlets are designed to move on-premises content from file shares. They require minimal CSOM calls and leverage temporary Azure Blob storage to scale to large content migrations.

System Requirements

  • Supported operating systems: Windows 7 Service Pack 1, Windows 8, Windows Server 2008 R2 SP1, Windows Server 2008 Service Pack 2, Windows Server 2012, Windows Server 2012 R2

  • Windows PowerShell 4.0


Note: You must be a site collection administrator on the site you are targeting.

Getting Started with SharePoint Online Migration

Step 1: Install SharePoint Online Management Shell

  1. Uninstall all previous versions of the SharePoint Online Management Shell.

  2. Install from here: SharePoint Online Management Shell.

  3. Right-click SharePoint Online Management Shell and select Run as administrator.
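To confirm the installation, you can check that PowerShell sees the module (an optional sanity check; Microsoft.Online.SharePoint.PowerShell is the module name the Management Shell registers):

# Verify the SharePoint Online module is installed and show its version
Get-Module -ListAvailable -Name Microsoft.Online.SharePoint.PowerShell | Select-Object Name, Version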

Step 2: Set up your working directory

Create two empty folders before you start the migration process. These folders do not require a lot of disk space, as they will contain only XML. A minimal example of creating them follows this list.

  1. Create a Temporary package folder.

  2. Create a Final package folder.
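Both folders can be created from the shell in one go (a sketch; the paths match the variables used in the steps below):

# Create the Temporary and Final package folders used throughout this guide
New-Item -ItemType Directory -Path 'C:\migration\CharlesDocumentsPackage_source'
New-Item -ItemType Directory -Path 'C:\migration\CharlesDocumentsPackage_target'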

Using the SharePoint Online (SPO) Migration PowerShell commands

Step 1: Determine your locations and credentials

$creds = (Get-Credential admin@contoso.com)
$sourceFiles = '\\fileshare\users\charles'
$sourcePackage = 'C:\migration\CharlesDocumentsPackage_source'
$targetPackage = 'C:\migration\CharlesDocumentsPackage_target'
$targetWeb = 'https://contoso-my.sharepoint.com/personal/charles_contoso_com'
$targetDocLib = 'Documents'

Step 2: Create a new content package

After you determine your locations and credentials, the next step is to create a new migration package. A content package can be created either from a file share or from a SharePoint Server site; this guide covers the file share method.

Create a new content package from an on-premises fileshare

To create a content package from a file share, use the New-SPOMigrationPackage command. It reads the list of content at the source path and generates the XML needed to perform the migration.

The following parameters are required unless marked optional:

  • SourceFilesPath: points to the content you want to migrate

  • OutputPackagePath: points to your Temporary package folder

  • TargetWebUrl: points to your destination web

  • TargetDocumentLibraryPath: points to the document library inside the web

  • IgnoreHidden: skips hidden files (optional)

  • ReplaceInvalidCharacters: fixes invalid characters where possible (optional)


# Create a new package from the file share, ignoring hidden files and replacing unsupported characters in file/folder names
New-SPOMigrationPackage -SourceFilesPath $sourceFiles -OutputPackagePath $sourcePackage -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetDocLib -IgnoreHidden -ReplaceInvalidCharacters
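Once the command completes, the Temporary package folder holds the generated manifest files. Listing them is a quick, optional sanity check:

# Inspect the manifest XML files generated in the Temporary package folder
Get-ChildItem $sourcePackage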

Step 3: Convert the content package for your target site

After you have created the content package, the ConvertTo-SPOMigrationTargetedPackage command converts the XML generated in your Temporary package folder. It saves a new set of targeted migration package metadata files to the target directory. This is the final package.

Note: Your target site collection administrator credentials are used to connect to the target site collection and gather data for the conversion.

There are six required parameters to enter (others are optional):

  • TargetWebUrl: points to the destination web

  • SourceFilesPath: points to the content you want to migrate

  • SourcePackagePath: points to your Temporary package folder

  • OutputPackagePath: points to your Final package folder

  • TargetDocumentLibraryPath: the path to your destination library

  • Credentials: an SPO credential that has site collection administrator rights on the destination site


# Convert the package to a targeted one by looking up data in the target site collection, with -ParallelImport turned on to boost file share migration performance
$finalPackages = ConvertTo-SPOMigrationTargetedPackage -ParallelImport -SourceFilesPath $sourceFiles -SourcePackagePath $sourcePackage -OutputPackagePath $targetPackage -Credentials $creds -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetDocLib
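When -ParallelImport is used, the command returns one package object per split. Each object exposes the directories that step 4 consumes (the property names below are the ones used in Example 2 of step 4):

# Inspect the split packages returned by ConvertTo-SPOMigrationTargetedPackage
$finalPackages | Select-Object PackageDirectory, FilesDirectory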

Step 4: Submit content to import

The final step is to use the Invoke-SPOMigrationEncryptUploadSubmit command to create a new migration job in the target site collection; it returns a GUID representing the JobID. The command uploads encrypted source files and manifests into temporary Azure Blob storage for each job.

There are four required parameters to enter (others are optional):

  • TargetWebUrl: points to the destination web

  • SourceFilesPath: points to the files to import

  • SourcePackagePath: points to the final manifest of the files to import

  • Credentials: an SPO credential that has site collection administrator rights on the destination site

Example 1:

# Submit package data to create new migration job

Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $sourceFiles -SourcePackagePath $targetPackage -Credentials $creds -TargetWebUrl $targetWeb

Example 2:

# Submit package data to create new migration jobs for parallel import

$jobs = $finalPackages | % {Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $_.FilesDirectory.FullName -SourcePackagePath $_.PackageDirectory.FullName -Credentials $creds -TargetWebUrl $targetWeb}

Processing and Monitoring your SPO Migration

After the job is submitted, only Azure and SPO interact to fetch and migrate the content into the destination. The process is timer-job based: jobs are queued and served on a first-come, first-served basis, though this does not prevent the same person from queuing additional jobs. Expect a potential delay of about one minute before a job starts, even when no other jobs are running.

Checking job status

You can check the status of your job by viewing the real-time updates posted to the Azure storage account queue, decrypting them with the Encryption.EncryptionKey returned in step 4.
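As a hedged sketch of what that looks like: each job object returned in step 4 exposes a ReportingQueueUri (a SAS-scoped queue URI) that can be peeked with the legacy Azure storage client. The assembly path below is hypothetical, and the messages stay encrypted until you apply the key:

# A minimal sketch, assuming the Microsoft.WindowsAzure.Storage assembly is available
# (the DLL path is hypothetical; adjust it to your installation)
Add-Type -Path 'C:\Tools\Microsoft.WindowsAzure.Storage.dll'
$job = $jobs[0]   # a job object returned by Invoke-SPOMigrationEncryptUploadSubmit
$queue = New-Object Microsoft.WindowsAzure.Storage.Queue.CloudQueue($job.ReportingQueueUri)
$message = $queue.PeekMessage()
$message.AsString   # encrypted payload; decrypt with $job.Encryption.EncryptionKey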

Viewing logs

If you're using your own Azure storage account, you can look in the manifest container in Azure Storage for logs of everything that happened. At this stage, it is safe to delete those containers if you don't want to keep them as a backup in Azure. If there were errors or warnings, .err and .wrn files will be created in the manifest container.

If you're using the temporary Azure storage created by Invoke-SPOMigrationEncryptUploadSubmit in step 4, the import log SAS URL can be obtained by decrypting the Azure queue message whose "Event" value is "JobLogFileCreate". With the import log SAS URL, you can download the log file and decrypt it with the same encryption key returned in step 4.
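As a rough illustration of the decryption step, the sketch below assumes $cipherBytes holds the downloaded, still-encrypted log bytes, $iv holds the initialization vector delivered alongside the message (the exact envelope format is not documented here), and $job.Encryption.EncryptionKey is the key returned in step 4:

# A hedged sketch: decrypt an AES-CBC payload with the key from step 4
$aes = [System.Security.Cryptography.Aes]::Create()
$aes.Key = $job.Encryption.EncryptionKey   # key bytes returned in step 4
$aes.IV = $iv                              # IV accompanying the message (assumption)
$decryptor = $aes.CreateDecryptor()
$plainBytes = $decryptor.TransformFinalBlock($cipherBytes, 0, $cipherBytes.Length)
[System.Text.Encoding]::UTF8.GetString($plainBytes)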

Scripting Scenarios for Reuse

If you're using your own Azure storage account, use Set-SPOMigrationPackageAzureSource and Submit-SPOMigrationJob to upload content into your storage, as sketched below. The full script that follows uses the temporary-storage flow from the steps above.
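A minimal sketch of that flow, assuming $azureAccountName and $azureAccountKey hold your own storage account name and key (both are placeholders), with the other variables as defined in the script below:

# Upload the package to your own Azure storage account, then submit the migration job
$azureLocations = Set-SPOMigrationPackageAzureSource -SourceFilesPath $sourceFilePath -SourcePackagePath $spoPackagePath -AccountName $azureAccountName -AccountKey $azureAccountKey
Submit-SPOMigrationJob -TargetWebUrl $targetWebUrl -MigrationPackageAzureLocations $azureLocations -Credentials $cred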

$userName = "admin@contoso.onmicrosoft.com"
$adminSite = "https://contoso-admin.sharepoint.com"
$sourceFilePath = "D:\data\documents\"
$packagePath = "d:\data\documentPackage"
$targetWebUrl = "https://contoso.sharepoint.com/sites/finance"
$spoPackagePath = "d:\data\documentPackageForSPO"
$targetLibrary = "Documents"
$cred = Get-Credential $userName

Connect-SPOService -Url $adminSite -Credential $cred
# Create a new package from the file share
New-SPOMigrationPackage -SourceFilesPath $sourceFilePath -OutputPackagePath $packagePath -TargetWebUrl $targetWebUrl -TargetDocumentLibraryPath $targetLibrary -IgnoreHidden -ReplaceInvalidCharacters
# Convert the package to a targeted one by looking up data in the target site collection
ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath $sourceFilePath -SourcePackagePath $packagePath -OutputPackagePath $spoPackagePath -TargetWebUrl $targetWebUrl -TargetDocumentLibraryPath $targetLibrary -Credentials $cred
# Submit package data to create a new migration job
$jobs = Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $sourceFilePath -SourcePackagePath $spoPackagePath -Credentials $cred -TargetWebUrl $targetWebUrl

# Look at the returned information of a job
$job = $jobs[0]
# To obtain the reporting queue URI: $job.ReportingQueueUri.AbsoluteUri
# To obtain the encryption key: $job.Encryption.EncryptionKey

Important: All messages are encrypted in the queue. If you want to read from the ReportingQueue, you must have the EncryptionKey.


Best Practices and Limitations

  • Package size: Use the -ParallelImport switch for file share migrations; it automatically splits a large package into smaller ones.

  • File size: 2 GB maximum per file.

  • Target site: The target site should remain inaccessible to users until the migration is complete.

  • SharePoint Online limits: See SharePoint Online and OneDrive for Business: software boundaries and limits.

Azure Limits

  • Capacity per storage account: 500 TB

  • Maximum size of a single blob container, table, or queue: 500 TB

  • Maximum number of blob containers, blobs, file shares, tables, queues, entities, or messages per storage account: limited only by the 500 TB storage account capacity

  • Target throughput for a single blob: up to 60 MB per second, or up to 500 requests per second

See Also

Use Windows PowerShell cmdlets for SharePoint Online and OneDrive Migration
