Use case: There’s a set of files/scripts/templates that I want to keep in sync on a set of servers, but only on-demand.
There are a few different ways to solve this, but one approach following a pattern I’ve used a few times is to have an Azure DevOps pipeline that populates an Azure File Share, and then a separate script deployed on the servers that can pull in files from the File Share on demand.
The script below is a YAML pipeline for Azure DevOps that uses an AzurePowerShell task.
The primary issue I had to work around (at least using the Azure PowerShell module) is that the cmdlet “Set-AzStorageFileContent” requires the parent directory to exist; it won’t auto-create it. And unfortunately “New-AzStorageDirectory” has the same limitation: it doesn’t create directories recursively.
So the PowerShell script below has two sections: first it creates all the folders, by ensuring every level in the path of each distinct folder gets created, and then it uploads the files.
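The core of that workaround is expanding a relative path into every cumulative prefix, so each directory level can be created in order. A minimal sketch of just that string logic (using a made-up sample path; the Azure call it would drive is shown as a comment):

```powershell
# Hypothetical sample path, relative to the share root
$relativePath = 'dbscripts/2024/april'

$leafs = $relativePath.Split('/')
foreach ($index in 0..($leafs.Count - 1)) {
    # Join the first $index+1 segments back together
    $prefix = [string]::Join('/', $leafs[0..$index])
    Write-Output $prefix
    # In the real pipeline, each prefix is then created idempotently:
    # $s.ShareClient.GetDirectoryClient($prefix).CreateIfNotExists()
}
# Emits: dbscripts, then dbscripts/2024, then dbscripts/2024/april
```

Because CreateIfNotExists() is idempotent, it’s safe to call it for every prefix on every run without checking first.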
variables:
  storageAccountName: "stg123"
  resourcegroupName: "teststorage-rg"
  fileShareName: "firstfileshare"

trigger:
  branches:
    include:
      - main
  paths:
    include:
      # Only trigger the pipeline on this path in the git repo
      - 'FileTemplates/*'

pool:
  vmImage: 'windows-latest'

steps:
  - task: AzurePowerShell@5
    displayName: "Azure Files Storage Copy"
    inputs:
      azureSubscription: 'AzureSubConnection'  # This is the devops service connection name
      ErrorActionPreference: 'Stop'
      FailOnStandardError: true
      ScriptType: 'InlineScript'
      azurePowerShellVersion: 'LatestVersion'
      Inline: |
        $accountKey = (Get-AzStorageAccountKey -ResourceGroupName $(resourcegroupName) -Name $(storageAccountName))[0].Value
        $ctx = New-AzStorageContext -StorageAccountName $(storageAccountName) -StorageAccountKey $accountKey
        $s = Get-AzStorageShare $(fileShareName) -Context $ctx

        # We only want to copy a subset of files in the repo, so we'll set our script location to that path
        Set-Location "$(Build.SourcesDirectory)\FileTemplates"
        $CurrentFolder = (Get-Item .).FullName
        $files = Get-ChildItem -Recurse | Where-Object { $_.GetType().Name -eq "FileInfo" }

        # Get all the unique folders, without filenames
        $folders = $files.FullName.Substring($CurrentFolder.Length + 1).Replace("\", "/") |
            Split-Path -Parent | Get-Unique

        # Create folders for every level of every path
        foreach ($folder in $folders) {
            if ($folder -ne "") {
                # Prepend a top-level folder to each path to organize within the Azure share
                $folderpath = ("dbscripts\" + $folder).Replace("\", "/")
                $foldersPathLeafs = $folderpath.Split("/")
                if ($foldersPathLeafs.Count -gt 1) {
                    foreach ($index in 0..($foldersPathLeafs.Count - 1)) {
                        $desiredfolderpath = [string]::Join("/", $foldersPathLeafs[0..$index])
                        try {
                            $s.ShareClient.GetDirectoryClient("$desiredfolderpath").CreateIfNotExists()
                        }
                        catch {
                            $message = $_
                            Write-Warning "That didn't work: $message"
                        }
                    }
                }
            }
        }

        # Upload each file
        foreach ($file in $files) {
            $path = $file.FullName.Substring($CurrentFolder.Length + 1).Replace("\", "/")
            # Prepend the same top-level folder so files land in the directories created above
            $path = "dbscripts/" + $path
            Write-Output "Writing: $($file.FullName)"
            try {
                Set-AzStorageFileContent -Share $s.CloudFileShare -Source $file.FullName -Path $path -Force
            }
            catch {
                $message = $_
                Write-Warning "That didn't work: $message"
            }
        }
Just wanted to say a big thank you for sharing this. I’ve used it as inspiration for a task in our ADO pipeline to copy the build output to an Azure File Share.
One “issue” I found was that because our share is backed up, it has snapshots, so I got a lot of errors. The fix (which I suspect could be included even when not using snapshots) is to ensure the share ($s) is not a snapshot. So change:
$s = Get-AzStorageShare $(fileShareName) -Context $ctx
to:
$s = Get-AzStorageShare $(fileShareName) -Context $ctx | Where-Object { $_.IsSnapshot -eq $false}