Our IT world is moving more and more to the cloud, and it is easy to forget how useful a local backup can be. In all standard scenarios the cloud provider takes care of backups, provides high availability and so on. However, there are situations when a local copy is worth its weight in gold. Can’t imagine one? Let me give you an example.
Your team uses repositories in Azure DevOps. It’s a small team and the setup doesn’t use AD groups; instead, access is granted directly to user accounts. Because of an unexpected event, the users with direct access are unavailable. At the same time, there is a need for urgent patching.
Another example: due to a hardware failure, the network connection is down for a few hours and there is no access to the code repository. Yes, there are still places in the world, even in Europe, where the Internet connection can be disrupted for a few hours. And yes, there are offices there – believe me. You don’t have a clone of the repository on your PC and you need it urgently…
Anyway, whatever the reason, it’s sometimes useful to have a local copy. I’ll describe how to achieve it with a simple script that downloads files from a repository in Azure DevOps to local storage.
In this example we will use simple REST queries. First of all we need to authorize. Create a personal access token for this purpose. In Azure DevOps click on ‘User Settings’ -> ‘Personal access token’.
Add a ‘New Token’ with a name and expiration time of your choice. To download files it’s enough to give it ‘Read’ access in the ‘Code’ section.
As the warning says, remember to copy the token to a safe place; you will not be able to read it in Azure DevOps again.
Once you have the token, I strongly recommend storing it in a secure way (in a key vault, a local credential manager, encrypted with the secure string mechanism in an AD domain, etc.). The way you choose depends on your needs and environment. In this example we assume that it’s already stored as a variable of the SecureString type called $tokenInSecureString.
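If you don’t have the token as a SecureString yet, one simple way is to prompt for it interactively (just a sketch; in a real setup you would rather read it from a key vault or a credential manager):
# Prompt for the PAT; the value is kept only as a SecureString in memory
$tokenInSecureString = Read-Host -Prompt "Enter your Azure DevOps PAT" -AsSecureString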
Authentication for the REST API is done in a header, so we need the token in clear text. To get it, use the SecureString as the password of a dummy PSCredential object. Later we will call the GetNetworkCredential method on it.
$dummyCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList 'fakeUser', $tokenInSecureString
Now we can create the authorization header. To do it, convert the clear text token into a Base64 string. You can use the .NET Convert and Text.Encoding classes:
$base64AuthString = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f "", $dummyCredential.GetNetworkCredential().Password)))
$header = @{
    "Authorization" = ("Basic {0}" -f $base64AuthString)
}
That’s it! Now we can execute any REST API query that the token has access to, for example read all the repositories. The easiest way to do it is to use the Invoke-RestMethod cmdlet with our authorization header passed as an argument. First, assign values to the variables you will use going forward. You need the names of your Azure DevOps organization and project (you can pass them to the script as parameters):
$organization = "{ORGANIZATION NAME}"
$project = "{PROJECT NAME}"
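If you prefer to pass them as parameters of a script that wraps these steps, a minimal param block could look like this (a sketch; the parameter names are illustrative):
param(
    [Parameter(Mandatory)] [string] $organization,
    [Parameter(Mandatory)] [string] $project
)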
Now we can use them to build the query:
$urlRepos = "https://dev.azure.com/$organization/$project/_apis/git/repositories?api-version=1.0"
$resultRepos = Invoke-RestMethod -Uri $urlRepos -Method Get -ContentType "application/text" -Headers $header
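The response is a wrapper object whose value property holds the repositories, so a quick (optional) way to inspect what came back is:
# List the names of the returned repositories
$resultRepos.value | Select-Object -ExpandProperty name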
Now let’s find a query to download a file. The most suitable one seems to be:
$urlDownloadItem = "https://dev.azure.com/$organization/$project/_apis/git/repositories/{REPOSITORY NAME}/items?path={FILE PATH}&download=true&api-version=5.0"
You can download a file returned by this query using Invoke-RestMethod; just add the -OutFile parameter:
Invoke-RestMethod -Uri $urlDownloadItem -Method Get -ContentType "application/text" -Headers $header -OutFile "{YOUR LOCAL PATH}"
But how can we find all the relevant paths programmatically? Here we can use another query:
$urlDownloadItems = "https://dev.azure.com/$organization/$project/_apis/git/repositories/$($resultRepos.value[0].name)/items?recursionLevel=Full&api-version=5.0"
$itemPaths = Invoke-RestMethod -Uri $urlDownloadItems -Method Get -ContentType "application/text" -Headers $header
This one returns the paths of all folders and files in the first repository returned by one of the previous queries (the one retrieving all repository names). You can easily use this query in a loop to cover all available repositories, as sketched below.
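For example, a minimal sketch of fetching the item list of every repository (reusing $resultRepos, $organization, $project and $header from the previous steps):
foreach ($repo in $resultRepos.value) {
    # build the items query for this repository and get the full, recursive item list
    $urlItems = "https://dev.azure.com/$organization/$project/_apis/git/repositories/$($repo.name)/items?recursionLevel=Full&api-version=5.0"
    $itemPaths = Invoke-RestMethod -Uri $urlItems -Method Get -ContentType "application/text" -Headers $header
    # ... process $itemPaths.value for this repository, as shown below ...
}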
With the full recursion level, you will simply get all files. Then you can create a loop iterating through the result, creating missing folders and downloading all files. Whether an item is a folder or a file is kept in the returned result; you can read it from the value.isFolder property.
foreach ($i in $itemPaths.value) {
    # skip the root folder
    if ($i.path -like "/") {
        continue
    }
    if ($i.isFolder -like "True") {
        # for folders, create them locally
        New-Item -ItemType Directory -Path "{LOCAL PATH}$($i.path)" | Out-Null
    }
    else {
        # for files, download them
        $urlDownloadItem = "https://dev.azure.com/$organization/$project/_apis/git/repositories/$($resultRepos.value[0].name)/items?path=$($i.path)&download=true&api-version=5.0"
        Invoke-RestMethod -Uri $urlDownloadItem -Method Get -ContentType "application/text" -Headers $header -OutFile "{LOCAL PATH}$($i.path)"
    }
}
The Out-Null cmdlet is used to keep the console output clean. You can remove it if you like. That’s it.
Written by: Wiktor Mrówczyński
This is an excellent example of using the REST API to interact with Azure DevOps, but wouldn’t it be more practical to use native Git tooling to clone and sync your repositories hosted in ADO with a local copy? This way, you would have all the files, branches and history along with your source code. I believe that’s the whole point of using distributed version control systems.
Thank you for your good comment. Using native Git tooling will do the job as well, and in some cases even better, as you wrote. The article presents one simple way to interact with Azure DevOps from PowerShell that can be extended to other purposes, which might be useful in some cases as well.
Another thing is that if you have your credentials stored in a PSCredential object, you can skip the Base64 encoding part and the construction of the headers. You can use the version of Invoke-RestMethod that takes the -Credential parameter together with -Authentication Basic. So you would call the REST API like this:
Invoke-RestMethod -Uri $urlDownloadItems -Method Get -ContentType "application/text" -Credential $dummyCredential -Authentication Basic