We've been working lately to use HashiCorp Packer to standardize and automate our VM template builds, and we found a need to pull in all of the contents of a specific directory on an internal web server. This would be pretty simple for Linux systems using wget -r, but we needed to find another solution for our Windows builds.
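For reference, the Linux-side approach would look something like the following wget invocation (the URL matches the example server used in the script below, and the exact --cut-dirs depth depends on how many path segments sit above the target directory):

```shell
# Recursively fetch everything under /stuff/files/ into the current directory.
# --no-parent keeps wget from ascending above the target directory,
# --no-host-directories and --cut-dirs=2 drop the hostname and the
# stuff/files/ prefix from the saved paths, and --reject skips the
# auto-generated directory index pages.
wget --recursive --no-parent --no-host-directories --cut-dirs=2 \
  --reject "index.html*" https://win01.lab.bowdre.net/stuff/files/
```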
A coworker and I cobbled together a quick PowerShell solution that downloads the files listed at a specified web URL into a designated directory (without recreating the nested folder structure):
$outputdir = 'C:\Scripts\Download\'
$url = 'https://win01.lab.bowdre.net/stuff/files/'

# enable TLS 1.2 and TLS 1.1 protocols
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12, [Net.SecurityProtocolType]::Tls11

$WebResponse = Invoke-WebRequest -Uri $url
# get the list of links, skip the first one ("[To Parent Directory]") and download the files
$WebResponse.Links | Select-Object -ExpandProperty href -Skip 1 | ForEach-Object {
    $fileName = $_.ToString().Split('/')[-1]                    # 'filename.ext'
    $filePath = Join-Path -Path $outputdir -ChildPath $fileName # 'C:\Scripts\Download\filename.ext'
    $baseUrl = $url.Split('/')                                  # 'https:', '', 'win01.lab.bowdre.net', 'stuff', 'files', ''
    $baseUrl = $baseUrl[0,2] -join '//'                         # 'https://win01.lab.bowdre.net'
    $fileUrl = '{0}{1}' -f $baseUrl.TrimEnd('/'), $_            # 'https://win01.lab.bowdre.net/stuff/files/filename.ext'
    Invoke-WebRequest -Uri $fileUrl -OutFile $filePath
}
The latest version of this script can be found on GitHub.