I am using a script to call URLs, get the output, and save the output to a text file. It all works great. However, when I need to call multiple URLs in succession it causes CPU spikes because it is starting/stopping powershell.exe
so many times. Is there a way I could use a foreach technique for each URL while still saving the output? Here is my script:
$content = Get-Content $PSScriptRoot\urls.txt
echo "Testing for $content"
(Invoke-WebRequest -Uri "$content").Content |
Out-File -FilePath "$PSScriptRoot\out.txt"
$status = Get-Content $PSScriptRoot\out.txt
Note that the echo statements are just for debugging purposes; they don't really matter.
Is this something like what you wanted to do? I wrote a urls.txt with 7-8 URLs, and it worked fine for me.
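Here is a minimal sketch of that foreach approach, assuming urls.txt has one URL per line and that appending every response to a single out.txt is acceptable (the append behavior and the clean-up of the old file are my assumptions, not part of your original script):

# Read every URL (one per line) from urls.txt next to the script
$urls = Get-Content "$PSScriptRoot\urls.txt"

# Remove any previous output so each run starts with a fresh file (assumption)
Remove-Item "$PSScriptRoot\out.txt" -ErrorAction SilentlyContinue

foreach ($url in $urls) {
    Write-Output "Testing for $url"
    # Fetch the page and append its content to the same output file
    (Invoke-WebRequest -Uri $url).Content |
        Out-File -FilePath "$PSScriptRoot\out.txt" -Append
}

$status = Get-Content "$PSScriptRoot\out.txt"

Because the whole loop runs inside one PowerShell process, you avoid the per-URL start/stop of powershell.exe that was causing the CPU spikes.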