Question:
I'm using xcopy in a Windows XP script to recursively copy a directory. I keep getting an 'Insufficient Memory' error, which I understand occurs because a file I'm trying to copy has too long a path. I can easily reduce the path length, but unfortunately I can't work out which files are violating the path length restriction. The files that are copied are printed to standard output (which I'm redirecting to a log file), but the error message is printed to the terminal, so I can't even work out approximately which directory the error is being raised for.
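For context, the failing setup is roughly this kind of call (the paths here are hypothetical, not taken from the question):
rem Recursive copy; standard output goes to the log, but errors still hit the terminal.
xcopy "C:\Source" "D:\Backup" /s /e > copy.log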
Answer 1:
Do a dir /s /b > out.txt
and then add a guide at column 260 in your editor.
In PowerShell:
cmd /c dir /s /b |? {$_.length -gt 260}
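Spelled out more fully, the same idea can save the offending paths to a file (a sketch; the file name long-paths.txt and the 260-character threshold are just placeholders):
# Recursively list every path, keep only lines longer than 260 characters, and save them.
cmd /c dir /s /b |
    Where-Object { $_.Length -gt 260 } |
    Out-File long-paths.txt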
Answer 2:
I created the Path Length Checker tool for this purpose, which is a nice, free GUI app that you can use to see the path lengths of all files and directories in a given directory.
I've also written and blogged about a simple PowerShell script for getting file and directory lengths. It will output the length and path to a file, and optionally write them to the console as well. It doesn't limit the output to files over a certain length (an easy modification to make), but it displays them in descending order by length, so it's still very easy to see which paths are over your threshold. Here it is:
$pathToScan = "C:\Some Folder" # The path to scan and get the lengths for (sub-directories will be scanned as well).
$outputFilePath = "C:\temp\PathLengths.txt" # This must be a file in a directory that exists and does not require admin rights to write to.
$writeToConsoleAsWell = $true # Writing to the console will be much slower.
# Open a new file stream (nice and fast) and write all the paths and their lengths to it.
$outputFileDirectory = Split-Path $outputFilePath -Parent
if (!(Test-Path $outputFileDirectory)) { New-Item $outputFileDirectory -ItemType Directory }
$stream = New-Object System.IO.StreamWriter($outputFilePath, $false)
Get-ChildItem -Path $pathToScan -Recurse -Force |
    Select-Object -Property FullName, @{Name="FullNameLength";Expression={($_.FullName.Length)}} |
    Sort-Object -Property FullNameLength -Descending |
    ForEach-Object {
$filePath = $_.FullName
$length = $_.FullNameLength
$string = "$length : $filePath"
# Write to the Console.
if ($writeToConsoleAsWell) { Write-Host $string }
# Write to the file.
$stream.WriteLine($string)
}
$stream.Close()
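For reference, the "easy modification" mentioned above could look something like this (a sketch, reusing the $pathToScan variable and assuming a 260-character threshold):
# Same scan, but only keep paths whose full name exceeds 260 characters.
Get-ChildItem -Path $pathToScan -Recurse -Force |
    Where-Object { $_.FullName.Length -gt 260 } |
    Sort-Object { $_.FullName.Length } -Descending |
    ForEach-Object { "$($_.FullName.Length) : $($_.FullName)" }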
Answer 3:
As a refinement of the simplest solution, and if you can't or don't want to install PowerShell, just run:
dir /s /b | sort /r /+261 > out.txt
or (faster):
dir /s /b | sort /r /+261 /o out.txt
Lines longer than 260 characters will end up at the top of the listing. Note that you must add 1 to the SORT column parameter (/+n): to catch paths longer than 260 characters, sort on column 261.
Answer 4:
You can redirect stderr.
More explanation here, but a command like:
MyCommand >log.txt 2>errors.txt
should grab the data you are looking for.
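Applied to the xcopy call from the question, that might look like this (paths are hypothetical):
xcopy "C:\Source" "D:\Backup" /s /e > copy.log 2> errors.log
errors.log then captures whatever xcopy writes to stderr, so the error output is at least kept alongside the copy log.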
Also, as a trick, Windows bypasses that limitation if the path is prefixed with \\?\ (see MSDN).
Another trick: if your source or destination starts with a long path, perhaps SUBST will help:
SUBST Q: "C:\Documents and Settings\MyLoginName\My Documents\MyStuffToBeCopied"
Xcopy Q:\ "d:\Where it needs to go" /s /e
SUBST Q: /D
Answer 5:
From http://www.powershellmagazine.com/2012/07/24/jaap-brassers-favorite-powershell-tips-and-tricks/:
Get-ChildItem -Force -Recurse -ErrorAction SilentlyContinue -ErrorVariable AccessDenied
The first part just iterates through this folder and its sub-folders; using -ErrorVariable AccessDenied means the offending items are pushed into the PowerShell variable AccessDenied.
You can then scan through the variable like so:
$AccessDenied |
Where-Object { $_.Exception -match "must be less than 260 characters" } |
ForEach-Object { $_.TargetObject }
If you don't care about these files (which may be the case in some situations), simply drop the -ErrorVariable AccessDenied part.
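Putting the two pieces together, a complete sketch might look like this (C:\Some Folder is a placeholder path):
# Collect the errors while discarding the normal listing output.
Get-ChildItem -Path "C:\Some Folder" -Recurse -Force -ErrorAction SilentlyContinue -ErrorVariable AccessDenied | Out-Null
# Report only the items that failed because of the path length limit.
$AccessDenied |
    Where-Object { $_.Exception -match "must be less than 260 characters" } |
    ForEach-Object { $_.TargetObject }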
Answer 6:
I've made an alternative to the other good answers here that also uses PowerShell, but mine saves the list to a file as well. I'll share it in case anyone else wants something like that.
Warning: Code overwrites "longfilepath.txt" in the current working directory. I know it's unlikely you'd have one already, but just in case!
I purposely wanted it on a single line:
Out-File longfilepath.txt ; cmd /c "dir /b /s /a" | ForEach-Object { if ($_.length -gt 250) {$_ | Out-File -append longfilepath.txt}}
Detailed instructions:
- Run PowerShell
- Traverse to the directory you want to check for filepath lengths (C: works)
- Copy and paste the code [Right click to paste in PowerShell, or Alt + Space > E > P]
- Wait until it's done and then view the file:
cat longfilepath.txt | sort
Explanation:
Out-File longfilepath.txt ;
– Create (or overwrite) a blank file titled 'longfilepath.txt'. Semi-colon to separate commands.
cmd /c "dir /b /s /a" |
– Run the dir command via cmd from PowerShell; /a shows all files, including hidden ones. | pipes the output.
ForEach-Object { if ($_.length -gt 250) {$_ | Out-File -append longfilepath.txt}}
– For each line (denoted as $_), if the length is greater than 250, append that line to the file.
Answer 7:
TLPD ("too long path directory") is the program that saved me. Very easy to use:
https://sourceforge.net/projects/tlpd/
Answer 8:
For paths longer than 260 characters, you can use:
Get-ChildItem | Where-Object {$_.FullName.Length -gt 260}
Example with a 14-character threshold:
To view the path lengths:
Get-ChildItem | Select-Object -Property FullName, @{Name="FullNameLength";Expression={($_.FullName.Length)}}
To get paths longer than 14 characters:
Get-ChildItem | Where-Object {$_.FullName.Length -gt 14}
For filenames longer than 10 characters:
Get-ChildItem | Where-Object {$_.PSChildName.Length -gt 10}
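If you also want to check subdirectories (the commands above only look at the current directory, so the -Recurse addition is an assumption), this should work:
Get-ChildItem -Recurse | Where-Object {$_.FullName.Length -gt 260}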