GNU Wget - Bugs: bug #21714, File name too long

 
 

bug #21714: File name too long

Submitter:  John Doe <johndoe32102002>
Submitted:  Tue 04 Dec 2007 05:29:29 AM UTC
   
 
Category:  Program Logic
Severity:  3 - Normal
Priority:  7 - High
Status:  Fixed
Privacy:  Public
Assigned to:  gscrivano
Originator Name:  John
Open/Closed:  Closed
Release:  1.10.2
Operating System:  GNU/Linux
Reproducibility:  Every Time
Fixed Release:  None
Planned Release:  1.12.x
Regression:  Yes
Work Required:  0 - Hours
Patch Included:  No

Discussion locked!


Wed 05 Apr 2017 08:20:53 AM UTC, comment #56: 

Try Long Path Tool, as it may resolve any problems that you might have.

frank jackosn <frank1982>
Wed 24 Aug 2016 09:25:15 AM UTC, comment #55: 

The previous two comments seem to be spam, probably triggered by "File name too long" in the subject. Can someone delete them?

(When wget cannot download a file in the first place, a tool to rename files obviously won't help. Also, the bug has been fixed for some years now.)

Frank Heckenbach <frank>
Wed 24 Aug 2016 06:19:56 AM UTC, comment #54: 

I tried the Long Path Tool and it worked wonders for me; it's a great tool for copying and renaming files whose paths are too long.

Ronald Carl <ronald87>
Sat 27 Dec 2014 02:39:27 PM UTC, comment #53: 

Hello, did you use Long Path Tool when you faced this "File name too long" problem? I used Long Path Tool when I faced that type of problem. Hope it will help you.

julias <julias4>
Sat 29 Sep 2012 02:21:24 PM UTC, comment #52: 

Thanks. I've verified that the current git version fixes the problem.

Frank Heckenbach <frank>
Sat 29 Sep 2012 11:49:26 AM UTC, comment #51: 

Applied as commit: 67e6027ea130d06aeff365adfbc83f34d019b968

Giuseppe Scrivano <gscrivano>
Group administrator
Fri 28 Sep 2012 04:35:09 PM UTC, comment #50: 

I am going to take a look at the patch in the next few days.

Giuseppe Scrivano <gscrivano>
Group administrator
Fri 28 Sep 2012 03:05:32 AM UTC, comment #49: 

What exactly are we supposed to do to get this patch finally included?

We (i.e., I and other contributors) have repeatedly answered the maintainer's questions, fixed issues they pointed out, ported the fix to several new versions of wget, etc.

Now that everything's resolved, the issue seems to be ignored.

It's been going on for years now.

So what should we do to move on?

Frank Heckenbach <frank>
Mon 09 Jul 2012 06:43:07 PM UTC, comment #48: 

Hello,

Just a reminder, in case it simply got forgotten. Could the patch be added?

Thank you.

Noël Köthe <nok>
Wed 30 May 2012 12:03:22 AM UTC, comment #47: 

Looks good to me.

Frank Heckenbach <frank>
Tue 29 May 2012 11:23:49 AM UTC, comment #46: 

Sorry for the delay - I have been away from my PC.

ret is long again, and the spaces in "errno = 0;" are restored.

I changed the return type of get_max_length() to size_t, since the return value is directly compared to strlen().

Instead of returning INT_MAX - which was inconvenient anyway; it should have been LONG_MAX - it now returns 0 for 'no limit' / 'unknown limit'.
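
A minimal sketch of that convention, assuming a pathconf()-based implementation; the real get_max_length() is in the attached patch, so the details here are illustrative:

    #include <errno.h>
    #include <stddef.h>
    #include <unistd.h>

    /* Sketch only: query the filename-length limit for DIR.
       Returns 0 for "no limit" / "unknown limit", matching the
       convention described above; details may differ from the
       actual patch.  */
    static size_t
    get_max_length (const char *dir)
    {
      long ret;

      errno = 0;
      ret = pathconf (dir, _PC_NAME_MAX);
      if (ret < 0)
        return 0;   /* -1 with errno unset: unlimited; with errno
                       set: indeterminate -- 0 either way */
      return (size_t) ret;
    }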

Any complaints?

Tim Ruehsen <rockdaboot>
Group administrator
Fri 18 May 2012 07:43:17 PM UTC, comment #45: 

Two minor things about the new patch, both in get_max_length ():

- Why did you change the type of ret from long to int? pathconf () returns long, at least on GNU/Linux.

- Why did you remove the spaces in "errno = 0;"?

Frank Heckenbach <frank>
Fri 18 May 2012 03:11:50 PM UTC, comment #44: 

I just accidentally named the patch 1.14.
I meant ;-) 1.13.4, fixed up for current git.

Why hasn't this feature been applied yet?
It would also close Debian bug #672131.

Tim Ruehsen <rockdaboot>
Group administrator
Thu 29 Mar 2012 06:18:12 PM UTC, comment #43: 

Can this patch now finally be included, or is anything missing?

Frank Heckenbach <frank>
Sun 27 Nov 2011 11:29:55 AM UTC, comment #42: 

I wrote:

> So perhaps you should use the current directory (or the target directory, resp.) instead of "/". Of course then the function couldn't cache the result since it might be different for different files. However, the cost of a pathconf() call is probably negligible compared to downloading a file through the net.


I did that now. A special case to note is when the target directory doesn't exist yet (it may be created in the current retrieval). In this case, I strip directories and check the path again until I find an existing one. Since the newly created directories will be on the same file system as their parent directories, this should be safe.
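
A rough sketch of that walk, under the assumption of a pathconf()/stat()-based helper (the name name_max_for_target is hypothetical; the real code is in the attached patch):

    #include <string.h>
    #include <sys/stat.h>
    #include <unistd.h>

    /* Sketch of the approach described above: strip trailing path
       components until an existing directory remains, then ask
       that directory for its limit.  PATH is modified in place,
       so pass a scratch copy.  Hypothetical helper name.  */
    static long
    name_max_for_target (char *path)
    {
      struct stat st;

      while (stat (path, &st) != 0 || !S_ISDIR (st.st_mode))
        {
          char *slash = strrchr (path, '/');
          if (slash == NULL)
            return pathconf (".", _PC_NAME_MAX);  /* relative path, all stripped */
          if (slash == path)
            return pathconf ("/", _PC_NAME_MAX);  /* reached the root */
          *slash = '\0';  /* drop the last component and retry */
        }
      return pathconf (path, _PC_NAME_MAX);
    }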

I also adapted the patch to 1.13. I hope it's finally ready for inclusion now.

(wget_filename_length.v3-1.13.patch)

(file #24473)

Frank Heckenbach <frank>
Sat 12 Nov 2011 04:40:54 AM UTC, comment #41: 


> So perhaps you should use the current directory (or the target directory, resp.) instead of "/". Of course then the function couldn't cache the result since it might be different for different files. However, the cost of a pathconf() call is probably negligible compared to downloading a file through the net.


I believe Micah brought up the same point earlier (comment #29).  I know I'm still assigned to this bug, but I won't have time to change anything now.  It's been a couple of years (almost 3?!) since I submitted my patch, and that is the only work I've done on wget.  I suspect that if that safety improvement is really needed (checking the target directory's parameters versus the root directory's), I won't have the time to get it done. Maybe someone else will add that check?

Anyway, glad the patch is still more-or-less working for folks (I know it's still doing the job on our server).  Hopefully the fix makes it into the next release (even without the safer file system check).

Joseph Rios <alotau>
Fri 11 Nov 2011 11:50:39 PM UTC, comment #40: 


> Here it goes (sorry for italian message: "Nome del file troppo lungo" means "File name too long")


BTW, if you run it with "LC_ALL=C /opt/wget...", you get default (English) messages.

> >Are you sure you've applied the patch properly?
>
> I'm sorry, I messed up with patching!
>
> I do confirm it does work indeed on wget 1.12!
>
> Thank you for the good job!


Good to hear so.

Though when investigating the previous message, I saw a possible problem (but even then the patch is an improvement over the status quo):

get_name_max() does:

    if ((ret = pathconf ("/", _PC_NAME_MAX)) == -1)
    if ((ret = pathconf ("/", _PC_PATH_MAX)) == -1)

But the manpage says:

       _PC_NAME_MAX
              returns  the  maximum length of a filename in the directory path or fd that the process is allowed to create. 
       _PC_PATH_MAX
              returns  the  maximum length of a relative pathname when path or fd is the current working directory.

So imagine the situation that the root file system allows a larger limit than the current directory. It would probably still fail (I can't test it easily).

So perhaps you should use the current directory (or the target directory, resp.) instead of "/". Of course then the function couldn't cache the result since it might be different for different files. However, the cost of a pathconf() call is probably negligible compared to downloading a file through the net.
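
The concern, as a small runnable check (on a single-file-system setup the two numbers will usually agree, but nothing guarantees it):

    #include <stdio.h>
    #include <unistd.h>

    /* Print the {NAME_MAX} of the root file system and of the
       current directory; they can differ when the two live on
       different file systems.  */
    int
    main (void)
    {
      printf ("NAME_MAX for /: %ld\n", pathconf ("/", _PC_NAME_MAX));
      printf ("NAME_MAX for .: %ld\n", pathconf (".", _PC_NAME_MAX));
      return 0;
    }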

Frank Heckenbach <frank>
Fri 11 Nov 2011 06:42:26 PM UTC, comment #39: 


>Are you sure you've applied the patch properly?


I'm sorry, I messed up with patching!

I do confirm it does work indeed on wget 1.12!

Thank you for the good job!

Giovanni <timido>
Thu 10 Nov 2011 09:33:29 PM UTC, comment #38: 

Are you sure you've applied the patch properly?  There are some logprintf's that I had included that I don't see in your example output. 

From the patch:


+  if (strlen (temp_fnres.base) > get_name_max () - CHOMP_BUFFER)
     {
-      append_char (FN_QUERY_SEP, &fnres);
-      append_uri_pathel (u_query, u_query + strlen (u_query), true, &fnres);
+      logprintf (LOG_NOTQUIET, "The name is too long, %lu chars total.\n",
+          (unsigned long) strlen (temp_fnres.base));
+      logprintf (LOG_NOTQUIET, "Trying to shorten...\n");
+
+      /* Shorten the file name. */
+      temp_fnres.base[get_name_max () - CHOMP_BUFFER] = '\0';
+
+      logprintf (LOG_NOTQUIET, "New name is %s.\n", temp_fnres.base);
     }

Or am I missing something else?

Joseph Rios <alotau>
Thu 10 Nov 2011 05:24:29 PM UTC, comment #37: 


> What doesn't work, which version and system?


Linux lenny 2.6.26-2-686
ext3
wget 1.12 patched with yours

Here it goes (sorry for the Italian messages: "Nome del file troppo lungo" means "File name too long"):


$ /opt/wget-1.12-patch/bin/wget "google.de?q=`seq 200|tr -d 'n'`"
--2011-11-10 06:31:50--  http://google.de/?q=1%0A2%0A3%0A4%0A5%0A6%0A7%0A8%0A9%0A10%0A11%0A12%0A13%0A14%0A15%0A16%0A17%0A18%0A19%0A20%0A21%0A22%0A23%0A24%0A25%0A26%0A27%0A28%0A29%0A30%0A31%0A32%0A33%0A34%0A35%0A36%0A37%0A38%0A39%0A40%0A41%0A42%0A43%0A44%0A45%0A46%0A47%0A48%0A49%0A50%0A51%0A52%0A53%0A54%0A55%0A56%0A57%0A58%0A59%0A60%0A61%0A62%0A63%0A64%0A65%0A66%0A67%0A68%0A69%0A70%0A71%0A72%0A73%0A74%0A75%0A76%0A77%0A78%0A79%0A80%0A81%0A82%0A83%0A84%0A85%0A86%0A87%0A88%0A89%0A90%0A91%0A92%0A93%0A94%0A95%0A96%0A97%0A98%0A99%0A100%0A101%0A102%0A103%0A104%0A105%0A106%0A107%0A108%0A109%0A110%0A111%0A112%0A113%0A114%0A115%0A116%0A117%0A118%0A119%0A120%0A121%0A122%0A123%0A124%0A125%0A126%0A127%0A128%0A129%0A130%0A131%0A132%0A133%0A134%0A135%0A136%0A137%0A138%0A139%0A140%0A141%0A142%0A143%0A144%0A145%0A146%0A147%0A148%0A149%0A150%0A151%0A152%0A153%0A154%0A155%0A156%0A157%0A158%0A159%0A160%0A161%0A162%0A163%0A164%0A165%0A166%0A167%0A168%0A169%0A170%0A171%0A172%0A173%0A174%0A175%0A176%0A177%0A178%0A179%0A180%0A181%0A182%0A183%0A184%0A185%0A186%0A187%0A188%0A189%0A190%0A191%0A192%0A193%0A194%0A195%0A196%0A197%0A198%0A199%0A200
Risoluzione di google.de (google.de)... 74.125.39.103, 74.125.39.104, 74.125.39.105, ...
Connessione a google.de (google.de)|74.125.39.103|:80... connesso.
HTTP richiesta inviata, in attesa di risposta... 301 Moved Permanently
Posizione: http://www.google.de/?q=1%0A2%0A3%0A4%0A5%0A6%0A7%0A8%0A9%0A10%0A11%0A12%0A13%0A14%0A15%0A16%0A17%0A18%0A19%0A20%0A21%0A22%0A23%0A24%0A25%0A26%0A27%0A28%0A29%0A30%0A31%0A32%0A33%0A34%0A35%0A36%0A37%0A38%0A39%0A40%0A41%0A42%0A43%0A44%0A45%0A46%0A47%0A48%0A49%0A50%0A51%0A52%0A53%0A54%0A55%0A56%0A57%0A58%0A59%0A60%0A61%0A62%0A63%0A64%0A65%0A66%0A67%0A68%0A69%0A70%0A71%0A72%0A73%0A74%0A75%0A76%0A77%0A78%0A79%0A80%0A81%0A82%0A83%0A84%0A85%0A86%0A87%0A88%0A89%0A90%0A91%0A92%0A93%0A94%0A95%0A96%0A97%0A98%0A99%0A100%0A101%0A102%0A103%0A104%0A105%0A106%0A107%0A108%0A109%0A110%0A111%0A112%0A113%0A114%0A115%0A116%0A117%0A118%0A119%0A120%0A121%0A122%0A123%0A124%0A125%0A126%0A127%0A128%0A129%0A130%0A131%0A132%0A133%0A134%0A135%0A136%0A137%0A138%0A139%0A140%0A141%0A142%0A143%0A144%0A145%0A146%0A147%0A148%0A149%0A150%0A151%0A152%0A153%0A154%0A155%0A156%0A157%0A158%0A159%0A160%0A161%0A162%0A163%0A164%0A165%0A166%0A167%0A168%0A169%0A170%0A171%0A172%0A173%0A174%0A175%0A176%0A177%0A178%0A179%0A180%0A181%0A182%0A183%0A184%0A185%0A186%0A187%0A188%0A189%0A190%0A191%0A192%0A193%0A194%0A195%0A196%0A197%0A198%0A199%0A200 [segue]
--2011-11-10 06:31:51--  http://www.google.de/?q=1%0A2%0A3%0A4%0A5%0A6%0A7%0A8%0A9%0A10%0A11%0A12%0A13%0A14%0A15%0A16%0A17%0A18%0A19%0A20%0A21%0A22%0A23%0A24%0A25%0A26%0A27%0A28%0A29%0A30%0A31%0A32%0A33%0A34%0A35%0A36%0A37%0A38%0A39%0A40%0A41%0A42%0A43%0A44%0A45%0A46%0A47%0A48%0A49%0A50%0A51%0A52%0A53%0A54%0A55%0A56%0A57%0A58%0A59%0A60%0A61%0A62%0A63%0A64%0A65%0A66%0A67%0A68%0A69%0A70%0A71%0A72%0A73%0A74%0A75%0A76%0A77%0A78%0A79%0A80%0A81%0A82%0A83%0A84%0A85%0A86%0A87%0A88%0A89%0A90%0A91%0A92%0A93%0A94%0A95%0A96%0A97%0A98%0A99%0A100%0A101%0A102%0A103%0A104%0A105%0A106%0A107%0A108%0A109%0A110%0A111%0A112%0A113%0A114%0A115%0A116%0A117%0A118%0A119%0A120%0A121%0A122%0A123%0A124%0A125%0A126%0A127%0A128%0A129%0A130%0A131%0A132%0A133%0A134%0A135%0A136%0A137%0A138%0A139%0A140%0A141%0A142%0A143%0A144%0A145%0A146%0A147%0A148%0A149%0A150%0A151%0A152%0A153%0A154%0A155%0A156%0A157%0A158%0A159%0A160%0A161%0A162%0A163%0A164%0A165%0A166%0A167%0A168%0A169%0A170%0A171%0A172%0A173%0A174%0A175%0A176%0A177%0A178%0A179%0A180%0A181%0A182%0A183%0A184%0A185%0A186%0A187%0A188%0A189%0A190%0A191%0A192%0A193%0A194%0A195%0A196%0A197%0A198%0A199%0A200
Risoluzione di www.google.de (www.google.de)... 74.125.39.103, 74.125.39.104, 74.125.39.105, ...
Riutilizzo della connessione esistente a google.de:80.
HTTP richiesta inviata, in attesa di risposta... 200 OK
Lunghezza: non specificato [text/html]
index.html?q=1%0A2%0A3%0A4%0A5%0A6%0A7%0A8%0A9%0A10%0A11%0A12%0A13%0A14%0A15%0A16%0A17%0A18%0A19%0A20%0A21%0A22%0A23%0A24%0A25%0A26%0A27%0A28%0A29%0A30%0A31%0A32%0A33%0A34%0A35%0A36%0A37%0A38%0A39%0A40%0A41%0A42%0A43%0A44%0A45%0A46%0A47%0A48%0A49%0A50%0A51%0A52%0A53%0A54%0A55%0A56%0A57%0A58%0A59%0A60%0A61%0A62%0A63%0A64%0A65%0A66%0A67%0A68%0A69%0A70%0A71%0A72%0A73%0A74%0A75%0A76%0A77%0A78%0A79%0A80%0A81%0A82%0A83%0A84%0A85%0A86%0A87%0A88%0A89%0A90%0A91%0A92%0A93%0A94%0A95%0A96%0A97%0A98%0A99%0A100%0A101%0A102%0A103%0A104%0A105%0A106%0A107%0A108%0A109%0A110%0A111%0A112%0A113%0A114%0A115%0A116%0A117%0A118%0A119%0A120%0A121%0A122%0A123%0A124%0A125%0A126%0A127%0A128%0A129%0A130%0A131%0A132%0A133%0A134%0A135%0A136%0A137%0A138%0A139%0A140%0A141%0A142%0A143%0A144%0A145%0A146%0A147%0A148%0A149%0A150%0A151%0A152%0A153%0A154%0A155%0A156%0A157%0A158%0A159%0A160%0A161%0A162%0A163%0A164%0A165%0A166%0A167%0A168%0A169%0A170%0A171%0A172%0A173%0A174%0A175%0A176%0A177%0A178%0A179%0A180%0A181%0A182%0A183%0A184%0A185%0A186%0A187%0A188%0A189%0A190%0A191%0A192%0A193%0A194%0A195%0A196%0A197%0A198%0A199%0A200: Nome del file troppo lungo

Impossibile scrivere in "index.html?q=1%0A2%0A3%0A4%0A5%0A6%0A7%0A8%0A9%0A10%0A11%0A12%0A13%0A14%0A15%0A16%0A17%0A18%0A19%0A20%0A21%0A22%0A23%0A24%0A25%0A26%0A27%0A28%0A29%0A30%0A31%0A32%0A33%0A34%0A35%0A36%0A37%0A38%0A39%0A40%0A41%0A42%0A43%0A44%0A45%0A46%0A47%0A48%0A49%0A50%0A51%0A52%0A53%0A54%0A55%0A56%0A57%0A58%0A59%0A60%0A61%0A62%0A63%0A64%0A65%0A66%0A67%0A68%0A69%0A70%0A71%0A72%0A73%0A74%0A75%0A76%0A77%0A78%0A79%0A80%0A81%0A82%0A83%0A84%0A85%0A86%0A87%0A88%0A89%0A90%0A91%0A92%0A93%0A94%0A95%0A96%0A97%0A98%0A99%0A100%0A101%0A102%0A103%0A104%0A105%0A106%0A107%0A108%0A109%0A110%0A111%0A112%0A113%0A114%0A115%0A116%0A117%0A118%0A119%0A120%0A121%0A122%0A123%0A124%0A125%0A126%0A127%0A128%0A129%0A130%0A131%0A132%0A133%0A134%0A135%0A136%0A137%0A138%0A139%0A140%0A141%0A142%0A143%0A144%0A145%0A146%0A147%0A148%0A149%0A150%0A151%0A152%0A153%0A154%0A155%0A156%0A157%0A158%0A159%0A160%0A161%0A162%0A163%0A164%0A165%0A166%0A167%0A168%0A169%0A170%0A171%0A172%0A173%0A174%0A175%0A176%0A177%0A178%0A179%0A180%0A181%0A182%0A183%0A184%0A185%0A186%0A187%0A188%0A189%0A190%0A191%0A192%0A193%0A194%0A195%0A196%0A197%0A198%0A199%0A200" (Nome del file troppo lungo).

Giovanni <timido>
Thu 10 Nov 2011 01:12:15 AM UTC, comment #36: 

Giovanni <timido> wrote:

> Doesn't seem to work even with the patch applied!


What doesn't work, which version and system?

The example I gave still works for me (Linux 2.6.34, ext[2-4], wget 1.12):

wget "google.de?q=`seq 200|tr -d '\n'`"

> Are you sure Linux VFS allows wget to save a file whose name is longer than 255 bytes?


No, and that's just the point. The patch makes wget shorten the
output file name.

> The issue is when wget creates temporary files from following HTTP redirections... which indeed you cannot save with the -O option


Or with recursive retrieval if one of the linked files has a longer name.

Frank Heckenbach <frank>
Wed 09 Nov 2011 06:39:24 PM UTC, comment #35: 

Doesn't seem to work even with the patch applied!

Are you sure Linux VFS allows wget to save a file whose name is longer than 255 bytes?

This problem is still present in wget version 1.13.4.

The issue is when wget creates temporary files from following HTTP redirections... which indeed you cannot save with the -O option

Giovanni <timido>
Mon 13 Jun 2011 04:27:50 AM UTC, comment #34: 

Rob Mangiafico wrote:

> Is there an updated patch for version 1.12? I tried massaging the 1.11 patch, but now short filenames have random invalid characters appended to them.


I did the same and found the same problem. It was because temp_fnres wasn't 0-terminated. The bug is not actually related to the version; it just happens to manifest now (and perhaps did before, unnoticed).
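
For readers, the bug class in a nutshell; this is a generic illustration, not the wget code:

    #include <stdio.h>
    #include <string.h>

    int
    main (void)
    {
      char buf[32];
      const char *name = "a-rather-long-file-name";

      memcpy (buf, name, 10);   /* truncate to 10 chars... */
      /* Without this terminator, printf would read whatever
         happens to follow in buf -- the "random invalid
         characters" symptom described above.  */
      buf[10] = '\0';
      printf ("%s\n", buf);
      return 0;
    }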

I also fixed the problems Micah pointed out, except "Why is temp_fnres a growable?" (I'm not familiar with those types, and at least it seems to work as is, if maybe a little less efficiently).

I'll upload my patch here. I hope it's ready for inclusion now. Should I do anything else to make that happen?

To test it:

Before:

% wget "google.de?q=`seq 200|tr -d '\n'`"
[...]
Cannot write to `index.html?q=[...]' (File name too long).
% echo $?
3

With the patch:

% wget "google.de?q=`seq 200|tr -d '\n'`"
The name is too long, 505 chars total.
Trying to shorten...
New name is index.html?q=[...].
[...]
% echo $?
0


(file #23520)

Frank Heckenbach <frank>
Fri 01 Apr 2011 02:15:26 AM UTC, comment #33: 

Is there an updated patch for version 1.12? I tried massaging the 1.11 patch, but now short filenames have random invalid characters appended to them.

Rob Mangiafico <robman>
Mon 02 Aug 2010 06:19:21 PM UTC, comment #32: 

Hello,
I have this same problem; it's in wget version 1.12 as well.

How do I apply this patch?
Is it the final version?

Thank you for support.

Giovanni <timido>
Thu 10 Sep 2009 06:19:32 AM UTC, comment #31: 


> Why is temp_fnres a growable?


I wanted to use the append_uri_pathel() function, which seemed to be doing smarter things than the buffer/string tricks I'd play on my own.

Joseph Rios <alotau>
Fri 04 Sep 2009 03:17:03 AM UTC, comment #30: 

Some additional comments on the latest patch.

It's not quite "GNU style"; the indentation is good, but there are a number of places where the spacing around parentheses is off: "if( foo )" instead of "if (foo)".

Why is temp_fnres a growable? ... you never grow it, and I don't see any usage that wouldn't have been just as good with an ordinary buffer. Also, I don't understand why you're restoring the string length just before freeing it altogether. free() doesn't care where the NUL is; it doesn't even know it's a string.

printf("%ld", strlen(...)) is dangerous. Make sure the return size of the variable matches the specifier: size_t's size isn't guaranteed to match a long. And strlen will never return a signed value, so your best bet is something like printf("%lu", (unsigned long) strlen(...)).

BTW, unified diffs are easier for me to deal with than contextual diffs; easier for me to double-check that they were applied correctly.

Micah Cowan <micahcowan>
Tue 01 Sep 2009 05:45:58 AM UTC, comment #29: 

You're right that the fallback behavior may not be necessary; but I'm still not crazy about running the risk of prematurely truncating the filename before we're certain it's too big (especially since get_name_max isn't necessarily checking the same file system). The chances of checking the right file system could be improved if we invoke pathconf on the directory argument to -P (or the current working directory, if -P wasn't used), though it still wouldn't be fool-proof (the fail cases, however, are quite tolerably rare).

What about a boolean option, instead of a max-length parameter? When we get ENAMETOOLONG we could just toggle the flag, and call url_file_name just one more time. Basically, I'd just like to eliminate any potential for already-working behavior to change.

Micah Cowan <micahcowan>
Tue 01 Sep 2009 04:50:49 AM UTC, comment #28: 

Your suggestions make sense, but I wonder at the necessity.  I think the patch fixes the bug as described by all who reported it.  If NAME_MAX has issues on someone's system, then a working wget for long file names may work around a problem that should be addressed elsewhere.  I guess I'm saying that adding the additional fail-safe you describe (for the reasons you describe) seems like we're fixing a problem that isn't wget's to worry about.  I understand, though, that pro-action is good too.

I'm not trying to be lazy and am open to working on it some more (though I'm not sure when)... I'm just opening this for further discussion before I do so.

Joseph Rios <alotau>
Sun 30 Aug 2009 04:11:36 AM UTC, comment #27: 

So, what I liked about the original patch was that it waited until it was sure that the name was going to be too long before it adjusted anything; what I like about the new patches is that it uses {NAME_MAX} intelligently. But it would also have been nice for it to fall back on the first patch's behavior, in the event that {NAME_MAX} is wrong or something.

What about something that does the following?

- Modify url_file_name to take a maximum-length parameter, with a potential value representing no limit.
- The first time url_file_name gets called, give it the no-limit parameter.
- If http.c gets an ENAMETOOLONG error, then use get_name_max for that parameter, and call url_file_name again.
- If we still get ENAMETOOLONG, use the first patch's behavior and keep chomping off the filename (see the sketch below).
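
A compact sketch of that ladder (hypothetical names: url_file_name_max() stands in for the proposed max-length-taking variant of url_file_name(), get_name_max() for the pathconf-based helper, and 0 plays the "no limit" role):

    #include <errno.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical stand-ins for the proposal above.  */
    extern char *url_file_name_max (const char *url, size_t max_len);
    extern size_t get_name_max (void);

    /* Sketch of the retry ladder: no limit first, then the
       pathconf-derived limit, then keep chomping by thirds.  */
    static FILE *
    open_output (const char *url, char **name_out)
    {
      size_t limit = 0;                  /* 0 = no limit */
      char *name = url_file_name_max (url, limit);
      FILE *fp = fopen (name, "wb");

      if (fp == NULL && errno == ENAMETOOLONG)
        {
          free (name);
          limit = get_name_max ();
          name = url_file_name_max (url, limit);
          fp = fopen (name, "wb");
        }
      while (fp == NULL && errno == ENAMETOOLONG && limit > 1)
        {
          limit -= limit / 3 + 1;        /* chomp a third, at least 1 */
          free (name);
          name = url_file_name_max (url, limit);
          fp = fopen (name, "wb");
        }
      *name_out = name;
      return fp;
    }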

Micah Cowan <micahcowan>
Sun 05 Jul 2009 09:50:35 PM UTC, comment #26: 

Pursuing copyright assignment.

Micah Cowan <micahcowan>
Fri 06 Feb 2009 07:20:18 PM UTC, comment #25: 

I've cleaned up the code to make it "GNU style".  I'll consider this the final patch, and the bug fixed, unless I hear otherwise.  I have placed TODO tags in the code where there should be a check for path-length violations.  I didn't do that check because I have no idea what to do if the path is too long; I don't see an obvious solution.  In addition, that hasn't been the complaint I originally had, or that any users have posted.

Below is the result of the only test I ran on the code.  On the first try with standard wget, the 'File Name Too Long' error occurs.  I then use the patched version and the page is saved.  I then run it again and another copy is saved, demonstrating that the *.1 suffix didn't break anything on the second download.

This is my first GNU bug fix, so if there is something else I am supposed to do, please let me know.

Thanks.


[joseph@torpedo 10:57:29] wget "http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr"
--10:57:35--  http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
           => `ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr'
Resolving personal.optus.com.au... 203.13.127.15
Connecting to personal.optus.com.au|203.13.127.15|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr: File name too long

Cannot write to `ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr' (File name too long).
/Users/joseph/projects/wget_dev
[joseph@torpedo 10:57:35] src/wget "http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr"
The name is too long, 481 chars total.
Trying to shorten...
New name is ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh.
--2009-02-06 10:57:45--  http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
Resolving personal.optus.com.au... 203.13.127.15
Connecting to personal.optus.com.au|203.13.127.15|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh'

    [    <=>                                                                                                ] 44,173      59.2K/s   in 0.7s

2009-02-06 10:57:46 (59.2 KB/s) - `ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh' saved [44173]

/Users/joseph/projects/wget_dev
[joseph@torpedo 10:57:46] src/wget "http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr"
The name is too long, 481 chars total.
Trying to shorten...
New name is ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh.
--2009-02-06 10:57:52--  http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh=2&foo=barrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
Resolving personal.optus.com.au... 203.13.127.15
Connecting to personal.optus.com.au|203.13.127.15|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh.1'

    [    <=>                                                                                                ] 44,173      59.4K/s   in 0.7s

2009-02-06 10:57:53 (59.4 KB/s) - `ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhh.1' saved [44173]



(file #17414)

Joseph Rios <alotau>
Fri 06 Feb 2009 04:37:55 AM UTC, comment #24: 

I'm uploading a new patch.  The major changes are in url_file_name() (in url.c), as discussed below.  I also added a couple of functions to util.c: get_name_max() and get_path_max().  I think I wrote them in a way such that repeated calls won't generate additional system calls.
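
The caching idea in miniature (illustrative sketch, not the patch itself):

    #include <unistd.h>

    /* Query pathconf() only on the first call and remember the
       answer, so repeated calls cost no extra system calls.
       (Sketch of the idea described above.)  */
    static long
    get_name_max (void)
    {
      static long cached = -2;      /* -2 means "not queried yet" */

      if (cached == -2)
        cached = pathconf ("/", _PC_NAME_MAX);
      return cached;
    }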

Instead of chomping thirds off the file name length, I just trim it to a preferred length.  I calculate the preferred length from the value of get_name_max() and a rough guess of how much junk might be appended to the file name later in the code (specifically in get_http(), I think).  Things like ".html" and ".orig" and various numerals may get appended after url_file_name().

I don't think the code is in the preferred style, but I can fix that if it seems that the patch does the right thing.

Thanks for the quick response earlier, David.  I didn't think about sending junk params to increase the URL length.

If anyone can provide feedback, the sooner you can get it to me the better.  I can't say how much more time I'll have to work on this after tomorrow.

Thanks.

(file #17409)

Joseph Rios <alotau>
Fri 06 Feb 2009 12:45:27 AM UTC, comment #23: 

Hi Joseph,

Here's one of the longest URL that wget was able to save for me:

http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal

Maybe if you add some dummy parameters at the end, it would reach wget's limit.

Here's an example of what I get if I put dummy parameters at the end of the URL.


-----
C:\Site Dumps\ver 20081012\personal.optus.com.au\web\test>wget "http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh"
--11:31:57--  http://personal.optus.com.au/web/ocaportal.portal?_nfpb=true&_pageLabel=shoppingcart&FP=/personal/shoppingcart&LHP=/personal/customerhelp/producthelp/mobilehelp/howtoguidesmobile/usingsimbackup/nokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh
           => `ocaportal.portal@_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh'
Resolving personal.optus.com.au... 10.120.192.22
Connecting to personal.optus.com.au|10.120.192.22|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
ocaportal.portal@_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh: No such file or directory

Cannot write to `ocaportal.portal@_nfpb=true&_pageLabel=shoppingcart&FP=%2Fpersonal%2Fshoppingcart&LHP=%2Fpersonal%2Fcustomerhelp%2Fproducthelp%2Fmobilehelp%2Fhowtoguidesmobile%2Fusingsimbackup%2Fnokia6280&site=personal&blahhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh' (No such file or directory).
-----


Note that the problem might be amplified by Windows' maximum pathname limitation.

David Han <kayhadrin>
Thu 05 Feb 2009 07:45:06 PM UTC, comment #22: 

Hi Micah,

Can you (or anyone, for that matter) provide a URL upon which I can test my code?  The site I used for testing previously doesn't seem to be offering long URLs anymore, and I don't have any locally.  Or, better yet, can you provide a wget command (or commands), complete with URLs, that should DTRT when this bug is fixed?

I've been sitting on this for a while and have a little time to poke at it now.

Joseph Rios <alotau>
Wed 12 Nov 2008 11:15:28 PM UTC, comment #21: 

Note that the current version of the patch doesn't DTRT when -E is given (that is, the final filename won't end in .html).

(Be that as it may, it still allowed me to download Savannah's 1.12 buglist, which the current Wget won't.)

Micah Cowan <micahcowan>
Wed 29 Oct 2008 08:00:53 PM UTC, comment #20: 

Yes, url_file_name is the perfect place for this.

We should probably determine the values for PATH_MAX and NAME_MAX just once, and then save the values away from that point on. The case where they will differ at different points in the hierarchy strikes me as so rare as not to be worth considering.

Then we could place fallback code around the actual point of opening the file. Although really, for a first go, PATH_MAX and NAME_MAX should be plenty for our needs.

Currently Wget finds a unique filename and then opens it a little later, which is a race condition (some other file can "pop into existence"). At some point we'll want to have the unique-name-finder actually attempt to open the file, and create it if it doesn't exist, so that we already have an open filehandle and know for certain that it's now ours. That would be an excellent point at which to introduce the fallback code, if you want to wait until then.
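
For illustration, the race-free pattern in miniature: O_CREAT|O_EXCL makes "pick a unique name" and "create the file" one atomic step:

    #include <errno.h>
    #include <fcntl.h>
    #include <stdio.h>

    /* Try NAME, then NAME.1, NAME.2, ... until a file is created
       that did not exist before; returns an open fd or -1.  No
       window remains for another file to "pop into existence"
       between the uniqueness check and the open.  */
    static int
    create_unique (const char *name)
    {
      char buf[4096];
      int i, fd;

      for (i = 0; i < 1000; i++)
        {
          if (i == 0)
            snprintf (buf, sizeof buf, "%s", name);
          else
            snprintf (buf, sizeof buf, "%s.%d", name, i);
          fd = open (buf, O_WRONLY | O_CREAT | O_EXCL, 0666);
          if (fd >= 0 || errno != EEXIST)
            return fd;   /* success, or an error other than EEXIST */
        }
      return -1;
    }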

Micah Cowan <micahcowan>
Wed 29 Oct 2008 07:17:40 PM UTC, comment #19: 

It makes sense to do it as you describe.  I was catching the error at the point the file failed to open, which is long after the path and filename had been determined.  So any chomping at that point could dig into the path.

Figuring out the best place to perform the check for filename and path length will take some time, since I'm not that familiar with the code yet.  I started looking a bit more, though, so I can ask: does it make sense to perform the check in the url_file_name() function?  It looks like that is where the ultimate filename is created.  Or maybe in unique_name(), since that may be called after url_file_name() and could actually add to the ultimate length?  Or do you have a sense for a better place?

If I don't hear from you, I'll use my best judgement.  I probably won't get to this for a few days.

Joseph Rios <alotau>
Wed 29 Oct 2008 05:41:09 AM UTC, comment #18: 

Ah. Well I never actually meant to chomp the path: I meant a third of the filename.

I'd strongly prefer the use of NAME_MAX (and possibly PATH_MAX) (obtained from pathconf, not macros), as that's supposed to be able to tell us exactly how long to make it: no guessing. The other's just for fallback.

I recommended chopping by thirds mainly because for long paths it's a lot quicker than shaving a few characters at a time (O(log N) versus O(N)). If the max size is 1000 chars and you've got 9000, you'll get there in six steps, rather than 2,666 (with a step-down of 3).
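
A toy check of those step counts:

    #include <stdio.h>

    int
    main (void)
    {
      /* Shrink 9000 toward a 1000-char limit, a third at a time,
         and count the attempts -- six, as claimed above.  */
      unsigned len = 9000, limit = 1000;
      int steps = 0;

      while (len > limit)
        {
          len -= len / 3;
          steps++;
        }
      printf ("fits after %d steps (len = %u)\n", steps, len);
      return 0;
    }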

Micah Cowan <micahcowan>
Wed 29 Oct 2008 02:07:13 AM UTC, comment #17: 

Hi Micah,

I think I have a fix that doesn't require explicit knowledge of the {NAME_MAX} and {PATH_MAX} macros, though I do query them diagnostically.  Now, instead of chomping off a set amount of characters and hoping for the best, I'm chomping off a few characters and trying again.  I do that until either the filename is legal or fopen() fails for some other reason.  I don't know if this is better or worse than cutting it by a third each time, but I think the spirit is the same.  Chomping off a third makes me nervous because if the path is long but the file name is actually short, we could be chomping into the path, which would cause other problems.  Right now NAME_CHOMP_AMT is set to 3, but whatever seems reasonable to you is fine with me.

Thanks for the info on the C/POSIX stuff.  I've never had to deal with that directly, so it was helpful looking up a little bit of it.

Let me know if you think the attached patch has any further issues to address in regards to this bug.  Also let me know if you think it actually fixes the problem sufficiently.

Thanks for maintaining wget.

(file #16746)

Joseph Rios <alotau>
Tue 28 Oct 2008 06:04:30 PM UTC, comment #16: 

Hi Joe, and thanks for the patches.

Some comments: actually, POSIX does have FILENAME_MAX, just like the C standard does. And {NAME_MAX} isn't necessarily the name of a macro: the most reliable way to get {NAME_MAX} is to use fpathconf or pathconf with the _PC_NAME_MAX macro, which will tell you what the limit is for the particular filesystem that happens to be in use at that particular location.

Don't forget, {NAME_MAX} represents the longest allowable path component, whereas {PATH_MAX} represents the longest allowable path. Both may be needed to choose an appropriate filename size.
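
One way to combine the two, sketched with a hypothetical helper: the room for a new name is the smaller of {NAME_MAX} and whatever {PATH_MAX} leaves after the directory prefix:

    #include <string.h>
    #include <unistd.h>

    /* Sketch: how many characters are available for a new file
       name directly inside DIR, honoring both limits.  Returns a
       negative value if neither limit is determinate.  */
    static long
    room_for_name (const char *dir)
    {
      long name_max = pathconf (dir, _PC_NAME_MAX);
      long path_max = pathconf (dir, _PC_PATH_MAX);
      long room = name_max;

      if (path_max > 0)
        {
          /* subtract "dir", the '/' separator and the NUL */
          long left = path_max - (long) strlen (dir) - 2;
          if (room < 0 || left < room)
            room = left;
        }
      return room;
    }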

It might not be a terrible idea to have some sort of fallback mechanism, in case judicious use of {NAME_MAX} and {PATH_MAX} still results in ENAMETOOLONG; maybe a keep-cutting-the-pathname-by-a-third-and-see-if-it-works sort of thing.

Micah Cowan <micahcowan>
Tue 28 Oct 2008 04:51:58 PM UTC, comment #15: 

I'm trying to attach the patches that worked for me (I've never uploaded anything here, so I'm hoping it works).  The only changes were to http.c, adding a new branch to an 'if' statement that checks whether creating a new file fails and why.  One line was added to http.h as well.  Someone emailed me for this patch, so I thought it best to post it here.  Hopefully, if anyone tries it out, he/she can post a message here letting us know if it works.  And if Micah takes a look at it, he can let us know if it seems reasonable and doesn't break anything else.

(file #16744, file #16745)

Joseph Rios <alotau>
Thu 09 Oct 2008 03:16:57 AM UTC, comment #14: 

Micah,

I've made some changes to http.c and util.c which serve my purposes and may serve others as well.

Essentially, I recognize when a file name is too long, chomp off some number of characters, and then pass that string to unique_name() for a new file name.  This pretty much follows your suggestions, I think.

I haven't extensively tested it, nor made the code "pretty" yet, but I am certain it works for my application of wget.  Also, it shouldn't break anything else, since it only triggers when a long-file-name error is returned from the fopen() call.  If it doesn't work for all cases, I think it is still better than what exists now (i.e., you never get the file at all).

Let me know if you'd like me to clean up the code and submit a patch or provide more details on my fix.

Joseph Rios <alotau>
Thu 09 Oct 2008 01:51:55 AM UTC, comment #13: 

My apologies for raising that subject in the wrong place.
I reckon Micah's alternative is better, though.

I'm a newbie here, so I'll have to find said mailing list ;-)

David Han <kayhadrin>
Wed 08 Oct 2008 07:05:08 PM UTC, comment #12: 

Hi guys, an unrelated bug isn't really the place to discuss new feature ideas; that's really what the mailing list is for.

I would tend to prefer a more generalized solution: send the output to a user-specifiable command, which could then save it in the database or whatever it desires to do.

Micah Cowan <micahcowan>
Wed 08 Oct 2008 03:34:18 PM UTC, comment #11: 

Maybe a flag to choose database saving isn't a bad idea.  Linux and Mac usually come with SQLite installed (right?).  It's server-less, has a C/command-line API, and saves the DB as a single file wherever you like, so folks wouldn't necessarily have to have a MySQL server running.  I'm sure this will take quite a bit more thought, but a workaround like this would definitely solve the problem I've been having with long file names.

Joseph Rios <alotau>
Wed 08 Oct 2008 03:44:16 AM UTC, comment #10: 

Does anyone know if there's a tool to save wget's output to a database instead?

That may be a good workaround to be able to save any length of page URL and later for backup purpose.

David Han <kayhadrin>
  Spam posted by anonymous
Wed 20 Aug 2008 08:28:04 PM UTC, comment #8: 

Using URLs as the basis of filenames still makes tons of sense when they're not too long, though. Throwing out the baby with the bathwater isn't what is needed. And while long URLs are not an extreme rarity, calling them common is a bit of a stretch (I personally have yet to run across one in my own use; that might change if I were downloading bug reports from Savannah, though: it tends to have really huge URLs).

As to whether it's still useful to base alternatives to too-long filenames on a truncated form of the URL, I would find it so. And why not? It's not hard to do, and it keeps as much of the original location information intact as possible. So what advantage is there in eschewing the original URL entirely and making up some arbitrary name?

Micah Cowan <micahcowan>
Wed 20 Aug 2008 04:21:38 PM UTC, comment #7: 

If the URLs are too long, relative links will still be broken when viewing the downloaded files in a browser, because the files simply won't be there.  The attempt to save them using a too-long filename will fail.

I ran into this problem myself.  It's the original reason why I posted a comment here.

John Nolan <fossjn>
Mon 21 Jul 2008 06:56:34 PM UTC, comment #6: 

That's all well and good; but using URLs as the basis for filenames has the advantage that relative URLs are not broken when you then view the downloaded files in a browser.

That's less of a problem if you use the -k option.

Micah Cowan <micahcowan>
Mon 21 Jul 2008 06:15:42 PM UTC, comment #5: 

According to the RFCs, there is no theoretical limit to the size of a URL.  In theory, a URL can be arbitrarily long.

Most browsers and servers have some kind of implementation limit on URL length, but often the implementation limits are generous.  Even older mobile phone browsers, which may have severely limited device memory to work with, often allow 1024 bytes or more for a URL.  I believe Firefox allows very long URLs.

http://www.boutell.com/newfaq/misc/urllength.html

This contrasts with filesystems, which typically have a much smaller length limit on filenames.  Ext2 apparently allows only up to 255 characters for a filename.

In the general case, it really does not make sense to blindly attempt to use a URL as a filename.  Long URLs are not even an edge case; they are common.

John Nolan <fossjn>
Mon 21 Apr 2008 06:56:47 PM UTC, comment #4: 

I'm unclear how truncating the URL does not also result in an artificial file name. Obviously, we'd "unique"-ize it, same as we do for other filenames.

Any rename whatsoever is likely to be useless, unless -k (and probably -E) has been enabled, to retain interlinking between the renamed page and other downloaded pages. Still, for some cases it may at least help to narrow down what the original page was.

The real issue that's likely to result is that, on a follow-up download with -c or -N, Wget won't know where to look for the local file, so it will always redownload (to a new, unique filename - the proposed Session Info Database feature would solve this problem).

Micah Cowan <micahcowan>
Mon 21 Apr 2008 05:39:18 PM UTC, comment #3: 

I'd just like to add a suggestion based on my experience with this bug/issue.  It seems that wget is actually able to download the "long file name" files with no problem, so the "truncate URL and try again" approach may be too harsh.  Perhaps an option to "create an artificial local file name and try again" would be more appropriate; wget could then download to that artificial file name.  That would solve the issue with my wget script.  From my experience, the long file name is usually due to a long list of CGI script inputs in the URL, so truncation would likely be useless.

Anyway, thanks for addressing this issue, it's much appreciated.

Joseph Rios <alotau>
Sun 13 Jan 2008 09:41:19 PM UTC, comment #2: 

This has been a problem for several users. It kind of sucks that there's no workaround.

Perhaps, if it gets a "filename too long" error, it could automatically truncate the path and try again? Or possibly add a 'cut out the query string from paths'-type deal. Both would require -k to be useful, though, it seems to me.

Micah Cowan <micahcowan>
Tue 04 Dec 2007 05:42:57 AM UTC, comment #1: 

I'm not sure about the "should not exist logically in wget" bit; systems tend to have maximum filename lengths, and the file here exceeded it. Still, it'd be nice to have Wget handle it better, say by picking a shorter name.

See also bug 21042, which has been made a duplicate of this one (but which also refers to an erroneous status message).

Micah Cowan <micahcowan>
Tue 04 Dec 2007 05:29:29 AM UTC, original submission:  

I am unable to save Java to my computer with wget because of this error:
(File name too long).

It seems there is no way to bypass it, and logically it should not exist in wget.

John Doe <johndoe32102002>

 


Attached Files
file #25881:  0001-filename-length-patch-v4-1.14.patch added by rockdaboot (7KiB - text/x-diff - fixed up for current 1.13.4)
file #24473:  wget_filename_length.v3-1.13.patch added by frank (6KiB - text/x-diff - patch updated for 1.13, check actual target directory)
file #23520:  wget_filename_length.v2-1.12.patch added by frank (7KiB - text/x-diff - alotau's latest patch updated for 1.12, bug fixed)
file #17414:  wget_filename_length.v2.patch added by alotau (9KiB - application/octet-stream - Final fix?)
file #17409:  wget_file_length.patch added by alotau (8KiB - application/octet-stream)
file #16746:  wget.patch added by alotau (4KiB - application/octet-stream)
file #16744:  http.c.patch added by alotau (2KiB - application/octet-stream)
file #16745:  http.h.patch added by alotau (301B - application/octet-stream)

 

Depends on the following items: None found

Items that depend on this one: None found

 

Carbon-Copy List
  • -email is unavailable- added by frank1982 (Posted a comment)
  • -email is unavailable- added by ronald87 (Posted a comment)
  • -email is unavailable- added by julias4 (Posted a comment)
  • -email is unavailable- added by gscrivano (Posted a comment)
  • -email is unavailable- added by nok (Posted a comment)
  • -email is unavailable- added by nok
  • -email is unavailable- added by rockdaboot (Updated the item)
  • -email is unavailable- added by frank (Updated the item)
  • -email is unavailable- added by robman (Posted a comment)
  • -email is unavailable- added by gsauthof
  • -email is unavailable- added by timido (Posted a comment)
  • -email is unavailable- added by kayhadrin (Posted a comment)
  • -email is unavailable- added by fossjn (Posted a comment)
  • -email is unavailable- added by alotau (Posted a comment)
  • -email is unavailable- added by cy6erbr4in (Updated the item)
  • -email is unavailable- added by micahcowan (Updated the item)
  • -email is unavailable- added by johndoe32102002 (Submitted the item)

    Follow 37 latest changes.

    Date        Changed by   Updated Field     Previous Value => Replaced by
    2017-04-05  rockdaboot   Discussion Lock   None => Locked
    2012-09-29  gscrivano    Status            Needs Discussion => Fixed
                             Assigned to       None => gscrivano
                             Open/Closed       Open => Closed
    2012-09-28  gscrivano    Assigned to       alotau => None
    2012-07-09  nok          Carbon-Copy       - => Added -email is unavailable-
    2012-05-29  rockdaboot   Attached File     - => Added 0001-filename-length-patch-v5-1.13.4.patch, #25949
    2012-05-18  rockdaboot   Attached File     - => Added 0001-filename-length-patch-v4-1.14.patch, #25881
    2011-11-27  frank        Attached File     - => Added wget_filename_length.v3-1.13.patch, #24473
    2011-06-13  frank        Attached File     - => Added wget_filename_length.v2-1.12.patch, #23520
    2011-01-30  gsauthof     Carbon-Copy       - => Added gsauthof
    2009-09-08  micahcowan   Planned Release   1.14 => 1.12.x
    2009-09-01  micahcowan   Status            In Progress => Needs Discussion
    2009-08-30  micahcowan   Status            Awaiting Approval => In Progress
                             Planned Release   1.12 => 1.14
    2009-08-18  micahcowan   Priority          6 => 7 - High
                             Planned Release   1.14 => 1.12
    2009-07-05  micahcowan   Status            Ready For Test => Awaiting Approval
                             Planned Release   1.12 => 1.14
    2009-02-06  alotau       Status            In Progress => Ready For Test
    2009-02-06  alotau       Attached File     - => Added wget_filename_length.v2.patch, #17414
    2009-02-06  alotau       Attached File     - => Added wget_file_length.patch, #17409
    2008-11-04  micahcowan   Assigned to       None => alotau
    2008-10-29  alotau       Attached File     - => Added wget.patch, #16746
    2008-10-28  micahcowan   Assigned to       cy6erbr4in => None
