Richard Thomas
2018-08-22 18:21:36 UTC
Hi, hope this is the correct way to do this.
I want to be able to download a webpage and all its prerequisites and
turn it into a single multipart/related file. Now, this requires
identifying and rewriting URLs, which, as most members of this list are
no doubt aware, is a thorny problem. Fortunately, wget already does this
as part of its -p and -k options. Unfortunately, though it's amazingly
useful, the output is difficult to use for what I want.
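For anyone unfamiliar, the target format is one MIME message holding the
page plus its prerequisites, with each part tagged by a Content-Location
header that the other parts can reference (RFC 2557, the same idea as
MHTML). A hand-written skeleton to show the shape, not real output:

  Content-Type: multipart/related; boundary="frontier"; type="text/html"

  --frontier
  Content-Type: text/html
  Content-Location: http://example.com/index.html

  <html>... <img src="logo.png"> ...</html>
  --frontier
  Content-Type: image/png
  Content-Location: http://example.com/logo.png
  Content-Transfer-Encoding: base64

  iVBORw0KGgo...
  --frontier--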
So I am planning to add this functionality directly to wget. Either I'll
rewrite the links and filenames so that it's easy to piece together a
multipart/related file from what wget spits out, or I'll have wget
generate the multipart/related file itself (probably the latter, or
maybe both).
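To make the first option concrete, here's a rough sketch in Python
(illustration only; the real thing would be C inside wget) of the
assembly step I have in mind, assuming a page fetched with -p and -k
into a local directory. The function name, paths, and the choice of the
local relative path as Content-Location are all made up for the example:

  import mimetypes
  import os
  from email import encoders
  from email.mime.base import MIMEBase
  from email.mime.multipart import MIMEMultipart
  from email.mime.text import MIMEText

  def bundle(root_dir, root_doc, out_path):
      """Bundle a wget -p -k output tree into one multipart/related
      file, e.g. bundle('example.com', 'index.html', 'page.mht')."""
      msg = MIMEMultipart('related', type='text/html')
      # The root document must be the first part of a multipart/related.
      with open(os.path.join(root_dir, root_doc), encoding='utf-8') as f:
          root = MIMEText(f.read(), 'html')
      root.add_header('Content-Location', root_doc)
      msg.attach(root)
      # Attach every prerequisite, tagged with the rewritten relative
      # URL that -k left in the root document's links.
      for dirpath, _dirs, files in os.walk(root_dir):
          for name in files:
              path = os.path.join(dirpath, name)
              rel = os.path.relpath(path, root_dir)
              if rel == root_doc:
                  continue
              ctype, _ = mimetypes.guess_type(name)
              maintype, subtype = (ctype or
                                   'application/octet-stream').split('/')
              part = MIMEBase(maintype, subtype)
              with open(path, 'rb') as f:
                  part.set_payload(f.read())
              encoders.encode_base64(part)
              part.add_header('Content-Location', rel)
              msg.attach(part)
      with open(out_path, 'wb') as f:
          f.write(msg.as_bytes())

The awkward bit this papers over is that after -k the original URLs are
gone, so the parts can only be tagged with local relative paths; having
wget emit the file itself would let it keep the original URLs as
Content-Location, which is one reason I lean towards the second option.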
I was just wondering whether there's any interest before I go to the
trouble of feeding this back into the project. Also, any suggestions on
ways I can make this as useful as possible are welcome.
Rich