Capturing a website to view offline

  1. Nov 6, 2006 #1

    DaveC426913

    Gold Member

    Anyone know of a convenient way to capture a whole (flat HTML) website so it can be viewed offline? I mean, other than file by file and image by image.
     
  3. Nov 6, 2006 #2

    turbo

    Gold Member

    Google on "copy website" for options. There are applications that will crawl through entire websites and download them to your HD.
     
  5. Nov 6, 2006 #4

    NoTime

    Science Advisor
    Homework Helper

    If all you want is a screen image...
    Use Alt-PrintScreen to copy the active window.
    Paste the image into Paint, Imaging, Photoshop, etc.
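    If you'd rather script the capture than go through the clipboard, a minimal Python sketch using the Pillow library does much the same job (my assumption, not a tool mentioned above; note it grabs the full screen rather than the active window):

        # Screenshot sketch using Pillow (pip install Pillow).
        # ImageGrab works on Windows and macOS.
        from PIL import ImageGrab

        shot = ImageGrab.grab()   # capture the current screen contents
        shot.save("screen.png")   # save it as a PNG file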
     
  7. Nov 6, 2006 #6

    jtbell

    Staff: Mentor

    If it's a large web site, the owner may not appreciate having a robot crawl all over and suck up hundreds of megabytes of content at once. It puts a huge load spike on his server, and he may have to pay his provider based on traffic above a certain threshold.
     
  8. Nov 6, 2006 #7

    DaveC426913

    Gold Member

    Never mind, I found WinHTTrack, a site-capture tool.

    Wow, and just as well, this site is monstrous. I had no idea. It's a yearbook site spanning 75 years. I'm over 100 MB / 10,000 files so far.

    All this, so my dad can look at it from a CD, rather than online...

    The things I do...
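    For anyone searching later: WinHTTrack is the Windows front end for the HTTrack engine, which can also be driven from the command line. A sketch (the URL, output folder, and filter are placeholders, not my actual settings):

        # Mirror a site into a local folder; the +filter keeps the crawl
        # on the original domain instead of following off-site links.
        httrack "http://example.com/" -O "C:\mirror\example" "+*.example.com/*" -v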
     
    Last edited: Nov 6, 2006
  9. Nov 6, 2006 #8

    0rthodontist

    Science Advisor

    If you're using Linux, you can use wget. You can do this on Windows too if you download a Windows build of wget.
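    A typical mirroring invocation, as a sketch (example.com is a placeholder; --wait adds a pause between requests, which goes some way toward jtbell's server-load concern above):

        # Recursively mirror the site, pull in the images/CSS needed to
        # render each page, rewrite links for offline viewing, stay within
        # the starting directory, and pause 1 second between requests.
        wget --mirror --page-requisites --convert-links --no-parent --wait=1 http://example.com/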
     
  10. Nov 7, 2006 #9

    DaveC426913

    Gold Member

    Phew. 700 MB, 15,800 files, 6 hours to download.

    I'll bet the site owner hates me.
     
  11. Nov 7, 2006 #10
    Agreed! Private crawling is hated among webmasters, because most of the time it's some dude trying to rip or copy the site and then put up a copy on another site. There is actually a hack for vBulletin that lets you block hundreds of common private crawlers.
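    The same idea at the web-server level is a few lines of Apache .htaccess; a sketch, with illustrative user-agent strings rather than a real blocklist:

        # Refuse requests whose User-Agent matches a known offline copier.
        # These strings are examples only, not a complete blocklist.
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (HTTrack|WebCopier|WebZIP|Wget) [NC]
        RewriteRule .* - [F]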
     
  12. Nov 7, 2006 #11

    jtbell

    Staff: Mentor

    In my case, my personal hobby site (which has a large gallery of pictures) is on one of my college's Web servers, and I don't want to impact normal academic use.

    So I watch for robots and for my other pet peeve: people on forums who hotlink to several of my pictures in a single posting, which causes several hits on my server every time someone opens that thread.

    To counter this, first I set up my server to examine the referring URL whenever someone fetched a picture, and if it was from one of the offending sites, I sent instead a GIF with the red-bar-in-circle logo over the word "Hotlinking", and the URL of my terms of usage below.
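    In Apache terms, that kind of referer check is usually a few mod_rewrite lines in .htaccess. A sketch of the idea (the domain and image names are placeholders, not my actual setup):

        RewriteEngine On
        # Let through empty referers and requests from my own pages.
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mysite\.example/ [NC]
        # Don't rewrite the substitute image itself, or the rule would loop.
        RewriteCond %{REQUEST_URI} !/hotlinking\.gif$
        # Everything else asking for an image gets the "Hotlinking" GIF.
        RewriteRule \.(gif|jpe?g|png)$ /hotlinking.gif [L]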

    Then I saw that someone had started a thread titled "The scariest thing in the world!" which hotlinked directly to that GIF! So I took a thumbnail-sized JPEG of Alfred E. Neuman (the MAD magazine character) and substituted that. The thread became hilarious for a while, with new viewers seeing Alfred while previous viewers (including of course the original poster) still had my "scary" GIF in their browser caches. "What, me scary?"

    Eventually someone caught on and said, "hey dudes, refresh your cache!" but it was fun in the meantime. :biggrin:
     
  13. Nov 7, 2006 #12
    Try to hotlink one of PF's images :smile:
     
  14. Nov 7, 2006 #13
    Oh yes you can!

    :rolleyes: [images no longer available]
    Customize it!
    :biggrin:
     
  15. Nov 7, 2006 #14
    How do I delete these?

    [images no longer available]
    o:)
     
  16. Oct 9, 2007 #15
    Hmmm...
    To capture a whole (flat HTML) website from the browser (a command-line equivalent is sketched below):
    1. File
    2. Save As Type
    3. Webpage, Complete (*.htm, *.html)
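    As a rough sketch, the wget equivalent for a single page (the URL is a placeholder):

        # Fetch one page plus the images/CSS it needs, and rewrite its
        # links so the saved copy works offline.
        wget --page-requisites --convert-links http://example.com/page.html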
     