How to dump all texts of a web site


Recently I wanted to get all the texts from a web site to prepare a translation. Although I found a tip here and there, here is an easy solution on a Linux system.
Simply use lynx:

lynx -crawl -traversal <url>

(just replace <url> with the URL you want to get the texts from).

To get all HTML pages (no images etc.) you can use wget:
wget -r -k -L -A htm,html <url>

With lynx you will get all texts as .dat files and can start reading/translating offline.
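The crawl leaves one plain-text dump per page, numbered lnk00000000.dat upward. A minimal follow-up sketch to merge them into a single file for offline translation (the output name all-texts.txt is my own choice, not from the original post):

```shell
# Merge the per-page text dumps written by `lynx -crawl -traversal`
# into one file. The shell expands lnk*.dat in sorted order, which
# matches the order in which lynx numbered the crawled pages.
cat lnk*.dat > all-texts.txt 2>/dev/null || true

# Quick sanity check of how much text was captured.
wc -l all-texts.txt
```

From there you can hand all-texts.txt to a translator or open it in any editor.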
