Questions tagged [Wget] (101)

1
answer

Is there a new wget for Windows?

Is there a new wget for Windows? I searched here and on other sites, but my copy has stopped connecting to HTTPS sites. Is there any version where HTTPS works?
mckenna67 asked April 8th 20 at 17:47
4
answers

How to download links from a file with wget and save each index.html under the site's name?

There is a file with sites *.RU: 01-PLAN.RU 01-POKROV.RU 01-PRINT.RU 01-PROFI.RU 01-PTM.RU 01-R.RU 01-REGION.RU 01-REMONT.RU 01-RU.RU 01-S.RU 01-SB.RU 01-SBERBANK.RU and then about 5 million more records. I need wget to download the main pages of the sites in multiple threads (if the site is up). I found a snippet like this: cat ru.txt | ...
Abdul_Lehner asked April 8th 20 at 10:38
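A minimal sketch of one common approach, assuming GNU xargs and a ru.txt file with one domain per line (the thread count, timeout, and output naming are all assumptions, not taken from the question):

```shell
# Run 8 downloads in parallel (-P 8), one domain per job (-n 1).
# Each homepage is saved under its own domain name instead of the
# default index.html; dead sites are skipped after one short try.
xargs -P 8 -n 1 -I {} \
  wget -q --timeout=10 --tries=1 -O "{}.html" "http://{}/" < ru.txt
```

For 5 million records a dedicated crawler would scale better, but this pattern works for a first pass.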
1
answer

How to copy website with login?

I tried wget. I ran this: wget --save-cookies cookies.txt --post-data 'login=loginuser&password=passuser' https://site.com/account I get an error: HTTP request sent, awaiting response... 500 Internal Server Error 2020-02-19 16:05:01 ERROR 500: Internal Server Error. It seems I just can't get through. I'm not well versed in wget,...
mitchell_Yun asked April 7th 20 at 16:06
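One common pattern is to log in first and reuse the session cookie for later requests. This is a sketch only: the /login path and field names are placeholders, and it assumes the site uses plain cookie-based sessions:

```shell
# Step 1: POST the credentials and save the session cookie.
# --keep-session-cookies is required for cookies with no expiry
# date, which wget otherwise discards.
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'login=loginuser&password=passuser' \
     -O /dev/null https://site.com/login

# Step 2: reuse the cookie jar for the authenticated pages.
wget --load-cookies cookies.txt https://site.com/account
```

A 500 on the login POST often means the form expects extra hidden fields (for example a CSRF token) that must first be scraped from the login page.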
1
answer

How to copy the mobile version of the website using wget?

The site I want to copy enables its mobile version via JS, and only at mobile screen resolutions, so wget only grabs the desktop version. Is it possible to somehow emulate a low resolution and download the site?
Ruthie.Ko asked April 7th 20 at 11:25
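A sketch of the closest wget can get: sending a mobile User-Agent header (the UA string and site URL below are placeholders). This only helps if the server chooses the mobile layout from the UA; wget does not execute JavaScript, so a layout switched purely by JS media queries cannot be captured this way and would need a headless browser instead:

```shell
# Pretend to be a phone via the User-Agent request header.
UA='Mozilla/5.0 (iPhone; CPU iPhone OS 13_3 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148'
wget --user-agent="$UA" -r -l 2 -p -k https://example.com/
```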
1
answer

Bash, wget, POST, and brackets: how to make them work together?

Good evening! Using wget I send the following POST request: --post-data="name=$name&direction[]=1,3,5" The server processes the second part with an error. How can I work around this? Thanks in advance.
hugh asked April 4th 20 at 13:15
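Square brackets are reserved URI characters, so percent-encoding them usually fixes the server-side parse error. A sketch (the endpoint is a placeholder, and it assumes the server expects the array field repeated once per value, as PHP-style backends do, rather than a comma list):

```shell
name=test
# Percent-encode [ as %5B and ] as %5D.
field=$(printf 'direction[]' | sed 's/\[/%5B/g; s/\]/%5D/g')
wget --post-data="name=${name}&${field}=1&${field}=3&${field}=5" \
     https://example.com/endpoint
```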
2
answers

How to make wget in bash save each file under a name taken from part of the link?

Good day. There is a file with 10,000+ links that I need to download with wget. The links in the file look like: google.com/page?=abcde They will be stored as HTML pages. So if I run wget -i links.txt, the files come out as index.html, index.html.1, index.html.2. Is there a way to make wget save ...
eden_Halvorson88 asked April 4th 20 at 12:42
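One approach is to loop over the file and derive each output name from the tail of the link, passing it to wget -O. A sketch, assuming links.txt holds one URL per line; the sed pattern for sanitizing file names is an assumption about which characters matter:

```shell
# Take the part after the last slash, replace characters that are
# awkward in file names, and save each page under that name.
while IFS= read -r url; do
  name=$(printf '%s' "$url" | sed 's|.*/||; s|[?=&]|_|g')
  wget -q -O "${name}.html" "$url"
done < links.txt
```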
1
answer

How to save an exact copy of a site with relative links using wget?

There is a site that I want to copy locally while keeping an exact copy of its relative URLs. All posts end in .html, but after downloading, the browser shows "index.html@p=134.html". What do I need to add to the command wget -r -k -l 3 -p -E -nc to get an exact copy with relative links in the brows...
leda.Zboncak asked April 4th 20 at 01:20
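Since the posts already end in .html, one likely culprit is -E (--adjust-extension), which appends an extra .html to pages fetched through a query string and produces names like index.html@p=134.html. A sketch of the same mirror without it (the site URL is a placeholder):

```shell
# -r recurse, -l 3 depth limit, -p page requisites, -k rewrite
# links to local relative paths, -nc no clobber; -E is dropped so
# no extra ".html" is appended to query-string URLs.
wget -r -l 3 -p -k -nc https://example.com/
```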
1
answer

How to block a site from being downloaded with wget?

How to block a site from being downloaded with wget?
Estella30 asked April 4th 20 at 01:14
2
answers

How to download with wget by a mask without knowing the last page number?

There is a link like this: somename.livejournal.com/593.html The number before .html can be anything; the IDs increase in order, so the next one might be somename.livejournal.com/22593.html, but I don't know the last number. Is it possible to download all existing posts with one wget command? If yes, how? Huge request to ...
ludie asked April 4th 20 at 00:33
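Not with a single wget invocation when the upper bound is unknown, but a loop can probe IDs in order and stop after a long run of misses. A sketch: the cutoff of 500 consecutive failures is a guess (post IDs on such sites are typically sparse), and it relies on wget returning a nonzero exit status for 404 responses:

```shell
# Try post IDs in order; reset the miss counter on every hit and
# give up only after 500 consecutive failures.
misses=0
i=1
while [ "$misses" -lt 500 ]; do
  if wget -q "https://somename.livejournal.com/${i}.html"; then
    misses=0
  else
    misses=$((misses + 1))
  fi
  i=$((i + 1))
done
```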
3
answers

Why can't I download the site with wget?

I can't download the site ( www.sound-of-change.com/# ) with wget. Why? More precisely, it downloads one page and that's all. Wget works fine with other sites, but not this one. What can I do? I really need to study this site!
madalyn.McGlyn asked April 4th 20 at 00:18