Welcome!
There is a page with a list of URLs that I need to walk through, visiting each one to do something (and possibly follow further links from it). The problem is that after the first link, the loop never gets to the next one.
url1 = '....'
@browser = Capybara.current_session
@browser.visit url1
@browser.all('.registerBox table').each do |item|
  link_on_uid = item.first('.descripTd dt a')
  url2 = link_on_uid['href']
  puts url2
  @browser.visit url2
  @browser.all('.boxWrapper>table>tbody>tr').each do |tr|
    .....
  end
  puts 123
end
That is roughly the code. When I run it in the console, after the first 123 is printed nothing happens for a long time, and then the process exits without any errors. The console output is just two lines: the URL of the page and 123.
If I comment out the `@browser.visit url2` line, the loop prints all the expected links from the page.
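The usual cause of this symptom is that the node objects Capybara's `all` returns go stale as soon as `visit` navigates the session to another page, so the second iteration silently finds nothing. A common fix is to copy the hrefs out as plain strings first, and only then start visiting them. Below is a minimal runnable sketch of that collect-first pattern; `StubSession` is a made-up stand-in for the real Capybara session (the URLs and row names are invented), so only the pattern itself carries over:

```ruby
# Stand-in for a Capybara session: `visit` changes the current page,
# `items` returns what is visible on that page. In real code these
# would be @browser.visit and @browser.all(...).
class StubSession
  PAGES = {
    'list'                 => ['http://example.com/a', 'http://example.com/b'],
    'http://example.com/a' => ['row-a1', 'row-a2'],
    'http://example.com/b' => ['row-b1']
  }.freeze

  def visit(url)
    @current = url
  end

  def items
    PAGES.fetch(@current)
  end
end

browser = StubSession.new
browser.visit 'list'

# 1. Collect the link targets as plain strings BEFORE navigating away;
#    strings survive navigation, Capybara nodes do not.
urls = browser.items.dup

# 2. Only now visit each URL and scrape its rows.
rows = urls.flat_map do |url|
  browser.visit url
  browser.items
end

p rows  # => ["row-a1", "row-a2", "row-b1"]
```

In real Capybara code step 1 would be something like `urls = @browser.all('.registerBox table').map { |item| item.first('.descripTd dt a')['href'] }`, followed by a plain `urls.each { |url2| @browser.visit url2; ... }`.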
How do you use save_and_open_page? I understand that the page gets saved locally, but what do you do with it next? Parse it with Nokogiri, or open it in a separate Capybara session? - Clare63 commented on June 8th 19 at 17:02
It's a debugging tool for when you don't know what is happening - you save the page and look at what's actually there. - jany.Rolfson commented on June 8th 19 at 17:05