How can I get around a server's requests-per-second limit?

I want to download the entire wall of one VKontakte group. It has a lot of posts.
I learned that there is a wall.get method, but it can only be called 2,500 times a day. That is not enough.
Then I found out that scrolling down in the mobile version of the page fires this request:
POST https://m.vk.com/clubXYZ?offset=35&own=1
I tried it with requests, and it works. Removing own=1 also works.
I also found out that VK returns only 10 posts per request.
So if the group has 70,000 posts, I would have to make 7,000 requests. Each request takes 0.2 s, so that is about 23 minutes (and that is just one group).
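For scale, the pagination math below compares the two options. The 70,000-post figure comes from the question; note that the official wall.get method accepts a count parameter of up to 100 posts per call, which cuts the request count tenfold compared to the 10-post mobile pages:

```python
import math

TOTAL_POSTS = 70_000  # wall size assumed in the question


def request_count(total, page_size):
    """Number of paginated requests needed to cover `total` posts."""
    return math.ceil(total / page_size)


print(request_count(TOTAL_POSTS, 10))   # m.vk.com pages: 7000 requests
print(request_count(TOTAL_POSTS, 100))  # wall.get with count=100: 700 requests
```

At 100 posts per call, 700 requests fit comfortably inside the 2,500-calls-a-day budget mentioned above.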
I tried threads, which did not help; threads with proxies did not help; asynchronous requests did not help either.
I also tried asynchronous requests with proxies, but that is a hack on top of a hack and still nothing works.
What can I do so that VKontakte does not ban my requests?
And how do I use proxies, if they are needed?
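One way to use proxies without getting banned is to throttle per proxy rather than globally: rotate through the pool and make sure each individual IP stays under the server's rate limit. A minimal sketch of that idea, with hypothetical proxy addresses and a made-up interval (the actual network fetch is left out):

```python
import asyncio
import itertools
import time

PROXIES = ['socks5://proxy-a:1080', 'socks5://proxy-b:1080']  # hypothetical
MIN_INTERVAL = 0.2  # assumed: seconds between requests through the same proxy


class ProxyRotator:
    """Hand out proxies round-robin, spacing requests per proxy."""

    def __init__(self, proxies, min_interval):
        self._cycle = itertools.cycle(proxies)
        self._min_interval = min_interval
        self._last_used = {p: 0.0 for p in proxies}
        self._locks = {p: asyncio.Lock() for p in proxies}

    async def acquire(self):
        """Return the next proxy, sleeping if it was used too recently."""
        proxy = next(self._cycle)
        async with self._locks[proxy]:
            wait = self._min_interval - (time.monotonic() - self._last_used[proxy])
            if wait > 0:
                await asyncio.sleep(wait)
            self._last_used[proxy] = time.monotonic()
        return proxy


async def demo():
    rotator = ProxyRotator(PROXIES, MIN_INTERVAL)
    order = [await rotator.acquire() for _ in range(4)]
    print(order)  # alternates between the two proxies

asyncio.run(demo())
```

Each proxy then sees at most one request per MIN_INTERVAL no matter how many tasks run concurrently, which is what keeps any single IP below the per-second limit.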
Code

import random
import asyncio
import aiohttp
import aiohttp_socks
from aiohttp import ClientSession
from aiohttp_socks import SocksConnector
import pickle

storage = []

proxies = ['46.4.96.137:1080', '134.0.116.219:1080', '207.154.231.212:1080', '207.154.231.213:1080', '138.68.161.60:1080', '82.196.11.105:1080', '178.62.193.19:1080', '188.226.141.127:1080', '207.154.231.211:1080', '207.154.231.216:1080', '88.198.50.103:1080', '188.226.141.61:1080', '188.226.141.211:1080', '176.9.119.170:1080', '207.154.231.217:1080', '138.68.161.14:1080', '138.68.165.154:1080', '176.9.75.42:1080', '95.85.36.236:1080', '138.68.173.29:1080', '139.59.169.246:1080']


async def fetch(url, i):
    # Retry until the response body is long enough to contain posts
    # (a short body means VK returned an error or stub page).
    length = 1
    body = b''
    while length < 10000:
        await asyncio.sleep(random.randint(0, 10))
        proxy = random.choice(proxies)
        try:
            # The SOCKS connector already routes traffic through the proxy,
            # so no extra proxy= argument is needed on the request itself.
            async with ClientSession(connector=SocksConnector.from_url('socks5://' + proxy)) as session:
                async with session.post(url, data={'offset': i}) as response:
                    body = await response.read()
                    length = len(body)
                    print(length)
        except aiohttp.client_exceptions.ServerDisconnectedError:
            await asyncio.sleep(3)
        except aiohttp_socks.proxy.errors.ProxyError:
            await asyncio.sleep(3)
    storage.append(body)
    return body


async def bound_fetch(sem, url, i):
    # Getter function, gated by the semaphore.
    async with sem:
        await fetch(url, i)


async def run(r):
    url = 'https://m.vk.com/sketch.books'
    tasks = []
    # Cap the number of simultaneous requests.
    sem = asyncio.Semaphore(1000)

    # Schedule one task per page of 10 posts.
    for i in range(0, r + 1, 10):
        task = asyncio.ensure_future(bound_fetch(sem, url, i))
        tasks.append(task)

    await asyncio.gather(*tasks)


number = 70610
loop = asyncio.get_event_loop()

future = asyncio.ensure_future(run(number))
loop.run_until_complete(future)

print(len(storage))
with open('sketch_books_2.vk', 'wb') as f:
    pickle.dump(storage, f)

April 7th 20 at 10:43
1 answer
April 7th 20 at 10:45
Solution
Use proxies, or parse the page directly.
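If you go the "parse the page directly" route, the post bodies have to be pulled out of the mobile HTML. A sketch with the standard-library parser; the pi_text class name is an assumption about m.vk.com's markup at the time, so check the actual HTML:

```python
from html.parser import HTMLParser


class PostExtractor(HTMLParser):
    """Collect the text of <div class="pi_text"> elements (assumed post-body class)."""

    def __init__(self):
        super().__init__()
        self._depth = 0  # nesting depth inside a pi_text div
        self.posts = []

    def handle_starttag(self, tag, attrs):
        if self._depth and tag == 'div':
            self._depth += 1
        elif tag == 'div' and ('class', 'pi_text') in attrs:
            self._depth = 1
            self.posts.append('')

    def handle_endtag(self, tag):
        if self._depth and tag == 'div':
            self._depth -= 1

    def handle_data(self, data):
        if self._depth:
            self.posts[-1] += data


# Stand-in for a real m.vk.com response body.
html = ('<div class="wall_item"><div class="pi_text">first post</div></div>'
        '<div class="wall_item"><div class="pi_text">second post</div></div>')

parser = PostExtractor()
parser.feed(html)
print(parser.posts)  # ['first post', 'second post']
```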
Which kind of proxy do these requests need: HTTP, HTTPS, SOCKS4, or SOCKS5? - morgan66 commented on April 7th 20 at 10:48
@morgan66 before asking, it would be a good idea to get acquainted with these protocols))) - Cordell_Stroman27 commented on April 7th 20 at 10:51

Find more questions by tags: Python, API, VKontakte