Timeout on requests

Hey, could someone give me a hand?

I'm trying to fetch data from a third-party API and store it in Heroku Postgres, and I'm having problems with Heroku's 30-second request timeout.
Is there some way to make partial, paginated requests so I can get all the data without the timeout blowing up?

Thanks!

This doesn't appear to be a Django issue. You might get faster and more specific assistance through a Heroku-specific support venue, along with checking with the API provider to see whether they offer a pagination-style response. (You can't control that - any ability to retrieve a partial response must be supported by the API.)


Thanks for the answer!

I'm developing an app to help with data analysis.
I already built a JSON pagination loop, but it takes too long to process, and on Heroku the request time limit is 30 seconds. So is there a way to handle this? I just want to fetch all the data and build a list of dictionaries so I can save it onto a model object.

Some of it:

import json

import requests

mlb = []
page = 1
while True:
    response = requests.get(
        f"https://bling.com.br/Api/v2/contatos/page={page}/json/?apikey={apikey}"
    )
    r = json.loads(response.content.decode('utf-8'))

    # Stop when the API returns no more contacts for this page
    get_r = r['retorno'].get('contatos')
    if not get_r:
        break

    mlb.extend(get_r)
    print('GET_R salvo')

    page += 1
The paging part does not really help here – you are still going through all the pages inside a single Django request (I think).
You have while page, which will keep looping until all the data is loaded.

The HTTP request is not the best place to do a long-running task. It is better to fetch and store (or cache) the data ahead of time, and then send the stored data to your client from the view.
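
As a rough sketch of what that could look like (the command name, model, and fields are placeholders, and the exact JSON shape depends on the API), you could move the loop into a management command and run it from the Heroku Scheduler or a worker dyno, outside the web request:

# myapp/management/commands/import_contatos.py (hypothetical path)
import requests
from django.core.management.base import BaseCommand

from myapp.models import Contato  # hypothetical model


class Command(BaseCommand):
    help = "Fetch all contacts from the API and store them in the database"

    def handle(self, *args, **options):
        apikey = "..."  # read this from settings or an environment variable
        page = 1
        while True:
            response = requests.get(
                f"https://bling.com.br/Api/v2/contatos/page={page}/json/?apikey={apikey}"
            )
            contatos = response.json()['retorno'].get('contatos')
            if not contatos:
                break  # no more pages

            # Save each page as it arrives; field names are assumptions
            Contato.objects.bulk_create(
                [Contato(nome=c.get('nome', '')) for c in contatos],
                ignore_conflicts=True,
            )
            page += 1

Then the view only has to read from your own database, which stays well under the 30-second limit.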


Thanks, you helped me a lot!