Why does searching the JSON file give a 500 error?

Good evening, everyone. Please help me with this task.

I have a form that searches for data in a JSON file. When the JSON file is small, everything works, but a search over the complete file returns a 500 Internal Server Error.
The JSON file is 274 MB.
Locally on the LAN the search works; on the server it does not. I suspect the problem may be server capacity, but maybe I am doing something wrong.

Here is the code from the .py file:
from flask import Flask, render_template, request
import json
import logging
from logging.handlers import RotatingFileHandler
import os

app = Flask(__name__)

path = 'data2.json'

@app.route('/', methods=['GET', 'POST'])
def hello_world():
    if request.method == 'POST':
        medi_metall = request.form['num_st']
        with open(path, 'r') as f:
            data = json.loads(f.read())
        for i in data:
            if i['katid'] == medi_metall:
                ss = 'inventory number: ' + i['katid']
                ss1 = 'use the document: ' + i['vidiz']
                ss2 = 'Area: ' + str(i['pl']) + ' sqm'
                ss3 = 'Address: ' + i['adres']
                ss4 = 'previous value: ' + str(i['prrez']) + ' RUB'
                return render_template('index.html', tt=ss, tt1=ss1, tt2=ss2,
                                       tt3=ss3, tt4=ss4)
    return render_template('index.html')



@app.route('/1.html', methods=['GET', 'POST'])
def send():
    if request.method == 'POST':
        medi_metall = request.form['num_st']
        with open(path, 'r') as f:
            data = json.loads(f.read())
        for i in data['employees']['employee']:
            if i['FIELD2'] == medi_metall:
                ss = 'to this number the order of application has already been made'
                return render_template('1.html', tt1=ss)
        ss = 'the customer first'
        return render_template('1.html', tt1=ss)
    return render_template('1.html')


if __name__ == '__main__':
    if not app.debug:
        if not os.path.exists('logs'):
            os.mkdir('logs')
        file_handler = RotatingFileHandler('logs/microblog.log', maxBytes=10240,
                                           backupCount=10)
        file_handler.setFormatter(logging.Formatter(
            '%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]'))
        file_handler.setLevel(logging.INFO)
        app.logger.addHandler(file_handler)
        app.logger.setLevel(logging.INFO)
        app.logger.info('Microblog startup')
    app.run()
June 3rd 19 at 19:09
2 answers
June 3rd 19 at 19:11
Solution
The JSON file is 274 MB. You load it into memory on every request, and the memory usage will be far more than those 274 MB. What happens when you get 10 concurrent requests? Databases were not invented for nothing.
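A minimal sketch of that approach, assuming records shaped like the asker's first route (the field names `katid`, `vidiz`, `pl`, `adres`, `prrez` are taken from the question; the file and table names are placeholders): import the JSON into SQLite once, then answer each request with an indexed point query instead of re-reading 274 MB.

```python
import json
import sqlite3


def build_db(json_path='data2.json', db_path='data2.sqlite'):
    # One-time import: copy the JSON records into an indexed SQLite table.
    conn = sqlite3.connect(db_path)
    conn.execute(
        'CREATE TABLE IF NOT EXISTS records '
        '(katid TEXT PRIMARY KEY, vidiz TEXT, pl REAL, adres TEXT, prrez REAL)')
    with open(json_path, 'r') as f:
        data = json.load(f)
    conn.executemany(
        'INSERT OR REPLACE INTO records VALUES (?, ?, ?, ?, ?)',
        ((i['katid'], i['vidiz'], i['pl'], i['adres'], i['prrez']) for i in data))
    conn.commit()
    conn.close()


def find_record(katid, db_path='data2.sqlite'):
    # Per-request lookup: the PRIMARY KEY index makes this fast and
    # keeps memory use constant regardless of file size.
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row
    row = conn.execute(
        'SELECT * FROM records WHERE katid = ?', (katid,)).fetchone()
    conn.close()
    return dict(row) if row else None
```

The Flask view would then call `find_record(request.form['num_st'])` instead of opening and parsing the JSON file on every POST.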
I agree. I moved the data into a database and it worked. I still thought it would be possible to manage without a database, but it did not work out. - Wilson.Buckridge commented on June 3rd 19 at 19:14
June 3rd 19 at 19:13
Too little code to go on.
