How to minimize MySQL load in a heavily visited app?

Good day, everyone!

Colleagues, I'm interested in the following question: what would be best to use, and how best to organize multiplayer action within a single application?

Here's the situation: there is a module that provides access to several locations. When a user logs in, they are identified in the database and placed in one of these locations. When the user clicks on a location, the corresponding record is updated. In each location you can attack anyone who is currently in the same place. The database has a table that duplicates the user's stats specifically for this module; if a given cell reaches 0 after an attack, the record is deleted and the user is moved back to the starting location.
Question: with this approach it is clear that the database will cope with 50-100 users in the module, but with 1000+ there will be a heavy load on MySQL because of the large number of queries. Any ideas how to get around this? Would it make sense to use SQLite instead? Or something better?
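To make the load concrete, the per-attack query pattern described above can be sketched like this. This is a hedged illustration, not the asker's actual code: an in-memory SQLite database stands in for MySQL, and the table and column names (`module_stats`, `hp`, `location`) are invented.

```python
import sqlite3

# In-memory SQLite stands in for MySQL here; the query pattern is the same.
# Table/column names (module_stats, hp, location) are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE module_stats (
        user_id  INTEGER PRIMARY KEY,
        location INTEGER NOT NULL,
        hp       INTEGER NOT NULL
    )
""")
conn.execute("INSERT INTO module_stats VALUES (1, 2, 30), (2, 2, 10)")

def attack(conn, attacker_id, target_id, damage):
    """One attack = one UPDATE, plus a SELECT and maybe a DELETE."""
    cur = conn.execute(
        "UPDATE module_stats SET hp = hp - ? "
        "WHERE user_id = ? AND location = "
        "(SELECT location FROM module_stats WHERE user_id = ?)",
        (damage, target_id, attacker_id),
    )
    if cur.rowcount == 0:
        return "miss"          # target is not in the attacker's location
    (hp,) = conn.execute(
        "SELECT hp FROM module_stats WHERE user_id = ?", (target_id,)
    ).fetchone()
    if hp <= 0:
        # Defeated: remove the row; the user respawns at the start location.
        conn.execute("DELETE FROM module_stats WHERE user_id = ?", (target_id,))
        return "defeated"
    return "hit"

result = attack(conn, 1, 2, 4)
```

Every attack costs two to three queries, which is why the load grows so fast with 1000+ concurrent users.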
September 26th 19 at 06:06
4 answers
September 26th 19 at 06:08
Try Redis
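A sketch of what moving this module's state into Redis might look like. To keep the snippet self-contained, a plain dict stands in for the Redis server; comments note the equivalent real Redis commands (via the redis-py client). Key names are invented for illustration.

```python
# A plain dict stands in for a Redis server so this sketch is self-contained;
# comments show the real Redis command each step would map to.
cache = {}

def join_location(user_id, location, hp):
    # Real Redis: r.hset(f"user:{user_id}", mapping={"location": location, "hp": hp})
    cache[f"user:{user_id}"] = {"location": location, "hp": hp}

def attack(attacker_id, target_id, damage):
    a = cache.get(f"user:{attacker_id}")
    t = cache.get(f"user:{target_id}")
    if not a or not t or a["location"] != t["location"]:
        return "miss"
    # Real Redis: hp = r.hincrby(f"user:{target_id}", "hp", -damage)
    t["hp"] -= damage
    if t["hp"] <= 0:
        # Real Redis: r.delete(f"user:{target_id}")
        del cache[f"user:{target_id}"]
        return "defeated"
    return "hit"

join_location(1, 2, 30)
join_location(2, 2, 10)
```

Since every operation is an in-memory key lookup, MySQL is only needed for the occasional persistent save, not for every attack.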
September 26th 19 at 06:10
Will these 1000+ users of yours really make requests at the same time?
It is not clear which queries you send to the database.
It is not clear what calculations are performed on the data.

Please describe in more detail.

Maybe you can simply move some of the load from the database to files. For example, that same user data that is duplicated.
Yes, the requests do go at the same time.
There is a query to the database that fetches the user's current stats.
The retrieved data then goes through ordinary math operations, and the results are saved. - elmore_Kohl commented on September 26th 19 at 06:13
Well, then you should think about moving the user data into a file (as a PHP array) and accessing it directly. That will be faster than any workaround on the database side. - Cristal commented on September 26th 19 at 06:16
I thought about that: move the data into temporary file storage, perform the operations there, and write the result back to the database at the end. But with such volumes the file would be extremely large, and to find a particular user's data the script would have to read it from beginning to end, for every single user. I'm afraid the response time would drop significantly. That's what bothers me... - elmore_Kohl commented on September 26th 19 at 06:19
First, only the static data needs to be moved out. As I understand it, in your case that is the user settings. I.e., unload the user data from the database into a file and then work with the file, bypassing the database.
Secondly, write the data not into one file but into many: each user gets a separate file with their own data, named after the user's ID. - Cristal commented on September 26th 19 at 06:22
September 26th 19 at 06:12
Why write these data to disk storage all the time in the first place?
September 26th 19 at 06:14
For scenarios like this there is in-memory caching. Look towards Redis or Memcached.
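Combining this answer with the previous one (don't hit disk on every action), one common pattern is a write-behind cache: serve reads and writes from memory and flush only changed entries to MySQL in batches. A minimal sketch, where `write_batch` is a hypothetical placeholder for the real batched MySQL write:

```python
# Write-behind cache sketch: reads/writes hit an in-memory dict,
# and only dirty entries are flushed to the database in batches.
# write_batch is a hypothetical stand-in for the real MySQL write.
cache = {}
dirty = set()

def set_hp(user_id, hp):
    cache[user_id] = hp
    dirty.add(user_id)          # mark for the next flush instead of writing now

def flush(write_batch):
    """Flush all dirty entries at once; 1000 attacks -> one DB round trip."""
    batch = {uid: cache[uid] for uid in dirty}
    write_batch(batch)          # e.g. one multi-row UPDATE against MySQL
    dirty.clear()
    return len(batch)

set_hp(1, 26)
set_hp(2, 4)
set_hp(1, 22)   # overwrites the earlier value; still only one dirty entry
```

Redis and Memcached give you the in-memory store part of this out of the box; the periodic flush to MySQL is what keeps the data durable.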

Find more questions by tags: Laravel, MySQL, PHP