How to protect a database with critical data from runaway slow queries?
Asked by penny shima glanz
Situation: there is a production server running a web server and MySQL. The database is used, first, by PHP scripts running under Apache on the same server and, second, by remote users connecting over TCP. Both work with the same database, but only the performance of the local "Apache plus MySQL" pair is critical; "remote users plus MySQL" is not.
A remote user launches a clumsy query, for example a SELECT with REGEXP on a non-indexed column of a 20-million-row table. For the next 3-5 minutes, all other queries against this database, which normally fly, slow down, and the critical web part stops responding at an acceptable speed. How can remote users be allowed to send sloppy queries without harming the local connections to the database? The local and remote clients connect to the database under different users. Splitting the data into two databases is not an option. The server has performance headroom to spare (8 cores, 24 GB of RAM).
Do you really need your users to be able to run their own hand-rolled queries? Can't they get by with a set of predefined report queries, or something like that? Can't they submit the queries they want to run to a competent person who reviews them and executes them? And how do you guard against accidental deletes and the like: do the users have read-only privileges on all database objects?
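If read-only access turns out to be enough, a minimal way to enforce it in MySQL is to grant only SELECT to the account the remote users connect under. The account and schema names below are made up purely for illustration:

    -- Hypothetical account and schema names, for illustration only.
    CREATE USER 'report_user'@'%' IDENTIFIED BY 'change-me';
    -- Read-only: SELECT on everything in app_db, no other privileges.
    GRANT SELECT ON app_db.* TO 'report_user'@'%';

This does not make heavy queries any cheaper, but it rules out accidental deletes and updates from the remote side.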
If giving users this kind of access really is that important to you... then, as others have already said here, set up a slave server with replication for them, or configure per-user resource quotas (CPU usage, I/O bandwidth, memory) for the accounts your users connect to the database under. I am not strong in MySQL myself, so I do not know how things stand there with resource quotas; a sketch of what seems to be available is below.
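For what it's worth, a sketch of what MySQL itself offers in this direction, assuming a hypothetical account name remote_user and a hypothetical table big_table. The per-account limits (MAX_QUERIES_PER_HOUR and friends) have existed for a long time; the max_execution_time statement timeout appeared only in MySQL 5.7.8 and applies to SELECT statements only; the CPU-pinning resource groups of MySQL 8.0 are not shown here.

    -- Per-account limits for the account the remote users connect under
    -- (account name is hypothetical). MySQL 5.7+ syntax shown; older
    -- versions set the same options via GRANT USAGE ... WITH.
    ALTER USER 'remote_user'@'%'
        WITH MAX_QUERIES_PER_HOUR 500
             MAX_USER_CONNECTIONS 10;

    -- MySQL 5.7.8+: abort any SELECT running longer than 5 seconds
    -- (value in milliseconds; 0 means no limit). Can be set per session
    -- or as a global default.
    SET SESSION max_execution_time = 5000;

    -- The same cap can be applied to a single query with an optimizer hint:
    SELECT /*+ MAX_EXECUTION_TIME(5000) */ *
    FROM big_table
    WHERE some_column REGEXP 'pattern';

A read-only replication slave, as suggested above, isolates the remote users even more thoroughly: their heavy queries then compete only with each other, not with the web front end.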