How do I protect a database holding critical data from runaway slow queries?

Asked by penny shima glanz

The situation: there is a production server running both a web server and MySQL. The database is used, first, by PHP scripts running under Apache on the same machine, and second, by remote users connecting over TCP. Both work with the same database. The performance of the local "Apache plus MySQL" pair is critical; the "remote users plus MySQL" pair is not.

A remote user fires off a clumsy query, for example a SELECT with REGEXP on a non-indexed column of a table with 20 million rows. For the next 3 to 5 minutes, all other queries against that database, which normally fly, crawl, and as a result the critical web part stops responding at an acceptable rate. How can remote users be allowed to send sloppy queries without harming the local connections to the database? The "local" clients and the "remote" ones connect as different MySQL users. Splitting the database into two is not an option. The server has headroom to spare (8 cores, 24 GB of RAM).

Comments:
The problem was partially solved by significantly increasing max_connections (default 151) and back_log (default 50) in my.cnf; this let us make full use of the server's spare capacity and noticeably raised the threshold at which things start to slow down. - j
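For reference, the tweak the commenter describes would look roughly like this in my.cnf; the new values here are illustrative, not the commenter's actual numbers:

```ini
# my.cnf, [mysqld] section (values are illustrative)
[mysqld]
max_connections = 500   # default 151; allow more concurrent clients
back_log        = 200   # default 50; larger queue for pending TCP connections
```

Raising these helps with connection bursts, but note it does nothing to make any individual slow query cheaper.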

Answers

krystina
Move reads for the remote users onto a slave server (it can live on the same hardware, just put it on a separate HDD), and send writes to the master. You can split reads and writes with MySQL Proxy: forge.mysql.com/wiki/MySQL_Proxy_RW_Splitting.
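A minimal sketch of the master/slave split this answer suggests; the server ids, the second instance on the same box, and the replication account are all assumptions, not a tested setup:

```ini
# master instance, my.cnf (serves the local Apache/PHP traffic)
[mysqld]
server-id = 1
log-bin   = mysql-bin        # binary log feeds the slave

# slave instance, my.cnf (same box, data on a separate HDD; remote users read here)
[mysqld]
server-id = 2
read_only = 1
# then, on the slave, point it at the master and start replication:
#   CHANGE MASTER TO MASTER_HOST='127.0.0.1', MASTER_USER='repl', ...;
#   START SLAVE;
```

A slow SELECT from a remote user then only starves the slave, not the instance the web scripts depend on.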
Replies:
that stupid proxy script does not preserve the connection encoding; that is, SET NAMES will not be re-applied on the second connection. - eva warner
lisa nelson
you can configure replication (the problem statement does not say whether the remote clients modify the database): the master serves the local server, while remote clients and the archival process are pointed at the slave. For example, backing up tables of more than a million rows is best done against the slave server.
Replies:
(the first answer addresses the question better) - paul
geta t
Write a daemon that checks the remote user's current MySQL queries at the required frequency and kills any that have been running longer than ...
Replies:
You would kill everything. The problem statement does not say the queries are useless. - mary dawn
The statement says that "the performance of the remote users plus MySQL pair is not critical". - jordan halsey
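The watchdog daemon proposed above could be sketched like this, assuming the remote clients connect as a hypothetical account `remote_app`, a 30-second cutoff, and a local `mysql` CLI with the PROCESS and SUPER privileges; all the names and thresholds are assumptions, not the poster's setup:

```python
import subprocess
import time

# Illustrative assumptions: remote clients use the MySQL account "remote_app",
# and 30 seconds is the longest query we tolerate from them.
REMOTE_USER = "remote_app"
MAX_SECONDS = 30

def ids_to_kill(rows, user=REMOTE_USER, max_seconds=MAX_SECONDS):
    """Return connection ids to kill. Each row is (id, user, seconds, command)."""
    return [r[0] for r in rows
            if r[1] == user and r[3] == "Query" and r[2] > max_seconds]

def fetch_processlist():
    """Read the live process list via the mysql CLI (needs PROCESS privilege)."""
    out = subprocess.check_output(
        ["mysql", "-N", "-B", "-e",
         "SELECT id, user, time, command FROM information_schema.processlist"],
        text=True)
    rows = []
    for line in out.splitlines():
        pid, user, secs, command = line.split("\t")
        rows.append((int(pid), user, int(secs), command))
    return rows

def main(poll_interval=5):
    """Poll forever, killing any of the remote user's overlong queries."""
    while True:
        for pid in ids_to_kill(fetch_processlist()):
            subprocess.run(["mysql", "-e", "KILL %d" % pid], check=False)
        time.sleep(poll_interval)

# main()  # uncomment to run against a live server
```

As the replies point out, this kills legitimate long queries too, so it only fits if remote queries really are expendable.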
roxannap
"Splitting the database into two is not an option"? And why split it in two at all? As already said, set up a separate slave. Or is building a master-slave setup not part of your plans? Done this way you gain not only speed but also reliability. Yes, you will need to design it and solve all the synchronization problems, but the result may exceed your expectations.
chase carter
use Sphinx
Replies:
what does Sphinx have to do with this? - jennifer lynn
slow queries will run faster. - jim harden
rodgine
On the hardware side, fast SCSI/SAS disks can help; on the software side, tuning the sort and random-read buffers (sort_buffer_size, read_rnd_buffer_size, etc.).
barbara mccord
Speaking as an enterprise engineer: it seems generally wild to me to give users who are not developers or support engineers, and who don't know SQL or databases well, direct SQL-level access to a large, loaded database holding critical data. (By the way, is that what "access via TCP" means? Did you mean the ability to run queries by hand from MySQL clients such as SQLyog?)

Do you really need your users to be able to run their own crookedly written queries? Can't they get by with a set of canned report queries, or something like that? Can't they submit the queries they want to run to a competent person who reviews them and executes them? And how do you prevent accidental deletions and the like? Do the users at least have read-only privileges on all database objects?

If it really is that critical to give users such access... then, as others have said here, set up a slave server with replication, or configure per-user resource quotas (CPU usage, I/O bandwidth, memory) for the accounts your users connect as. I'm not strong in MySQL myself, so I don't know what it offers in the way of resource quotas.
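MySQL does not offer CPU/IO/memory quotas per user, but it does support coarser per-account limits via GRANT ... WITH, plus read-only access as raised above. A sketch, assuming a hypothetical remote account `remote_app` and database `critical_db`:

```sql
-- Hypothetical account and database names. Read-only access plus
-- per-account caps on query volume and concurrent connections.
GRANT SELECT ON critical_db.* TO 'remote_app'@'%'
    IDENTIFIED BY 'secret'
    WITH MAX_QUERIES_PER_HOUR 1000
         MAX_USER_CONNECTIONS 5;
```

These caps limit how many queries the remote users can run, not how expensive any single query is, so they complement rather than replace the slave-server approach.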