Usually no limit is set on requests for static files (JS scripts, images).
Limits apply to everything else — whatever nginx does not treat as static content.
If the images are generated by a PHP script, e.g. a GET request like "get_image.php?url=vasya.png", a flood of such requests will run into the limit. Static files served straight from disk, like "images/vasya.png", do not count against it.
But maybe your host does things differently.
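For illustration, such a setup often looks roughly like this in nginx — a minimal sketch, not your host's actual config; the zone name, paths, extensions and the 75 r/s figure discussed below are assumptions:

```nginx
# Sketch of a typical shared-hosting vhost: only dynamic (PHP) requests
# are rate-limited; static files are served from disk with no limit.
http {
    # one token bucket per client IP (rate is an assumed example)
    limit_req_zone $binary_remote_addr zone=perip:10m rate=75r/s;

    server {
        server_name example.com;
        root /var/www/example.com;

        # "images/vasya.png" — straight from disk, not limited
        location ~* \.(png|jpe?g|gif|css|js)$ {
            expires 30d;
        }

        # "get_image.php?url=vasya.png" — hits PHP, counted against the limit
        location ~ \.php$ {
            limit_req zone=perip burst=50 nodelay;
            fastcgi_pass unix:/run/php-fpm.sock;
            include fastcgi.conf;
        }
    }
}
```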
And as already mentioned, the restriction should be applied to the IP that exceeds the limit, without affecting the site's other visitors; a global limit, on the other hand, can help dampen the load on the server during a DDoS, but at the same time it makes the site temporarily unavailable.
Hosting companies usually look for a middle ground: something that suits most sites and at the same time helps reduce the load during attacks.
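The difference between the two approaches is literally just the key of the limit zone — a sketch, with assumed zone names and rates:

```nginx
# Per-IP bucket: only the offending IP is throttled, other visitors
# of the same site are unaffected.
limit_req_zone $binary_remote_addr zone=perip:10m rate=75r/s;

# One shared bucket for the whole vhost: dampens a DDoS, but once the
# bucket is exhausted the site is temporarily unavailable for everyone.
limit_req_zone $server_name zone=persite:1m rate=75r/s;
```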
Got it. Then it's not those requests. Well, for one thing it's not a single site, but never mind that. Besides, since nginx caches, I think it doesn't go to the backend for every request. So, again, a page with 100 photos may produce as little as one request, or as many as 101 at the nginx frontend (not sure?). How exactly it's counted in your case I honestly don't quite understand, but as far as that limit of 75 goes, in my case, with a CDN and WP caching, it again seems to count as a single request.
For reference: the limit is 75.
The ISPManager panel has a thing like WWW → Logs → site.ru.access.log.
Go there and see how many requests a single page view actually produces.
With proper settings, caching and a CDN, you can get it down to one request. Without them, I repeat, a page with a hundred images will produce 101 requests. And keeping a hundred thumbnails on a page is perfectly normal.
So it turns out that 75 may well be enough after all.
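If the frontend nginx really does cache (whether the host enables this is anyone's guess), it is usually something like a microcache — a sketch with assumed paths, zone names and timings:

```nginx
# Hypothetical frontend microcache: repeat views of the same page are
# answered from the cache, so the backend sees one request per page
# per cache period instead of one per visitor.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=pages:10m
                   max_size=256m inactive=60m;

server {
    location ~ \.php$ {
        fastcgi_cache       pages;
        fastcgi_cache_key   "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 5m;
        fastcgi_pass        unix:/run/php-fpm.sock;
        include             fastcgi.conf;
    }
}
```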
well, that's exactly what I wrote right away.
it was just an example I had at hand: three hundred ordinary small sites average 10–20 requests/sec in total, i.e. each one gets only a tiny fraction of that, even at busy moments.
just as an answer to the question "is 75 a lot or a little" :)
but at the same time it can happen (on another server, say) that the average is around 100 req/sec, yet at the moment of a Habr effect ten times more may arrive.
think about whether that is a lot or a little...
i just wanted to say that there is no unambiguous answer, but in any case all sorts of limits (often even hidden ones) are bad. they may well be enough most of the time, but at certain moments you can easily slam into the ceiling. is that what you want?
and besides, there is the desire to reduce the load (by offloading to a cdn, for example), and there is the desire to speed your site up.
it is always better when all of a site's content lives on one domain, rather than the page on one, a pile of css/js pulled from assorted other services, plus pictures from some free cdn... :) that way it will most likely only be slower — though yes, there is less chance of hitting the limits your host has scattered around.
and don't forget that this is about a limit on requests per second.
are you sure your page with a hundred pictures will manage to load for the average visitor (many come over mobile internet) within 1 second?
so judging "enough / not enough" by the number of pictures on a page is not entirely correct. a lot depends on the "weight" of the site, on the bandwidth between the visitor and the server, and on how sluggish the client is (not the fastest little phone, or a computer that simply won't manage to download and render a page with 100 photos within a second even if the server and the channel allow it).
but the best thing, of course, is to focus not on the limits but on the clients — it is they who may, or may not, ever push you into the request limit.
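by the way, the per-second edge is often softened with a burst queue — a sketch, with assumed numbers and the zone from the earlier example:

```nginx
# A one-off spike (page + 100 thumbnails = 101 requests) is queued and
# drained at the configured rate instead of being rejected outright.
location / {
    limit_req zone=perip burst=100;
    limit_req_status 429;   # returned only if even the queue overflows
}
```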
and about cdn — I think you've read up on it, but here's the link again:
Thanks. Only the page with hundreds of pictures is not mine — I meant your client's store.
Thanks for the link, I'll read it.
Jackyk, besides the number of requests, hosting usually has a dozen other restrictions you are much more likely to run into — the allocated memory, say, or CPU time... well, problems should be solved as they arise, and keep in mind that you can always change hosts) these days I get disappointed in hosts I believed in and used for years, and suddenly delighted by new ones taken purely for testing) over the past year I've had to drop 3 hosts because of load overruns and incomprehensible rules.
I wouldn't say so. I have two important sites on WP with caching and a CDN enabled, and even heavy work in the admin panel consumed only a twentieth of the limit. When traffic spikes, I'm hoping that thanks to the CDN and WP Super Cache mostly static content will be served, so almost no CPU time will be spent. So those limits don't really scare me.
Well, on the one hand that's usually how it is, but not in my case. Because if a link to the site gets posted in the media or in some popular community, I won't have time to sort out hosting problems; what I need from the host is simply that there be no problems at all, none whatsoever. And no silly situation where the site works fine while nobody is reading it and falls over the moment people start to read.
But for now there's no particular cause for alarm in what I'm analyzing: even a few dozen users per second is huge traffic, and that it should withstand (I hope).
Well, that's true. Although, to be honest, I still have some worries when migrating a CMS — a bundle of files and a database created on one operating system — to another host with a different operating system, possibly a different database version, and different settings. Something might break... Maybe it's pure superstition.
But that's probably overcautious of me, since I've used all sorts of them, both here and abroad on different continents...
Well, now I seem to have found a very good one (a Russian-language host). Yes, with this limit, but it doesn't seem to come up much (and you can move up to a more expensive plan where the limit is 4 times higher — though so is the price)... And I can't get over how well the sites run without falling over at all.
Well, everything seems to fly. A CDN is generally taken for speed in the first place, since its distributed network of locations serves content from the point closest to the user.
That's not much. Let me guess: the next, more expensive plan has a limit of 150 http requests, right? Many hosts are guilty of imposing such restrictions out of thin air just to make the user move to a pricier plan.
superstitions are a real thing :)
literally this week a couple of new clients came in with very old, neglected sites.
on wp 3–4 and joomla 2.5
until now, php 5.6 was the newest version available to them.
and yet everything was successfully migrated to the latest wp 5.4.2 and joomla 3.9.19, with a move to https.
at most, a couple of completely abandoned plugins had to be dropped.
now everything runs fine under php 7.4.7 and MariaDB 10.5.3
nothing is impossible. you just need to understand what depends on what and what causes what.
which OS it is (unless it's windows, of course) ultimately doesn't matter at all.
a site usually needs only a web server, something to run the scripts (php) and a database (mysql).
it's just that php itself (and most of its modules) is linked against a lot of "pieces" of other software that normally ship with the OS — openssl, for example.
php 5.6 runs with openssl 1.0 at most, while php 7+ can work with openssl 1.1.
but it is no problem at all to install some old php on a newer OS, adding the different software versions it needs.
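in practice each php version simply runs as its own php-fpm pool, and nginx points each site at the right socket — a sketch, socket paths assumed:

```nginx
# Two sites on one box, each pinned to its own PHP version.
server {
    server_name old-site.example;
    location ~ \.php$ {
        fastcgi_pass unix:/run/php5.6-fpm.sock;  # legacy pool
        include fastcgi.conf;
    }
}
server {
    server_name new-site.example;
    location ~ \.php$ {
        fastcgi_pass unix:/run/php7.4-fpm.sock;  # current pool
        include fastcgi.conf;
    }
}
```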
newer versions of mysql have minor quirks that cause problems with databases created under older versions. but all of that is solvable too.
and anyway, that's not your concern :) we are talking about shared hosting, where everything is set up for the customers and watched over like a small child.
plus, who migrates without checking? first you copy over the files and databases, fix the absolute paths everywhere to the new ones and so on, check everything, and only then switch the domain itself.
and as for not falling over — there is monitoring of every kind. point it at the site and it will watch 24/7 that the site is up and showing what it should.
but then again, nobody guarantees that a host with 3 years of 100% uptime won't keel over tomorrow, right when all your visitors arrive 🤣
Release-candidate software for clients, in production?
Monsieur knows a lot about perversions.
Well, almost — it's 300.
Well, it's not exactly out of thin air. Limits for one customer to some extent mean quality for another (for your hosting neighbour — and for us ourselves, so that we don't get in the neighbour's way). The model itself — use more resources, pay more — seems perfectly fine.
What matters is that both the hosting company and the customer treat the issue responsibly. There is a limit, it is not hidden from the user, the plan's modest price buys quite a lot (imagine: by this arithmetic that could be a million-plus visitors a day on a plan under 200 rubles!), and, well, sites work fine right up to those limits. In principle that is a normal, honest approach. Those who promise mountains of gold along the lines of "do whatever you want" may not always deliver. Those who promise a specific price tag and specific numbers, and responsibly deliver, deserve respect.