
Question: limiting the number of open simultaneous / concurrent connections to a file in a real-world DDoS scenario #9

Open
C0nw0nk opened this issue Sep 17, 2016 · 3 comments


C0nw0nk commented Sep 17, 2016

So in a DDoS scenario the config could be the following:

limit_req_zone $binary_remote_addr zone=one:10m rate=30r/m;
limit_conn_zone $binary_remote_addr zone=addr:10m;

location ~ \.mp4$ {
    limit_conn addr 1;          # limit open connections from the same IP
    limit_req zone=one burst=5; # limit max number of requests from the same IP

    mp4;
    limit_rate_after 1m;
    limit_rate 1m;

    expires max;
    valid_referers none blocked networkflare.com *.networkflare.com;
    if ($invalid_referer) {
        return 403;
    }
}

My above config works against a single IP, or a few IPs, spamming / flooding the mp4 files on the server, but it would not hold up against a real, larger DDoS.

The real-world scenario: an attacker could have over 1000 machines to hit you with.

The way they could easily bypass the above: all 1000 machines connect and download a file at the same time, and they do not spam / flood the requests, because if they did they would be blocked and served the 503 status code. Instead, each machine simply keeps its connection open, downloading the file it requested. At 1 MB/s multiplied by 1000 machines, your server's 1 Gig port is maxed out (successful attack). When a machine finishes downloading the file, it instantly opens it again and repeats, constantly consuming the bandwidth. Because each machine has its own IP, none of them share an IP, and they are not spamming or flooding, so the limit blocks never trigger and they easily bypass the firewall.
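To put rough numbers on it (assuming limit_rate 1m means 1 megabyte per second per connection, as in stock nginx):

1 MB/s per connection x 1000 connections = 1000 MB/s ≈ 8000 Mbit/s
1 Gbit/s port ≈ 125 MB/s, so roughly 125 such downloads already saturate it

So the attacker only needs a small fraction of those 1000 machines to fill the pipe.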

A possible way to fix the above is to use limit_conn keyed on $uri or $request_uri, limiting the number of simultaneous connections downloading a single file at once.

limit_conn_zone $request_uri zone=peruri:10m; # limit max number of open connections to a single file

limit_req_zone $binary_remote_addr zone=one:10m rate=30r/m;
limit_conn_zone $binary_remote_addr zone=addr:10m;

location ~ \.mp4$ {
    limit_conn peruri 1;        # limit max number of connections downloading a single file at any time
    limit_conn addr 1;          # limit open connections from the same IP
    limit_req zone=one burst=5; # limit max number of requests from the same IP

    mp4;
    limit_rate_after 1m;
    limit_rate 1m;

    expires max;
    valid_referers none blocked networkflare.com *.networkflare.com;
    if ($invalid_referer) {
        return 403;
    }
}

The code added above:

limit_conn_zone $request_uri zone=peruri:10m; # limit max number of open connections to a single file
limit_conn peruri 1;                          # limit max number of connections downloading a single file at any time

Would make it so only one person may access that particular file URL at a time. (Still not good enough.) They could adapt: instead of spamming / flooding or slow-lorising a single file on a mass scale, they bypass the above solution by opening different mp4 URLs simultaneously. If each of the 1000 IPs connects and opens a different file, we are back to square one.
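As an aside, stock nginx can at least bound the total number of simultaneous downloads by keying limit_conn on a constant value like $server_name (a sketch using the documented idiom; the cap of 100 is a made-up figure, and this limits connection count, not bandwidth):

limit_conn_zone $server_name zone=perserver:10m; # one shared counter for the whole virtual server

location ~ \.mp4$ {
    limit_conn perserver 100; # at most 100 simultaneous mp4 downloads in total (example figure)
}

Combined with limit_rate 1m that caps mp4 traffic at roughly 100 MB/s, but an attacker holding all 100 slots would also lock out legitimate users, which is why a bandwidth limit is the more interesting fix.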

How your module could save the day! 👍

http {
    limit_traffic_rate_zone rate $request_uri 32m; # keyed on the requested URL

    server {
        location /download/ { # the folder where all mp4 files sit
            limit_traffic_rate rate 500m; # limit total traffic to the download folder, across any number of connections / requests, to half of the server's 1 Gig port capacity
        }
    }
}

So with the above, is my understanding correct that your module could fix all of the issues listed? That is, no matter how many downloads of however many different mp4 file URLs are taking place simultaneously, the module will not let the total bandwidth output exceed half of the server's 1 Gig port capacity.

Or is the above config wrong for your module, and would it still only limit per file ($request_uri)?

I didn't write the above to poke holes or point out flaws; I do it to help make things better and to solve / address potential problems (if any). Sorry for the lengthy post. I think it's a great module with a lot of potential, and I strive to maximize that potential 👍

bigplum (Owner) commented Sep 19, 2016

In your config, the 500m limit applies per $request_uri, not to the whole server.
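A key that is the same for every request, for example $host, should make the budget server-wide (a sketch, untested here):

limit_traffic_rate_zone rate $host 32m; # one shared bucket for the whole host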


C0nw0nk commented Sep 19, 2016

Thanks, yes, oops. So to limit it for the entire server, replacing the key with $host should do the trick?


C0nw0nk commented Sep 19, 2016

limit_traffic_rate_zone rate $host 32m;

That does work, but it appears to ignore my original limit_rate setting:

limit_rate 1m;

So now the max output per download has become 2-5 MB/s (not a bad thing, since once the server hits max capacity that limit will start decreasing).

I also found that limit_traffic_rate cannot be used more than once per location, or you get a duplicate error. I ran into this when I tried to keep enforcing my 1 MB/s per-IP limit through your module:

http {
    limit_traffic_rate_zone rate $host 32m;                     # for the entire server
    limit_traffic_rate_zone periprate $binary_remote_addr 32m;  # per IP

    server {
        location /download/ { # the folder where all mp4 files sit
            limit_traffic_rate rate 500m;    # limit total traffic to the download folder to half of the server's 1 Gig port capacity
            limit_traffic_rate periprate 1m; # limit each IP to a max of 1 MB/s
        }
    }
}

The duplicate error appears as soon as the directive is used a second time in the same location block.
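One workaround I might try (an untested guess; this assumes the module, like the stock limit_* directives, also accepts limit_traffic_rate at the server {} level) is to split the two zones across contexts:

http {
    limit_traffic_rate_zone rate $host 32m;
    limit_traffic_rate_zone periprate $binary_remote_addr 32m;

    server {
        limit_traffic_rate rate 500m; # server-wide budget moved up one level (assumption: directive allowed here)

        location /download/ {
            limit_traffic_rate periprate 1m; # per-IP cap inside the location
        }
    }
}

Though if normal nginx inheritance rules apply, the location-level directive would replace the server-level one rather than combine with it, so this may fail the same way.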
