In the ‘scary free internet’ we can face DoS attacks, or simply a buggy client of your API sending thousands of requests per minute. This can badly hurt your application's performance.
To avoid these effects we can implement this example policy in our web app: no IP may make more than 10 requests per minute; beyond that limit we reject the requests.
gem ratelimit
This gem seems oriented toward counting arbitrary events in the app:
- You manually create a Redis-backed counter when Rails starts:
  @ip_ratelimit = Ratelimit.new("request_ip_limit_db")
- You can check the counter in order to send a Forbidden response (the gem's `exceeded?` method returns true once the subject has gone over the threshold within the interval):
  if @ip_ratelimit.exceeded?(request.remote_ip, threshold: 10, interval: 60)
    render json: 'IP request limit exceeded.', status: 403
  end
- Wherever you want in the controllers you can increment the counter:
  @ip_ratelimit.add(request.remote_ip)
Features:
- Based on a local Redis database.
- You can count any subject (not only IPs).
- Implementation:
  - Create an initializer to set up the Redis-backed counter.
  - Add a before_action filter in ApplicationController to increment the IP counter.
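The add/check pattern above can be sketched without Redis as a minimal in-memory counter. This is purely illustrative: the `IpRatelimit` class below is a hypothetical stand-in mimicking the gem's `add`/`exceeded?` interface, while the real gem keeps counts in Redis so they are shared across processes.

```ruby
# Illustrative in-memory sketch of the ratelimit pattern (hypothetical
# class; the real gem stores counts in Redis).
class IpRatelimit
  def initialize
    @hits = Hash.new { |h, k| h[k] = [] }  # subject => array of timestamps
  end

  # Record one request for a subject (e.g. an IP address).
  def add(subject, now: Time.now)
    @hits[subject] << now
  end

  # True when the subject made more than `threshold` requests
  # in the last `interval` seconds.
  def exceeded?(subject, threshold:, interval:, now: Time.now)
    @hits[subject].count { |t| now - t <= interval } > threshold
  end
end
```

For example, after an 11th request within the same 60-second window, `exceeded?("1.2.3.4", threshold: 10, interval: 60)` starts returning true.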
gem rack-throttle
This seems to be the easiest, cleanest and fastest Ruby-based solution for our example.
- Configured in config/application.rb:
  require 'rack/throttle'
  require 'memcached'

  class Application < Rails::Application
    config.middleware.use Rack::Throttle::Minute,
                          max: 10,
                          cache: Memcached.new,
                          key_prefix: :throttle
  end
Features:
- Memcached-based (which makes it fast). You can also use Redis.
- Only counts accesses per remote host (identified by default via Rack::Request#ip).
- Strategies can be customized.
- Cleaner code, because it lives in the Rack layer rather than in controller filters, which would spread rate-limiting logic all over our app.
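Since the Rack interface is just `call(env)` returning `[status, headers, body]`, the middleware approach can be sketched without the gem. The `MinuteThrottle` class below is a hypothetical, simplified fixed-window limiter, not the gem's actual implementation (Rack::Throttle::Minute stores its counters in a cache such as Memcached); the injectable `clock` is an assumption added to make the sketch deterministic.

```ruby
# Minimal sketch of a per-minute throttling middleware using the plain
# Rack interface (hypothetical; the real Rack::Throttle::Minute differs).
class MinuteThrottle
  def initialize(app, max: 10, clock: -> { Time.now })
    @app = app
    @max = max
    @clock = clock
    @counts = Hash.new(0)  # "ip:minute-window" => request count
  end

  def call(env)
    window = @clock.call.to_i / 60  # current one-minute window
    key = "#{env['REMOTE_ADDR']}:#{window}"
    @counts[key] += 1
    if @counts[key] > @max
      [403, { 'content-type' => 'text/plain' }, ['Rate limit exceeded']]
    else
      @app.call(env)
    end
  end
end
```

Wrapping a trivial app with `max: 2` and calling it three times from the same address yields statuses 200, 200, then 403.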
Nginx ngx_http_limit_req_module
This is a full implementation of the leaky bucket algorithm:
limit_req_zone $binary_remote_addr zone=one:10m rate=10r/m;

server {
    location / {
        limit_req zone=one burst=5;
    }
}
How it works:
- Reserves 10 MB of shared memory for the zone.
- Limits each IP to 10 requests per minute.
- When the rate is exceeded, up to 5 extra requests (the burst) are queued and delayed so that they conform to the defined rate.
- Beyond rate + burst, the request is rejected with error 503 (Service Temporarily Unavailable).
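The accounting behind those four points can be sketched in Ruby. This is a simplified, illustrative leaky bucket, not nginx's actual code (nginx tracks excess in its own internal units and handles delays asynchronously): the bucket drains at the configured rate, requests that push it above the waterline but within the burst are marked delayed, and anything beyond rate + burst is rejected.

```ruby
# Illustrative sketch of leaky-bucket accounting similar to what
# ngx_http_limit_req_module performs (simplified; not nginx's real code).
class LeakyBucket
  def initialize(rate_per_minute:, burst:)
    @interval = 60.0 / rate_per_minute  # seconds between "leaks"
    @burst = burst
    @excess = 0.0                       # requests currently queued
    @last = nil                         # time of the previous request
  end

  # Returns :ok, :delayed, or :rejected for a request at time `now` (seconds).
  def request(now)
    elapsed = @last ? now - @last : Float::INFINITY
    @last = now
    # Drain the bucket at the configured rate, then add this request.
    @excess = [@excess - elapsed / @interval, 0.0].max + 1.0
    if @excess > @burst + 1
      @excess -= 1.0  # a rejected request does not occupy the bucket
      :rejected
    elsif @excess > 1.0
      :delayed
    else
      :ok
    end
  end
end
```

With rate=10r/m and burst=5, seven simultaneous requests come out as one served immediately, five delayed, and the seventh rejected with what nginx would report as a 503.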
Other solutions
- Rack-level solutions
- Some implement blacklists and whitelists