Add response handler #37
base: v2.x.x
Conversation
What? Supports accepting a response handler function in Patova; this allows us to implement things like training mode on top of Patova's features. Why? Sending the rate limit headers is not the only possible reaction on error; we might even need to send different headers based on the limit (for example global vs. local). There are other possible actions, for example not failing the request but limiting the resources put into it, or moving it to a different work-stream. One use case is implementing "training mode", where we want to send the rate limit header but avoid completely failing the request; that would let consumers prepare.
Minor and documentation improvements 🤓
@@ -48,6 +48,15 @@ The object has the following schema (validated [here](./lib/index.js) using [Joi
* `error: Error` - The error that occurred.
* `reply: Reply` - The hapi.js [reply interface](http://hapijs.com/api#reply-interface).
> If an error occurs and no function is provided, the request lifecycle continues normally as if there was no token bucket restriction. This is a useful default behavior in case the limitd server goes down.
* `limitResponseHandler: (limitResult, req, reply) => ()` A function that receives the result of limit checking against limitd, the `request` and the reply interface and is responsible for sending the handling
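As a hedged illustration of the "training mode" use case this option is meant to enable (the handler name and the shape of `limitResult` are assumptions for the sketch, not Patova's confirmed API), a custom handler might look like:

```javascript
// Hypothetical sketch of a custom limitResponseHandler implementing
// "training mode": record that the request would have been limited,
// but let it proceed instead of failing with 429.
function trainingModeHandler(limitResult, req, reply) {
  if (!limitResult.conformant) {
    // Expose the limit state so consumers can prepare, without blocking.
    req.headers['x-ratelimit-would-block'] = 'true';
  }
  return reply.continue();
}

// Stub usage (in a real server, Patova would pass the hapi reply interface):
const req = { headers: {} };
const reply = { continue: () => 'continued' };
const outcome = trainingModeHandler(
  { conformant: false, limit: 10, remaining: 0 }, req, reply
);
console.log(outcome, req.headers['x-ratelimit-would-block']); // → continued true
```

The stub at the end only demonstrates the calling convention; the plugin itself would invoke the handler during the request lifecycle.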
I would rename this option to `takeResultHandler`.

Wouldn't it be more generic to handle conformant and non-conformant results? The same way we could `reply.continue()` on non-conformant, it might be useful to reject even for conformant? If we stay with only non-conformant, then I would rename the option to `nonConformantResultHandler` instead.

I would rename `limitResult` to `takeResult`, since it's the result object returned from limitd's `take()` method.

Maybe we should describe succinctly the default behavior when the option is omitted.
> and is responsible for sending the handling how to respond to the customer

This sentence needs better wording. Maybe: "and is responsible for handling a non-conformant request."
`takeResultHandler` seems really tied to limitd, and in theory there might be N takes and we will only call the handler one time. Regarding renaming to `takeResult`, my main concern is that as a final user you should not have to know how limitd works in order to use patova, so you should not care that limitd has a method called `take`, IMO.

I will make it more generic to handle the non-limit case too, although in general I resist making changes that I don't feel have a clear use case, and I fail to see a clear one for the "conformant" case. Maybe not including the headers, or changing some of them, I guess? In any case I see some side use cases like the one I pointed out previously, so 👍
@@ -55,6 +56,16 @@ function setupRateLimitEventExt(server, options) {
  const extractKey = options.extractKey;
  const onError = options.onError;

  const limitResponseHandler = options.limitResponseHandler || ((result, req, reply) => {
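The default handler's body is truncated in the diff above. As a sketch only (assuming a `conformant` flag on the take result and simplified error wiring; this is not the PR's actual code), a default along these lines would match the 429 behavior discussed elsewhere in this review:

```javascript
// Sketch of a plausible default handler: continue the lifecycle on a
// conformant result, otherwise reply with a 429 error. The statusCode
// wiring is a simplification of real hapi/Boom error semantics.
const defaultHandler = (result, req, reply) => {
  if (result.conformant) {
    return reply.continue();
  }
  const error = new Error('Too Many Requests');
  error.statusCode = 429;
  return reply(error);
};

// Stub usage: reply is a callable with a continue() method, as in hapi.
const sent = [];
const reply = Object.assign((err) => sent.push(err.statusCode),
                            { continue: () => sent.push('continue') });
defaultHandler({ conformant: false }, {}, reply);
defaultHandler({ conformant: true }, {}, reply);
console.log(sent); // → [ 429, 'continue' ]
```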
Minor: we could move the default handler to the root of the file, out of the `setupRateLimitEventExt` function (it is not closing over any other variable).
In this case it is mostly the same, because this is a builder that is not used at a high rate; I'd prefer to keep the function as close as possible to its usage unless there are perf reasons not to do it.
it('should call limitResponseHandler to handle the answer', function(done){
  const request = { method: 'POST', url: '/users', payload: { } };
  server.inject(request, res => {
    expect(res.statusCode).to.equal(200);
Would it make sense to spy on `limitResponseHandler` and check that it was actually invoked, to eliminate the possibility that the `take` returned a conformant response? Otherwise, I think that configuring only the `options.emptyType` bucket would make it more obvious that `take` returns a non-conformant result.

I'd also add a test with a custom handler that returns a non-429 result (to witness the handler overriding the default error code). That would also self-document one use case, showing how to use the new handler.
I don't think the kind of tests patova has lets us spy in an easy and non-hacky way. Also, we have another test above for the same condition that checks that there is a 429 response in this case.

I could maybe check the response handler for the result if we want to be 100% sure in this particular test case.
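One low-tech way to get the "was it invoked" guarantee without a spy library (a sketch with hypothetical names, not code from this PR) is to wrap the handler in a closure that records its calls:

```javascript
// Manual spy: wrap a handler so the test can assert it was invoked
// and inspect the result it received. No sinon dependency needed.
function makeSpiedHandler(handler) {
  const spied = (result, req, reply) => {
    spied.called = true;
    spied.lastResult = result;
    return handler(result, req, reply);
  };
  spied.called = false;
  return spied;
}

// Stub usage (in a real test this would be passed as
// options.limitResponseHandler before calling server.inject):
const spied = makeSpiedHandler((result, req, reply) => reply());
let replied = false;
spied({ conformant: false }, {}, () => { replied = true; });
console.log(spied.called, replied); // → true true
```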