
INT-4497: Add rate limiting advice for handler #2544


Closed
wants to merge 1 commit into from

Conversation

meherzad
Contributor

JIRA: https://jira.spring.io/browse/INT-4497

Add Rate Limiting advice for a handler which supports Rate, MaxDelay

Member

@artembilan artembilan left a comment

Here is some review.

I may come back with something else after sleeping on this a bit and after investigating what we really would like to do here...

Thank you for a great contribution anyway!

private final long maxDelayInMillis;
private final long oneTimeUnitInMillis;
private final AtomicLongArray queue;
private final int rate;
Member

This style is not readable.
It would be better to surround each class member with blank lines.
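
For example (the same fields as in the snippet above, just separated by blank lines):

private final long maxDelayInMillis;

private final long oneTimeUnitInMillis;

private final AtomicLongArray queue;

private final int rate;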

Thanks


/**
* Create an instance of RequestHandlerRateLimiterAdvice with the provided arguments.
*
* @param maxDelay the max allowable delay in time units. Use -1 for no max delay.
* @param timeUnit the time unit.
*/
public RequestHandlerRateLimiterAdvice(int rate, int maxDelay, TimeUnit timeUnit) {
Member

Can't we use a Duration instead?
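
Something like this, just as a sketch (the class name here is made up and only the changed constructor is shown; a Duration already carries the time unit, so no TimeUnit parameter is needed):

import java.time.Duration;

public class RequestHandlerRateLimiterAdviceSketch {

    private final long maxDelayInMillis;

    private final int rate;

    public RequestHandlerRateLimiterAdviceSketch(int rate, Duration maxDelay) {
        this.rate = rate;
        this.maxDelayInMillis = maxDelay.toMillis(); // the unit is part of the Duration
    }
}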

this.currentIndex = new AtomicInteger(0);
this.maxDelayInMillis = timeUnit.toMillis(maxDelay);
this.oneTimeUnitInMillis = timeUnit.toMillis(1);
this.queue = new AtomicLongArray(rate);
Member

queue is not a valuable name for the variable, especially when it is an object property.

Please come up with a more self-explanatory name.

Right now it isn't clear what its purpose is.

Assert.notNull(timeUnit, "Timeunit should be non null");
this.currentIndex = new AtomicInteger(0);
this.maxDelayInMillis = timeUnit.toMillis(maxDelay);
this.oneTimeUnitInMillis = timeUnit.toMillis(1);
Member

What is this magic number of one time unit in milliseconds?
Why do we need it?

Well, even if it is important, why can't we initialize it at its definition?
What is the point of bloating the ctor body?

The same goes for the currentIndex initialization here.
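
For instance (just a sketch), currentIndex does not depend on any constructor argument, so it could be initialized right at its declaration:

private final AtomicInteger currentIndex = new AtomicInteger();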

/**
* An exception thrown when the max delay is triggered for rate limiter.
*/
public static final class RateLimiterMaxDelayException extends MessagingException {
Member

Well, actually exceptions are named after actions: ResourceNotFound, TransactionTimeout etc.

I'm not sure about the algorithm here yet, so I can't come up with a better name...

MaxDelayExceededException?

The RateLimiter prefix is already implied by the wrapping class. 🤷‍♂️


private void delayIfNecessary(Object target, Message<?> message) throws InterruptedException {
long timeToWait = -1;
while (timeToWait == -1) {
Member

I believe there should be a way to fail fast for a single call or, as we already know, per rate.
I mean, if we are still within the rate we don't need to sleep or even calculate System.currentTimeMillis() at all.

I need to understand the algorithm better, because it looks like I have more questions than a review requires 😢

@@ -49,11 +49,12 @@ For chains that produce a reply, every child element can be advised.
[[advice-classes]]
==== Provided Advice Classes

-In addition to providing the general mechanism to apply AOP advice classes, Spring Integration provides three standard advice classes:
+In addition to providing the general mechanism to apply AOP advice classes, Spring Integration provides four standard advice classes:
Member

Well, let's fix it to something like "these standard advice classes" to cover all the possible future additions 😄

The Rate Limiter advice allows you to ensure that an endpoint does not get overloaded with requests.
When the RateLimit is breached the request will go into a blocked state.

A typical use case for this advice might be an external service provider not allowing more than n requests per minute.
Member

I believe n must be wrapped in code formatting to emphasize it a bit.

====
[source,xml]
----
<int:service-activator input-channel="input" ref="failer" method="service">
Member

Can we have a Java sample as well, please?
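
For reference, a Java DSL equivalent could look something like the sketch below (the flow, channel and limit values are illustrative, the advice's package is assumed, and the lambda stands in for the failer bean from the XML sample):

import java.util.concurrent.TimeUnit;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.handler.advice.RequestHandlerRateLimiterAdvice; // package assumed

@Configuration
@EnableIntegration
public class RateLimiterSampleConfiguration {

    @Bean
    public RequestHandlerRateLimiterAdvice rateLimiterAdvice() {
        // no more than 10 requests per minute; fail after waiting up to 2 minutes
        return new RequestHandlerRateLimiterAdvice(10, 2, TimeUnit.MINUTES);
    }

    @Bean
    public IntegrationFlow rateLimitedFlow() {
        return IntegrationFlows.from("input")
                .handle((payload, headers) -> payload, // stand-in for the 'failer' service
                        e -> e.advice(rateLimiterAdvice()))
                .get();
    }
}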

@artembilan
Member

I suggest revising our own algorithm here in favor of the well-known and stable solution in Resilience4J: http://resilience4j.github.io/resilience4j/#_ratelimiter

In the future we may also reconsider our Circuit Breaker implementation to be based on their solution as well: http://resilience4j.github.io/resilience4j/#_circuitbreaker
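
Roughly, their RateLimiter API looks like this (only a sketch; the limits and names are illustrative):

import java.time.Duration;

import io.github.resilience4j.ratelimiter.RateLimiter;
import io.github.resilience4j.ratelimiter.RateLimiterConfig;

public class Resilience4jRateLimiterSketch {

    public static void main(String[] args) {
        RateLimiterConfig config = RateLimiterConfig.custom()
                .limitForPeriod(10)                        // 10 permissions ...
                .limitRefreshPeriod(Duration.ofMinutes(1)) // ... per minute
                .timeoutDuration(Duration.ofSeconds(5))    // how long a caller may wait for a permission
                .build();

        RateLimiter rateLimiter = RateLimiter.of("my-endpoint", config);

        // An advice could simply delegate the handler call to a decorated Runnable/Supplier.
        Runnable restricted = RateLimiter.decorateRunnable(rateLimiter, () -> System.out.println("handled"));
        restricted.run();
    }
}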

Any thoughts?

Thanks

@garyrussell
Contributor

Sure; makes sense.

@artembilan
Member

Superseded by #2781.

@meherzad, thank you for your effort.
You are welcome to review that PR, and any other contribution is always welcome!

@artembilan artembilan closed this Mar 4, 2019