Make load() return cached promises per queue cycle #97

@novemberborn

Description

I'm using DataLoader in combination with GraphQL. The relevant parts of my graph are:

type Project {
  creator: User!
  memberships: [Membership!]!
}

type User {
  id: ID!
  name: String!
}

type Membership {
  user: User!
  state: String!
}

The resolution behaves like this:

  1. Get project
  2. Get creator
  3. Get memberships (these already include a userId)
  4. For each member, get its user

User resolution uses a shared DataLoader instance. The creator is also a member. The relative timings are such that by the time I retrieve the user for each member, that instance has already loaded the creator.

As it turns out, I cannot resolve the membership state without also resolving the membership user. Naively, I assumed the following would lead to a single batched call to my backend:

async state (membership, variables, context) {
  const someValue = await context.userLoader.load(membership.userId)
  return context.stateLoader.load(someValue)
}

However, if the creator has already been loaded by the time I call await context.userLoader.load(membership.userId), an already-resolved promise is returned. This means that for the creator, context.stateLoader.load(someValue) is called much earlier than it is for the other members (since those have to wait on a new context.userLoader request). As a result, I end up with two requests to load the state.
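To make the failure mode concrete, here is a self-contained reproduction. MiniLoader is my simplified model of DataLoader's batching and caching, not the real implementation (it schedules dispatch with queueMicrotask rather than DataLoader's enqueuePostPromiseJob), but it exhibits the same cached-promise leak:

```javascript
// Simplified model of DataLoader: batches load() calls per cycle, and
// caches the *promise* per key across cycles (the behavior at issue).
class MiniLoader {
  constructor (batchFn) {
    this._batchFn = batchFn
    this._cache = new Map() // key -> promise, shared across queue cycles
    this._queue = []
  }
  load (key) {
    // A key loaded in an earlier cycle returns its already-resolved promise
    if (this._cache.has(key)) return this._cache.get(key)
    const promise = new Promise(resolve => {
      this._queue.push({ key, resolve })
      if (this._queue.length === 1) queueMicrotask(() => this._dispatch())
    })
    this._cache.set(key, promise)
    return promise
  }
  _dispatch () {
    const queue = this._queue
    this._queue = []
    this._batchFn(queue.map(item => item.key)).then(values => {
      queue.forEach((item, i) => item.resolve(values[i]))
    })
  }
}

const stateBatches = []
const userLoader = new MiniLoader(async keys => keys.map(id => `user:${id}`))
const stateLoader = new MiniLoader(async keys => {
  stateBatches.push(keys)
  return keys.map(user => `state:${user}`)
})

// The resolver from above: load the user, then load the state
async function state (userId) {
  const user = await userLoader.load(userId)
  return stateLoader.load(user)
}

const done = (async () => {
  await userLoader.load('creator') // creator resolved in an earlier cycle
  await Promise.all(['creator', 'member'].map(state))
  console.log(stateBatches.length) // 2: the creator's state loads in its own batch
})()
```

The creator's cached user promise resolves a microtask earlier than the member's fresh one, so the two stateLoader.load() calls land in different queue cycles and produce two backend requests instead of one.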

The calls to the state resolver happen within the same tick. The problem is that the previous context.userLoader.load(creatorId) call leaks into this tick, because the cached promise is returned: https://github.com/facebook/dataloader/blob/25cf990f42f85c4cbffc3a6b55a45df4c0575c6b/src/index.js#L89

I'm proposing that, within a tick, calls to DataLoader#load(key) with the same key return the same promise, while across ticks they return different promises. All promises returned within a tick resolve simultaneously. Once a key is cached, each promise for that key resolves to the same cached value.

This could be done by maintaining an explicit per-queue cache, which is reset in dispatchQueue().
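As a sketch of the proposal (a simplified standalone model, not a patch against DataLoader's actual internals): load() consults a per-cycle promise cache that the dispatch step resets, while resolved values persist in a separate value cache. Cached keys still enqueue, so their promises resolve in step with the current cycle's batch:

```javascript
// Sketch of the proposed behavior: promises are cached per queue cycle,
// values are cached forever. All promises from one cycle resolve together.
class QueueCachedLoader {
  constructor (batchFn) {
    this._batchFn = batchFn
    this._values = new Map()     // key -> value, cached across cycles
    this._cycleCache = new Map() // key -> promise, reset every dispatch
    this._queue = []
  }
  load (key) {
    if (this._cycleCache.has(key)) return this._cycleCache.get(key)
    // Even a value-cached key enqueues, so it resolves with this cycle
    const promise = new Promise(resolve => {
      this._queue.push({ key, resolve })
      if (this._queue.length === 1) queueMicrotask(() => this._dispatch())
    })
    this._cycleCache.set(key, promise)
    return promise
  }
  _dispatch () {
    const queue = this._queue
    this._queue = []
    this._cycleCache = new Map() // next tick's load() calls get fresh promises
    const misses = queue.filter(item => !this._values.has(item.key))
    const batch = misses.length > 0
      ? this._batchFn(misses.map(item => item.key))
      : Promise.resolve([])
    batch.then(values => {
      misses.forEach((item, i) => this._values.set(item.key, values[i]))
      // Cached and freshly loaded keys resolve simultaneously
      queue.forEach(item => item.resolve(this._values.get(item.key)))
    })
  }
}

const stateBatches = []
const userLoader = new QueueCachedLoader(async keys => keys.map(id => `user:${id}`))
const stateLoader = new QueueCachedLoader(async keys => {
  stateBatches.push(keys)
  return keys.map(user => `state:${user}`)
})

async function state (userId) {
  const user = await userLoader.load(userId)
  return stateLoader.load(user)
}

const done = (async () => {
  await userLoader.load('creator') // creator resolved in an earlier cycle
  await Promise.all(['creator', 'member'].map(state))
  console.log(stateBatches) // one batch containing both users' states
})()
```

With the per-cycle promise cache, the creator's user promise no longer resolves ahead of the member's, so both stateLoader.load() calls land in the same cycle and batch into a single request.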

Let me know if this behavior seems sensible. I can do a PR for it, too.
