
Eager seq ix #965 (Open)

treeowl wants to merge 2 commits into ekmett:master from treeowl:eager-seq-ix

Conversation

treeowl (Contributor) commented Jan 12, 2021

  • Make ix look up eagerly for Seq.
  • Make ix use !? for vectors, which should eventually make those lookups eager too.

Use `!?` for vector `ix` rather than manual range calculations.
Once [this PR](haskell/vector#347) lands,
that will also make the lookups eager, as they should be.
phadej (Collaborator) left a comment:

Does "eager" here mean that we don't retain the original sequence/vector, i.e. that f x is f <element value> rather than f <thunk that looks up the element>?

If my understanding is right, could you amend the commit message(s) to say so, to make it clearer what these changes do?

    Nothing -> pure m
#else
ix i f m
  | 0 <= i && i < Seq.length m = f (Seq.index m i) <&> \a -> Seq.update i a m
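For context, the eager variant can be sketched as a standalone traversal built on Seq.lookup (available in containers >= 0.5.8). This is a minimal sketch, not the PR's actual code; the name ixSeq is hypothetical, and the Identity usage stands in for lens's setter machinery. Scrutinizing the Maybe with case forces the lookup to happen immediately rather than leaving it suspended in a thunk that retains the whole sequence.

```haskell
import Data.Functor.Identity (Identity (..))
import Data.Sequence (Seq)
import qualified Data.Sequence as Seq

-- Hypothetical eager ix-style traversal for Seq.
-- The case on Seq.lookup's Maybe performs the lookup eagerly,
-- without forcing the element itself.
ixSeq :: Applicative f => Int -> (a -> f a) -> Seq a -> f (Seq a)
ixSeq i f m = case Seq.lookup i m of
  Just a  -> (\a' -> Seq.update i a' m) <$> f a
  Nothing -> pure m

main :: IO ()
main = do
  let s = Seq.fromList [10, 20, 30] :: Seq Int
  -- In-range index: element 1 is replaced.
  print (runIdentity (ixSeq 1 (\_ -> Identity 99) s))
  -- Out-of-range index: the sequence is returned unchanged.
  print (runIdentity (ixSeq 5 (\_ -> Identity 99) s))
```

Running this prints `fromList [10,99,30]` followed by `fromList [10,20,30]`.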
phadej (Collaborator) commented on the diff:

Is it impossible to seq things here to avoid the thunk? Would

let x = Seq.index m i in x `seq` (f x <&> ...)

do anything?

treeowl (Contributor, author) replied Jan 13, 2021:

Using seq with index would force too much: not only the lookup, but also the element itself.
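The distinction can be demonstrated directly (a standalone sketch, assuming containers >= 0.5.8 for Seq.lookup): forcing the Maybe returned by Seq.lookup performs the lookup but leaves the element as a thunk, whereas forcing the result of Seq.index evaluates the element itself.

```haskell
import Control.Exception (SomeException, evaluate, try)
import qualified Data.Sequence as Seq

main :: IO ()
main = do
  let s = Seq.fromList [undefined :: Int]
  -- Matching on the Just constructor forces the lookup to happen,
  -- but never touches the undefined element: no exception.
  case Seq.lookup 0 s of
    Just _  -> putStrLn "lookup forced, element untouched"
    Nothing -> putStrLn "out of range"
  -- Forcing Seq.index's result forces the element, which is
  -- undefined here, so evaluate raises an exception.
  r <- try (evaluate (Seq.index s 0)) :: IO (Either SomeException Int)
  putStrLn (either (const "index forced the element: exception")
                   (const "no exception") r)
```

This prints "lookup forced, element untouched" followed by "index forced the element: exception", which is why `seq` on `Seq.index m i` would be observably stricter than the eager lookup.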

phadej (Collaborator) replied:

I see. Since containers newer than 0.5.8 has been bundled with GHC since 8.2, I don't think we need to do anything here. (If someone has problems due to this laziness, they can upgrade GHC and/or containers.)

treeowl (Contributor, author) replied:

@phadej Do you mean we shouldn't worry about the leak for older containers? I agree. Trying to plug it in that context will be very inefficient when f is actually strict.

phadej (Collaborator) left a comment:

Please add a changelog entry. If you explain there what this change means (not just "eager seq ix"), then we don't need to repeat the explanation in the commit messages.

treeowl (Contributor, author) commented Jan 13, 2021

@phadej For Data.Sequence, lookup isn't O(1), and this PR can make lookups happen when they're not actually necessary. I can write some fancy rewrite rules to take care of the case where ix is (obviously) used to set without inspecting. This involves a dance with non-linear patterns and carefully buried bottom values. But I'm kind of thinking all that gunk would fit better in containers than in lens.

RyanGlScott (Collaborator) commented:

What is the status of this PR? If I'm reading correctly, this is very nearly done modulo documentation/changelog concerns. (I'm asking since I'm doing a roundup of recent PRs before an upcoming lens release.)

treeowl (Contributor, author) commented Nov 1, 2021

Haven't looked at this in ages. Can try to look today.

phadej (Collaborator) commented May 16, 2022

ping @treeowl

treeowl (Contributor, author) commented May 16, 2022

Pong! I will try to look at this today.



3 participants