
fix Range (and RangeBackwards) mutex usage#164

Merged
davseby merged 2 commits into jellydator:v3 from iurii-ssv:fix-range-mutex-usage
Jan 18, 2025

Conversation

@iurii-ssv
Contributor

I think I've run into some sort of deadlocking issue with ttlcache recently. I haven't been able to pinpoint it exactly yet, but on a cursory look at the Range method its mutex usage doesn't seem 100% correct.

In particular, it seems we might evaluate the item != c.items.lru.Back().Next(); item = item.Next() part of the for loop without c.items.mu read-locked, which would result in a thread-unsafe read (I'm not sure whether reading item.Next() without the mutex held is fine, but reading c.items.lru.Back() probably isn't).

The same reasoning applies to RangeBackwards, I guess, but I haven't changed it yet in this PR (waiting for an initial review first).

@davseby
Contributor

davseby commented Jan 15, 2025

Thank you for your pull request. I managed to reproduce the race condition via go test . -race -count=100 and tweaking the test to set a value while performing the Range method.

WARNING: DATA RACE
Write at 0x00c0007ef8e8 by goroutine 2383:
...
ttlcache/v3.(*Cache[go.shape.string,go.shape.string]).set()
ttlcache/cache.go:170 +0x35c
Previous read at 0x00c0007ef8e8 by goroutine 2384:
...
ttlcache/v3.(*Cache[go.shape.string,go.shape.string]).Range()
ttlcache/cache.go:558

I think your solution should suffice, could you apply the same fix to the RangeBackwards as well?

@davseby davseby self-requested a review January 15, 2025 13:37
@iurii-ssv
Contributor Author

I think your solution should suffice, could you apply the same fix to the RangeBackwards as well?

Yes, I've pushed another commit to this PR just now - 384f4cb

@iurii-ssv iurii-ssv changed the title fix Range mutex usage fix Range (and RangeBackwards) mutex usage Jan 15, 2025
@iurii-ssv
Contributor Author

Hey @davseby, wondering if this fix can be merged and released any time soon?

That way I can confirm whether or not it is the root cause of the "deadlock of sorts" I observe in my code that relies on ttlcache.

