Feature/voxel annotation #858
Conversation
Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). View this failed invocation of the CLA check for more information. For the most up-to-date status, view the checks section at the bottom of the pull request.

Can you complete the CLA?

The brush hover outline (the circle where the mouse pointer is) seems to go away in some cases when changing the zoom level.
- Introduced a new dummy `MultiscaleVolumeChunkSource`. - Added `VoxelAnnotationRenderLayer` for voxel annotation rendering. - Implemented `VoxUserLayer` with dummy data source and rendering. - Added tools and logs for voxel layer interactions and debugging. - Documented voxel annotation specification and implementation details.
- Added a backend `VoxDummyChunkSource` that generates a checkerboard pattern for voxel annotations. - Implemented frontend `VoxDummyChunkSource` with RPC pairing to the backend. - Updated documentation with details on chunk source architecture and implementation.
…t seems there is some fighting.
…s to corrupted the chunk after the usage of the tool. Added a frontend buffer which is the only drawing storage for now. Added user settings to set the voxel_annotation layer scale and bounds. Added a second empty source to DummyMultiscaleVolumeChunkSource to prevent crashes when zoomed out too much
…lobal one (there was a missing conversion) ; add a primitive brush tool
…umeChunkSource and update related imports
…panded brush settings
…ation, and improved backend edit handling
…map options and UI settings
…r remote workflows, label creation, and new drawing tools
… of life features; add a different color to the cursors in eraser mode
…g causing the preview to sometimes disappear
Hello, I set up a live demo to let everyone easily try the feature and give feedback on UI, ergonomics, and bugs. Watch out: there is persistent storage, so your annotations will be saved and will override the ones already present.
@briossant thanks for setting that up! Some UX feedback. These tools aren't showing the keybind. It isn't obvious when the brush tool is active unless the brush size is large. Annotation layers currently indicate an active tool with this entry in the top bar (the "annotate point"), though I don't think that is obvious enough, particularly when you have multiple tool keybinds which fill up the top row. I would suggest trying a bottom bar; you could populate it with brush settings, along with keybinds that only become active while the brush tool is active. This is what our edit toolbars look like.
…otation tools ; restructure draw tab: unify tool and control definitions and refactor related toolbox generation logic.
…nd.ts to frontend.ts and backend.ts, renamed layer/vox/ to layer/voxel-annotation, moved layer/vox/tabs/tools.ts to layer/voxel-annotation/draw_tab.ts
Appreciate the feedback! I fixed the missing shortcuts; and the bottom bar is a good idea, it would clean up the tab. Do you know where I could find the implementation for the one you shared with me?
Check out the activate functions here: neuroglancer/src/datasource/graphene/frontend.ts Line 2388 in f0ffc01
neuroglancer/src/datasource/graphene/frontend.ts Line 2079 in f0ffc01
Also check out MULTICUT_SEGMENTS_INPUT_EVENT_MAP to see dynamic (when tool is active) keybinds.
…istic renderlayer on the seg layer -> merged the draw call of both renderlayers so it uses the depth buffer and can benefit from the depth function WebGL2RenderingContext.LESS ; the next step is to introduce a sentinel or a mask to handle the erasing case
… the respective tool-specific settings have been moved to this bar
…er in the segmentation layer context
…ce at once since the current painting pipeline will only draw to one source
I've updated the demo to the latest commit, this includes:
… the painted area
…ng for non-axis-aligned slices
@briossant It feels better! These controls are draggable right now, which prevents dragging the slider. Have you thought about having ctrl+mousedown+shift trigger erasing, and then have the erase setting just be a toggle between everything and selected?
…nting tools non-draggable
…n0 ; rename voxel editing properties
…ts keybind is displayed to the user
…n control+shift is pressed
Yeah erasing feels better like this, I've updated the demo with the refactored erasing. |
…able source is selected
Hello, I've cleared my current todo list and am standing by for review, or further feedback. One question regarding scope: @jbms mentioned keeping new datasources and kvstores separate, but would you recommend adding write support for existing datasources (e.g. precomputed and n5) in this PR?
},
);

export async function proxyWrite(
I would remove this then, because when we later add it, it will be clearer that it was a requirement for that functionality.
  multiscale: MultiscaleVolumeChunkSource | undefined,
  lodIndex: number,
): VolumeChunkSource {
  if (!multiscale)
I think it is best to use a block statement for if statements that aren't single-line.
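For illustration, here is the guard clause written with the suggested block statement. The types and the helper name `getLodSource` are stand-ins, not the PR's actual `MultiscaleVolumeChunkSource` / `VolumeChunkSource` definitions.

```typescript
// Stand-in types for the sketch; the real ones live in Neuroglancer.
interface VolumeChunkSource { id: string; }
interface MultiscaleVolumeChunkSource { lods: VolumeChunkSource[]; }

// Hypothetical helper mirroring the reviewed signature.
function getLodSource(
  multiscale: MultiscaleVolumeChunkSource | undefined,
  lodIndex: number,
): VolumeChunkSource {
  // Braces even for a one-line body, per the review suggestion.
  if (!multiscale) {
    throw new Error("multiscale source is not available");
  }
  return multiscale.lods[lodIndex];
}

console.log(getLodSource({ lods: [{ id: "lod0" }] }, 0).id); // "lod0"
```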
const value = valueGetter(false);

const pushIf = async (point: Float32Array) => {
  const v = await getEnsuredValue(point);
From some performance testing of the paint tool, it seems you can get a significant speedup for the UI thread if you only check getEnsuredValue if filterValue !== undefined and ignore the v === value check. Though I believe that would cause some unnecessary writes so perhaps that logic could be moved to the backend.
I'm wondering how much of this logic could be moved to the backend, if the only thing the frontend did when receiving input was to forward that user input to the backend and have the backend do all the voxel calculations.
Yes, the v === value check can be removed. This would indeed speed up the function, and I can add the check back in the backend to avoid unnecessary writes. As for moving the whole logic to the backend: for the brush, I believe this would lengthen the preview delay, resulting in a less responsive user experience; but for the flood fill it has fewer downsides, as the preview is already slow, and it would also fix the slowdown on big regions.
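A toy sketch of the split being discussed: the frontend's pushIf stops reading values per voxel and just forwards points, while the no-op check (current value already equals the target) moves to the backend, which skips those writes. All class and method names here are illustrative, not the PR's actual code.

```typescript
type Point = [number, number, number];

// Backend side: a toy voxel store keyed by "x,y,z".
class BackendVoxelStore {
  private voxels = new Map<string, number>();
  writesPerformed = 0;

  applyEdits(points: Point[], value: number): void {
    for (const p of points) {
      const key = p.join(",");
      // Dedup moved here from the frontend: skip voxels already at `value`.
      if (this.voxels.get(key) === value) continue;
      this.voxels.set(key, value);
      this.writesPerformed++;
    }
  }
  get(p: Point): number | undefined { return this.voxels.get(p.join(",")); }
}

// Frontend side: just batches points and forwards them (no per-voxel reads,
// so the UI thread never blocks on getEnsuredValue).
const pending: Point[] = [];
const pushIf = (point: Point) => { pending.push(point); };

pushIf([1, 2, 3]);
pushIf([1, 2, 3]); // duplicate: the frontend no longer filters it
const store = new BackendVoxelStore();
store.applyEdits(pending, 42);
console.log(store.writesPerformed); // 1: the second edit was a no-op
```

The trade-off matches the comment above: the frontend gets faster because it stops awaiting value reads, at the cost of sending some redundant edits over RPC that the backend then discards.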
Performing a large sphere brush stroke is the main performance issue I've noticed




This Draft Pull Request introduces an interactive voxel annotation feature, allowing users to perform manual segmentation by painting directly onto volumetric layers. This implementation is based on the proposal in Issue #851 and incorporates the feedback from @jbms.
Here is a live demo to try the feature. Watch out: there is persistent storage, so your annotations will be saved and will override the ones already present: OPEN DEMO VIEWER
Key Changes & Architectural Overview
Following the discussion, this implementation has been significantly revised from the initial prototype:
Instead of a separate vox layer type, the voxel editing functionality is now integrated directly into ImageUserLayer and SegmentationUserLayer via a UserLayerWithVoxelEditingMixin. This mixin adds a new "Draw" tab in the UI.

New Tool System: The Brush and Flood Fill tools are implemented as toggleable LayerTools, while the Picker tool is a one-shot tool. All integrate with Neuroglancer's new tool system. The drawing action is bound to Ctrl + Left Click.
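The mixin approach above can be sketched with TypeScript's standard class-mixin pattern. The name UserLayerWithVoxelEditingMixin mirrors the PR's, but the bodies are stand-ins, not the actual Neuroglancer implementation.

```typescript
type Constructor<T = {}> = new (...args: any[]) => T;

// Stand-in for Neuroglancer's UserLayer base class.
class UserLayerBase {
  tabs: string[] = [];
}

// Mixin: augments any UserLayer-like class with a "Draw" tab and edit state.
function UserLayerWithVoxelEditingMixin<TBase extends Constructor<UserLayerBase>>(
  Base: TBase,
) {
  return class extends Base {
    voxelEditingEnabled = true;
    constructor(...args: any[]) {
      super(...args);
      this.tabs.push("Draw");
    }
  };
}

class ImageUserLayer extends UserLayerBase {}
class SegmentationUserLayer extends UserLayerBase {}

// Both existing layer types gain the Draw tab without a new `vox` layer type.
const layer = new (UserLayerWithVoxelEditingMixin(ImageUserLayer))();
console.log(layer.tabs); // ["Draw"]
```

The benefit over a dedicated layer type is that editing composes with each layer's existing state (shader, display settings) instead of duplicating it.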
Optimistic Preview for Compressed Chunks: To provide immediate visual feedback and solve the performance problem with compressed chunks, edits are now rendered through an optimistic preview layer.
The preview is backed by an InMemoryVolumeChunkSource and rendered by the layer's own RenderLayer (e.g., ImageRenderLayer or SegmentationRenderLayer). This ensures the preview perfectly matches the user's existing shader and display settings.

Data-flow
```mermaid
sequenceDiagram
    participant User
    participant Tool as VoxelBrushTool
    participant ControllerFE as VoxelEditController (FE)
    participant EditSourceFE as OverlayChunkSource (FE)
    participant BaseSourceFE as VolumeChunkSource (FE)
    participant ControllerBE as VoxelEditController (BE)
    participant BaseSourceBE as VolumeChunkSource (BE)
    User->>Tool: Mouse Down/Drag
    Tool->>ControllerFE: paintBrushWithShape(mouse, ...)
    ControllerFE->>ControllerFE: Calculates affected voxels and chunks
    ControllerFE->>EditSourceFE: applyLocalEdits(chunkKeys, ...)
    activate EditSourceFE
    EditSourceFE->>EditSourceFE: Modifies its own in-memory chunk data
    note over EditSourceFE: This chunk's texture is re-uploaded to the GPU
    deactivate EditSourceFE
    ControllerFE->>ControllerBE: commitEdits(edits, ...) [RPC]
    activate ControllerBE
    ControllerBE->>ControllerBE: Debounces and batches edits
    ControllerBE->>BaseSourceBE: applyEdits(chunkKeys, ...)
    activate BaseSourceBE
    BaseSourceBE-->>ControllerBE: Returns VoxelChange (for undo stack)
    deactivate BaseSourceBE
    ControllerBE->>ControllerFE: callChunkReload(chunkKeys) [RPC]
    activate ControllerFE
    ControllerFE->>BaseSourceFE: invalidateChunks(chunkKeys)
    note over BaseSourceFE: BaseSourceFE re-fetches chunk with the now-permanent edit.
    ControllerFE->>EditSourceFE: clearOptimisticChunk(chunkKeys)
    deactivate ControllerFE
    ControllerBE->>ControllerBE: Pushes change to Undo Stack & enqueues for downsampling
    deactivate ControllerBE
    loop Downsampling & Reload Cascade
        ControllerBE->>ControllerBE: downsampleStep(chunkKeys)
        ControllerBE->>ControllerFE: callChunkReload(chunkKeys) [RPC]
        activate ControllerFE
        ControllerFE->>BaseSourceFE: invalidateChunks(chunkKeys)
        note over BaseSourceFE: BaseSourceFE re-fetches chunk with the now-permanent edit.
        ControllerFE->>EditSourceFE: clearOptimisticChunk(chunkKeys)
        deactivate ControllerFE
    end
```
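The optimistic-preview lifecycle in the data-flow above can be modeled in a few lines. Method names follow the diagram (applyLocalEdits, commitEdits, clearOptimisticChunk), but this is an illustrative sketch under those naming assumptions, not the PR's actual classes.

```typescript
// Toy overlay source: holds locally edited chunk data until the base source
// has re-fetched the committed version.
class OverlayChunkSource {
  private optimistic = new Map<string, number>(); // chunkKey -> edited value

  applyLocalEdits(chunkKey: string, value: number): void {
    this.optimistic.set(chunkKey, value); // immediate on-screen preview
  }
  clearOptimisticChunk(chunkKey: string): void {
    this.optimistic.delete(chunkKey); // base source now holds the edit
  }
  has(chunkKey: string): boolean { return this.optimistic.has(chunkKey); }
}

// Toy backend controller: persists the edit, then asks the frontend to reload.
class BackendController {
  committed = new Map<string, number>();
  commitEdits(chunkKey: string, value: number, onChunkReload: (key: string) => void): void {
    this.committed.set(chunkKey, value);
    onChunkReload(chunkKey); // stands in for the callChunkReload RPC
  }
}

const overlay = new OverlayChunkSource();
const backend = new BackendController();

overlay.applyLocalEdits("0,0,0", 7);   // user paints: preview shows at once
console.log(overlay.has("0,0,0"));     // true
backend.commitEdits("0,0,0", 7, (key) => {
  // After reload the base chunk contains the edit, so drop the overlay copy.
  overlay.clearOptimisticChunk(key);
});
console.log(overlay.has("0,0,0"));     // false
```

The key invariant is that at every moment a chunk's visible data comes from either the overlay or the freshly reloaded base source, so the user never sees the edit disappear between commit and reload.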
Dataset creation

To complete Neuroglancer's writing capabilities, a dataset metadata creation/initialization feature was introduced. The workflow is triggered when a user provides a URL to a data source that does not resolve: Neuroglancer recognizes the potential intent to create a new dataset and prompts the user. Finally, the user is able to access the dataset creation form.

Data sources & Kvstores
Currently, there is a very limited set of supported data sources and kvstores, which are:
- opfs: in-browser storage, also used for local development; its relevancy can be discussed at some point.
- ssa+https: a kvstore linked to an in-development project, a stateless (thanks to OAuth 2.0) worker providing signed URLs to read/write in S3 stores.

Limitations
Open Questions & Future Work
This PR focuses on establishing the core architecture. Several larger topics from the original discussion are noted here as future work:
Checklist
[ ] Added support for more (every?) datasources and kvstores