feat(toolkit): implement backfill SectionBloom function #6390


Open · wants to merge 8 commits into `develop`
30 changes: 30 additions & 0 deletions plugins/README.md
@@ -145,3 +145,33 @@ NOTE: large db may GC overhead limit exceeded.
- `<src>`: Source path for database. Default: output-directory/database
- `--db`: db name.
- `-h | --help`: provide the help info

## DB Backfill-Bloom

DB backfill bloom provides the ability to backfill SectionBloom data for historical blocks to enable `eth_getLogs` address/topics filtering. This is useful when `isJsonRpcFilterEnabled` was disabled during block processing and later enabled, causing historical blocks to lack SectionBloom data.
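The idea behind a per-section bloom index can be sketched in plain Java: blocks are grouped into fixed-size sections, each block's log bloom is derived from its log addresses and topics, and the per-block blooms are transposed into one bit-vector per bloom bit position so a filter query can scan a whole section at once. The section size, bloom width, and hash below are illustrative placeholders, not java-tron's actual SectionBloom parameters:

```java
import java.util.Arrays;
import java.util.BitSet;

public class SectionBloomSketch {
  // Illustrative parameters; java-tron's real section size and bloom width may differ.
  static final int SECTION_SIZE = 4096;  // blocks per section
  static final int BLOOM_BITS = 2048;    // bits in a per-block log bloom

  // Toy per-block bloom: set 3 bit positions per log address/topic.
  // A real implementation derives the positions from a keccak hash.
  static BitSet blockBloom(byte[][] items) {
    BitSet bloom = new BitSet(BLOOM_BITS);
    for (byte[] item : items) {
      int h = Arrays.hashCode(item);  // placeholder hash, NOT keccak
      for (int i = 0; i < 3; i++) {
        bloom.set(Math.floorMod(h >> (i * 11), BLOOM_BITS));
      }
    }
    return bloom;
  }

  // Transpose per-block blooms into one row per bloom bit position:
  // rows[bit].get(n) is true iff block n in the section has that bloom bit set.
  static BitSet[] buildSection(BitSet[] blockBlooms) {
    BitSet[] rows = new BitSet[BLOOM_BITS];
    for (int bit = 0; bit < BLOOM_BITS; bit++) {
      rows[bit] = new BitSet(SECTION_SIZE);
    }
    for (int n = 0; n < blockBlooms.length; n++) {
      for (int bit = blockBlooms[n].nextSetBit(0); bit >= 0;
           bit = blockBlooms[n].nextSetBit(bit + 1)) {
        rows[bit].set(n);
      }
    }
    return rows;
  }

  public static void main(String[] args) {
    BitSet b = blockBloom(new byte[][]{"contract-address".getBytes()});
    BitSet[] rows = buildSection(new BitSet[]{b});
    System.out.println("bits set in block bloom: " + b.cardinality());
    System.out.println("rows in section index: " + rows.length);
  }
}
```

With this layout, an `eth_getLogs` query can AND the rows for an address's bloom bit positions to get candidate block numbers for a whole section in one pass, instead of checking every block's bloom individually.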

### Available parameters:

- `-d | --database-directory`: Specify the database directory path. It is used to open the database, read transaction logs, and write the SectionBloom data back. Default: output-directory/database.
- `-s | --start-block`: Specify the start block number for backfill (required).
- `-e | --end-block`: Specify the end block number for backfill (optional, default: latest block).
- `-c | --max-concurrency`: Specify the maximum concurrency for processing, default: 8.
- `-h | --help`: Display help information.

### Examples:

```shell script
# full command
java -jar Toolkit.jar db backfill-bloom [-h] -s=<startBlock> [-e=<endBlock>] [-d=<databaseDirectory>] [-c=<maxConcurrency>] [-f=<forceFlush>]
# examples
java -jar Toolkit.jar db backfill-bloom -s 1000000 -e 2000000 #1. backfill blocks 1000000 to 2000000
java -jar Toolkit.jar db backfill-bloom -s 1000000 -d /path/to/database #2. specify custom database directory
java -jar Toolkit.jar db backfill-bloom -s 1000000 -c 8 #3. set the max concurrency to 8 threads
```

### Backfill speed

The time required to process different block ranges varies. It is recommended to increase `--max-concurrency` appropriately to speed up the backfill process.

- 0-10,000,000: completes almost instantly because these blocks contain no logs.
- 10,000,000-70,000,000: takes roughly 3-4 hours per 10,000,000 blocks with `--max-concurrency` set to 32.
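The throughput figures above translate into a simple runtime estimate. The helper below is hypothetical (not part of the Toolkit) and assumes the midpoint of the quoted 3-4 hours per 10,000,000 blocks at `-c 32`:

```java
public class BackfillEstimate {
  // Assumption: ~3.5 hours per 10,000,000 blocks, the midpoint of the
  // 3-4 hour range observed with --max-concurrency 32.
  static double estimateHours(long startBlock, long endBlock) {
    double hoursPer10M = 3.5;
    return (endBlock - startBlock) / 10_000_000.0 * hoursPer10M;
  }

  public static void main(String[] args) {
    // Backfilling blocks 10M..70M: 6 * 3.5 = 21.0 hours
    System.out.printf("%.1f hours%n", estimateHours(10_000_000L, 70_000_000L));
  }
}
```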
1 change: 1 addition & 0 deletions plugins/build.gradle
@@ -33,6 +33,7 @@ dependencies {
implementation 'io.github.tronprotocol:leveldbjni-all:1.18.2'
implementation 'io.github.tronprotocol:leveldb:1.18.2'
implementation project(":protocol")
implementation project(":chainbase")
}

check.dependsOn 'lint'
1 change: 1 addition & 0 deletions plugins/src/main/java/org/tron/plugins/Db.java
@@ -12,6 +12,7 @@
DbConvert.class,
DbLite.class,
DbCopy.class,
DbBackfillBloom.class,
DbRoot.class
},
commandListHeading = "%nCommands:%n%nThe most commonly used db commands are:%n"