@e1732a364fed e1732a364fed commented Jun 8, 2025

The init logic of TTS_Config in this project reads the version entry from tts_infer.yaml:

version = configs.get("version", "v2").lower()

However, this entry is not written back when saving, which silently discards the user's version setting. On the next load, the default v2 is used instead:

    def save_configs(self, configs_path: str = None) -> None:
        configs = deepcopy(self.default_configs)
        if self.configs is not None:
            configs["custom"] = self.update_configs()

        if configs_path is None:
            configs_path = self.configs_path
        with open(configs_path, "w") as f:
            yaml.dump(configs, f)
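A minimal standalone sketch of the round trip (hypothetical names `default_configs`, `user_configs`, and a simplified `save_configs`, not the project's actual class) showing how the top-level key is lost:

```python
import yaml
from copy import deepcopy

# Hypothetical minimal reproduction: saving starts from default_configs,
# which has no top-level "version", so the key the user set is dropped.
default_configs = {"custom": {}}                # stand-in for self.default_configs
user_configs = {"version": "v4", "custom": {}}  # what the user configured

def save_configs(configs):
    # Mirrors the buggy save path: copy the defaults, fill in "custom",
    # but never write "version" back.
    out = deepcopy(default_configs)
    out["custom"] = configs["custom"]
    return yaml.dump(out)

saved = save_configs(user_configs)
reloaded = yaml.safe_load(saved)
# Same lookup TTS_Config performs on load:
version = reloaded.get("version", "v2").lower()
print(version)  # → v2, the user's v4 is gone
```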

After the fix, it becomes:

    def save_configs(self, configs_path: str = None) -> None:
        configs = deepcopy(self.default_configs)
        if self.configs is not None:
            configs["custom"] = self.update_configs()
            configs["version"] = self.version

        if configs_path is None:
            configs_path = self.configs_path
        with open(configs_path, "w") as f:
            yaml.dump(configs, f)

Additionally, the default tts_infer.yaml does not include a version entry at all; the user has to add that line manually (not the version inside custom, v1, v2, v3, or v4, but a top-level version key in the config file).

With the current code, if the default tts_infer.yaml lacks this top-level entry and the user has not read the code, then even after changing version inside custom to v4, the effective version will still be resolved as v2!

Therefore I suggest that, after merging this fork, tts_infer.yaml be updated to add the top-level version line, as follows:

custom:
  bert_base_path: GPT_SoVITS/pretrained_models/chinese-roberta-wwm-ext-large
  cnhuhbert_base_path: GPT_SoVITS/pretrained_models/chinese-hubert-base
  device: cuda
  is_half: true
  t2s_weights_path: GPT_SoVITS/pretrained_models/s1v3.ckpt
  version: v4
  vits_weights_path: GPT_SoVITS/pretrained_models/gsv-v4-pretrained/s2Gv4.pth
v1:
  bert_base_path: GPT_SoVITS/pretrained_models/chinese-roberta-wwm-ext-large
  cnhuhbert_base_path: GPT_SoVITS/pretrained_models/chinese-hubert-base
  device: cpu
  is_half: false
  t2s_weights_path: GPT_SoVITS/pretrained_models/s1bert25hz-2kh-longer-epoch=68e-step=50232.ckpt
  version: v1
  vits_weights_path: GPT_SoVITS/pretrained_models/s2G488k.pth
v2:
  bert_base_path: GPT_SoVITS/pretrained_models/chinese-roberta-wwm-ext-large
  cnhuhbert_base_path: GPT_SoVITS/pretrained_models/chinese-hubert-base
  device: cpu
  is_half: false
  t2s_weights_path: GPT_SoVITS/pretrained_models/gsv-v2final-pretrained/s1bert25hz-5kh-longer-epoch=12-step=369668.ckpt
  version: v2
  vits_weights_path: GPT_SoVITS/pretrained_models/gsv-v2final-pretrained/s2G2333k.pth
v3:
  bert_base_path: GPT_SoVITS/pretrained_models/chinese-roberta-wwm-ext-large
  cnhuhbert_base_path: GPT_SoVITS/pretrained_models/chinese-hubert-base
  device: cpu
  is_half: false
  t2s_weights_path: GPT_SoVITS/pretrained_models/s1v3.ckpt
  version: v3
  vits_weights_path: GPT_SoVITS/pretrained_models/s2Gv3.pth
v4:
  bert_base_path: GPT_SoVITS/pretrained_models/chinese-roberta-wwm-ext-large
  cnhuhbert_base_path: GPT_SoVITS/pretrained_models/chinese-hubert-base
  device: cpu
  is_half: false
  t2s_weights_path: GPT_SoVITS/pretrained_models/s1v3.ckpt
  version: v4
  vits_weights_path: GPT_SoVITS/pretrained_models/gsv-v4-pretrained/s2Gv4.pth
version: v4
