[hts-users:03076] Re: the order of CMLLR transforms being applied to models in the course of HTS engine generation
- Subject: [hts-users:03076] Re: the order of CMLLR transforms being applied to models in the course of HTS engine generation
- From: "Heiga ZEN (Byung Ha CHUN)" <heigazen@xxxxxxxxxx>
- Date: Fri, 21 Oct 2011 11:39:45 +0100
- Cc: "hts-users@xxxxxxxxxxxxxxx" <hts-users@xxxxxxxxxxxxxxx>
Hi,
2011/10/19 Hui LIANG <tshlmail-hts@xxxxxxxxx>:
> This is exactly my concern. According to my understanding of how a cascade of CMLLR transforms is estimated/trained, the parent transforms (set_A) are applied to the features, so set_B should bridge the gap between the speaker-independent models and the adapted features. Consequently, during HTS engine generation, I feel that it is set_B that should be applied to the speaker-independent models first. In other words, set_B (or set_A) is "closer" to the model (or feature) side during training, and set_B should still be "closer" to the model side during HTS engine generation.
>
> Does swapping the two sets of CMLLR transforms make no/negligible difference? Or is my understanding wrong?
Ah, I see. I think you are correct; this is a bug and should be fixed.
When CMLLR transforms are cascaded and applied to the models rather
than to the features, the current transform should be applied first,
and then its parent transforms.
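The order reversal can be checked algebraically with a toy affine cascade. The following is a minimal numpy sketch, not HTS code: the matrices, bias vectors, and helper names are all illustrative, and it only shows that moving a feature-side cascade (parent applied first) onto the model means requires inverting the current transform first, then the parent.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 3

# Parent transform (set_A) and current transform (set_B), each affine.
A_mat, A_bias = rng.standard_normal((dim, dim)), rng.standard_normal(dim)
B_mat, B_bias = rng.standard_normal((dim, dim)), rng.standard_normal(dim)

x = rng.standard_normal(dim)   # a raw feature vector
mu = rng.standard_normal(dim)  # a speaker-independent model mean

def apply(mat, bias, v):
    return mat @ v + bias

def apply_inv(mat, bias, v):
    # Inverse of the affine map v -> mat @ v + bias.
    return np.linalg.solve(mat, v - bias)

# Feature side (training): parent first, then current: y = B(A(x)).
y = apply(B_mat, B_bias, apply(A_mat, A_bias, x))

# Model side (engine generation): to keep the residual consistent,
# the CURRENT transform's inverse must hit the mean first, then the
# PARENT's inverse:  mu' = A^{-1}(B^{-1}(mu)).
mu_model = apply_inv(A_mat, A_bias, apply_inv(B_mat, B_bias, mu))

# The feature-side residual equals the model-side residual up to the
# overall linear factor B·A, confirming the ordering.
lhs = y - mu
rhs = B_mat @ A_mat @ (x - mu_model)
print(np.allclose(lhs, rhs))  # True
```

Applying the inverses in the opposite order (parent first on the mean) breaks this identity, which is exactly the bug being discussed.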
Regards,
Heiga
--
Heiga ZEN (in Japanese)
Byung Ha CHUN (in Korean)
<heigazen@xxxxxxxxxx>
- References
- [hts-users:03071] the order of CMLLR transforms being applied to models in the course of HTS engine generation, Hui LIANG
- [hts-users:03072] Re: the order of CMLLR transforms being applied to models in the course of HTS engine generation, Heiga ZEN (Byung Ha CHUN)
- [hts-users:03073] Re: the order of CMLLR transforms being applied to models in the course of HTS engine generation, Hui LIANG