- a promise of: a potential advantage or possibility that a technique or methodology can offer
- A promise of generative models, a major branch of machine learning, is to overcome these limitations by: (1) learning realistic world models, potentially allowing agents to plan in a world model before actual interaction with the world, and (2) learning meaningful features of the input while requiring little or no human supervision or labeling. (paper: Glow)
- amortize: to pay off gradually (in installments); amortized (in ML): spreading the cost of inference across data points by learning one shared inference network
- We demonstrate that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference. (paper: normalizing flows)
- apt: suitable, fitting; apt to ~: inclined to ~
- This forms what has been aptly named doubly stochastic estimation (Titsias & Lazaro-Gredilla, 2014), since we have one source of stochasticity from the mini-batch and a second from the Monte Carlo approximation of the expectation. (paper: normalizing flows)
- asymptotic: of an asymptote; approaching a limit
- This is a widely raised objection to variational methods, in that unlike other inferential methods such as MCMC, even in the asymptotic regime we are unable to recover the true posterior distribution. (paper: normalizing flows)
- be concerned with: to deal with, to be interested in
- Generative modeling is generally concerned with the extremely challenging task of modeling all dependencies within very high-dimensional input data, usually specified in the form of a full joint probability distribution. (paper: Glow)
- coin: to invent (a new word or phrase); to mint (coins)
- In this paper we propose a new generative flow coined Glow, with various new elements as described in Section 3. (paper: Glow)
- contraction: shrinking, reduction
- We can understand the effect of invertible flows as a sequence of expansions or contractions on the initial density. (paper: normalizing flows)
- conversely: in the opposite way, inversely
- Conversely, for a contraction, the map pushes points towards the interior of a region, increasing the density in its interior while reducing the density outside. (paper: normalizing flows)
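For reference, the change-of-variables rule behind this expansion/contraction picture (a note of mine, following the normalizing flows paper): an invertible map f transforms a density q as

```latex
q'(z') = q(z) \left| \det \frac{\partial f}{\partial z} \right|^{-1}, \qquad z' = f(z)
```

Where the map locally expands volume (|det| > 1), the density decreases; where it contracts (|det| < 1), the density increases.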
- detrimental: harmful
- There is also a large body of evidence that describes the detrimental effect of limited posterior approximations. (paper: normalizing flows)
- exposition: an explanation; an exhibition
- Turner & Sahani (2011) provide an exposition of two commonly experienced problems. (paper: normalizing flows)
- followed by ~: with ~ coming after (~ is the step that comes after the preceding one)
- We propose a generative flow where each step (left) consists of an actnorm step, followed by an invertible 1 x 1 convolution, followed by an affine transformation. (paper: Glow)
- hamper: to hinder, to obstruct; a (lidded) basket
- Despite these successes and ongoing advances, there are a number of disadvantages of variational methods that limit their power and hamper their wider adoption as a default method for statistical inference. (paper: normalizing flows)
- i.i.d.: independent and identically distributed — each random variable is (1) independent of the others and (2) drawn from the same probability distribution. Examples: die rolls, coin flips (but a die roll and a coin flip together, while mutually independent, do not follow the same distribution, so they are not i.i.d.)
- We collect an i.i.d. dataset D, and choose a model p_θ(x) with parameter θ. (paper: Glow)
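A note on why the i.i.d. assumption matters for the objective (my own sketch, not a quote from the paper): it makes the joint likelihood of D factorize, so the log-likelihood becomes a sum over examples and can be estimated with mini-batches:

```latex
p_\theta(\mathcal{D}) = \prod_{i=1}^{N} p_\theta\!\left(x^{(i)}\right)
\quad\Longrightarrow\quad
\log p_\theta(\mathcal{D}) = \sum_{i=1}^{N} \log p_\theta\!\left(x^{(i)}\right)
```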
- in particular: especially, specifically
- (reference paper) obtain a volume-preserving invertible transformation by exploiting the use of such transition operators in the MCMC literature, in particular the methods of Langevin and Hybrid Monte Carlo. (paper: normalizing flows)
- infinitesimal: extremely small, minute
- We use this view of normalizing flows to develop categories of finite and infinitesimal flows and provide a unified view of approaches for constructing rich posterior approximations. (paper: normalizing flows)
- intimidate: to frighten, to scare
- While it may look intimidating, its value can be surprisingly simple to compute for certain choices of transformations, as previously explored in (reference paper). (paper: Glow)
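Why the value can be "surprisingly simple": the trick used by Glow-style flows (as I understand it) is to choose each transformation so that its Jacobian is a triangular matrix; then, with h_0 = x and h_K = z,

```latex
\log p_\theta(x) = \log p_\theta(z) + \sum_{i=1}^{K} \log \left| \det \frac{\partial h_i}{\partial h_{i-1}} \right|,
\qquad
\log \left| \det \frac{\partial h_i}{\partial h_{i-1}} \right|
= \operatorname{sum}\!\left( \log \left| \operatorname{diag}\!\left( \frac{\partial h_i}{\partial h_{i-1}} \right) \right| \right)
```

i.e., each log-determinant collapses to a sum of the log-absolute values of the Jacobian's diagonal entries.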
- isotropic: having the same properties in all directions
- The prior distribution P_Z is the isotropic multivariate Gaussian distribution and all the statistics of the prior distribution, µ and σ, are obtained by the text encoder f_enc. (paper: Glow-TTS)
- A 3D isotropic Gaussian is a perfect sphere, while a 3D anisotropic Gaussian is sphere-like but with different radii along different directions (an ellipsoid).
- As an analogy from photography: an anisotropic surface is glossy and looks different depending on the viewing direction, whereas an isotropic surface looks the same from every direction.
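In symbols (my own note): isotropic means the covariance is a scalar multiple of the identity, anisotropic means it is not:

```latex
\text{isotropic: } \mathcal{N}\!\left(\mu,\; \sigma^2 I\right)
\qquad
\text{anisotropic: } \mathcal{N}\!\left(\mu,\; \Sigma\right),\ \Sigma \neq \sigma^2 I
```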
- leap: to jump; a jump, a surge
- The discipline of generative modeling has experienced enormous leaps in capabilities in recent years, mostly with likelihood-based methods and generative adversarial networks (GANs). (paper: Glow)
- leapfrog: the game of vaulting over another's back; to jump over (to a higher level)
- A disadvantage of using the Langevin or Hamiltonian flow is that they require one or more evaluations of the likelihood and its gradients (depending on the number of leapfrog steps) per iteration during both training and test time. (paper: normalizing flows)
- lemma: an auxiliary proposition (in mathematics); the base form of a word (in linguistics)
- matrix determinant lemma: a closed-form expression for the determinant of a rank-one update of an invertible matrix (see the sketch below)
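Statement of the lemma (a standard result): for an invertible matrix A and column vectors u, v,

```latex
\det\!\left( A + u v^{\top} \right) = \left( 1 + v^{\top} A^{-1} u \right) \det A
```

In the normalizing flows paper this is what lets the planar flow's Jacobian determinant be computed in O(D) rather than the generic O(D^3).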
- likewise: in the same way, similarly
- We focus on functions where f (and, likewise, g) is composed of a sequence of transformations: (paper: Glow)
- manipulation: control, handling/processing
- Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images. (paper: Glow)
- objection: a protest, opposition
- This is a widely raised objection to variational methods, in that unlike other inferential methods such as MCMC, even in the asymptotic regime we are unable to recover the true posterior distribution. (paper: normalizing flows)
- oft-quoted: frequently cited
- We show that normalizing flows admit infinitesimal flows that allow us to specify a class of posterior approximations that in the asymptotic regime is able to recover the true posterior distribution, overcoming one oft-quoted limitation of variational inference. (paper: normalizing flows)
- order-of-magnitude: a one-digit increase in a value; usually on a log scale, meaning a factor of 10
- Glow-TTS obtains an order-of-magnitude speed-up over the autoregressive model, Tacotron 2, at synthesis with comparable speech quality. (paper: Glow-TTS)
- outset: the start, the beginning (at the outset: at the beginning)
- At the outset, we distinguish between two types of flow mechanisms that differ in how the Jacobian is handled. (paper: normalizing flows)
- plain: clear, frank, ordinary; a plain (flat land)
- Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images. (paper: Glow)
- preferable: better, more desirable, preferred (preferably: ideally, if possible)
- Thus, an ideal family of variational distributions qφ(z|x) is one that is highly flexible, preferably flexible enough to contain the true posterior as one solution. (paper: normalizing flows)
- We're looking for a new house, preferably one near the school. (Naver English Dictionary)
- radial: radiating from a central point
- It applies radial contractions and expansions around the reference point and is thus referred to as a radial flow. (paper: normalizing flows)
- regime: a government; a system, a regime (here: a limiting setting, as in "asymptotic regime")
- This is a widely raised objection to variational methods, in that unlike other inferential methods such as MCMC, even in the asymptotic regime we are unable to recover the true posterior distribution. (paper: normalizing flows)
- regulate: to control, to adjust
- By altering the latent representation of speech, we can synthesize speech with various intonation patterns and regulate the pitch of speech. (paper: Glow-TTS)
- renew: to resume, to update/extend, to reemphasize
- There has been a great deal of renewed interest in variational inference as a means of scaling probabilistic modeling to increasingly complex problems on increasingly larger data sets. (paper: normalizing flows)
- resort (to): to rely on, to turn to for help
- Whereas we would have previously resorted to local variational methods (Bishop, 2006), in general we now always compute such expectations using Monte Carlo approximations (including the KL term in the bound, if it is not analytically known). (paper: normalizing flows)
- shorthand: stenography; an abbreviation
- ~ is: ~ where equation (6) will be used throughout the paper as a shorthand for the composition ~. (paper: normalizing flows)
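For context, the composition the tildes elide (equation (6) of the normalizing flows paper, as I recall it) is a chain of K invertible maps applied to an initial sample, with the log-density tracked through each step:

```latex
z_K = f_K \circ \cdots \circ f_2 \circ f_1 (z_0),
\qquad
\ln q_K(z_K) = \ln q_0(z_0) - \sum_{k=1}^{K} \ln \left| \det \frac{\partial f_k}{\partial z_{k-1}} \right|
```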
- specification: a (detailed) description; specs
- We propose the specification of approximate posterior distributions using normalizing flows, a tool for constructing complex distributions by transforming a probability density through a series of invertible mappings. (paper: normalizing flows)
- stand for: to symbolize, to mean/represent
- A stands for the mapping from the index of the latent representation of speech to that of statistics from f_enc: A(j) = i if z_j ~ N(z_j; µ_i, σ_i). (paper: Glow-TTS)
i: corresponds to the text length / j: corresponds to the mel length
- standalone: self-contained, independent
- Our Glow-TTS is a standalone parallel TTS model that internally learns to align text and speech by leveraging the properties of flows and dynamic programming. (paper: Glow-TTS)
- stationary: not moving, unchanging
- Importantly, in this case the stationary solution for q_t(z) is given by the Boltzmann distribution: (paper: normalizing flows)
- statistician: an expert in statistics
- A property of such transformations, often referred to as the law of the unconscious statistician (LOTUS), is that expectations w.r.t. the transformed density q_K can be computed without explicitly knowing q_K. (paper: normalizing flows)
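Written out (my paraphrase of the paper's identity): expectations under the transformed density q_K pull back to expectations under the initial density q_0, so q_K never has to be evaluated explicitly:

```latex
\mathbb{E}_{q_K}\!\left[ h(z) \right] = \mathbb{E}_{q_0}\!\left[ h\!\left( f_K \circ \cdots \circ f_1 (z_0) \right) \right]
```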
- straightforward: simple, uncomplicated; direct, honest
- While it is straightforward to build invertible parametric functions for use in equation (5), e.g., invertible neural networks (reference paper), such approaches typically have a complexity for computing the Jacobian determinant that scales as O(LD^3), where D is the dimension of the hidden layers and L is the number of hidden layers used. (paper: normalizing flows)
- strikingly: remarkably, noticeably
- Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images. (paper: Glow)
perhaps most strikingly: perhaps the most striking point is that ~
- surjective: a function is surjective if every possible output value has at least one input mapped to it → used here to mean that every character of the text input is guaranteed to be covered by some speech frame
- We assume the alignment function A to be monotonic and surjective to ensure that Glow-TTS does not skip or repeat the text input. (paper: Glow-TTS)
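A hedged formalization of these two assumptions (my own reading; T_text is a hypothetical name for the text length):

```latex
\text{monotonic: } A(j+1) \in \{\, A(j),\; A(j)+1 \,\}
\qquad
\text{surjective: } \forall i \in \{1,\dots,T_{\text{text}}\},\ \exists j:\ A(j) = i
```

Surjectivity guarantees that no text token is skipped, and monotonicity guarantees the alignment never moves backwards, so no earlier token is revisited (repeated).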
- tractability: docility; ease of handling — here, the property of being computable exactly and efficiently
- Flow-based generative models (reference paper) are conceptually attractive due to tractability of the exact log-likelihood, tractability of exact latent-variable inference, and parallelizability of both training and synthesis. (paper: Glow)
- troublesome: problematic, chronic
- Those have the advantage of simplicity, but have as disadvantage that synthesis has limited parallelizability, since the computational length of synthesis is proportional to the dimensionality of the data; this is especially troublesome for large images or video. (paper: Glow)
- ultimate: final; greatest
- In this paper we work towards this ultimate vision, in addition to intermediate applications, by aiming to improve upon the state-of-the-art of generative models. (paper: Glow)
- unconscious: having lost consciousness; not done deliberately
- A property of such transformations, often referred to as the law of the unconscious statistician (LOTUS), is that expectations w.r.t. the transformed density q_K can be computed without explicitly knowing q_K. (paper: normalizing flows)