Latent Context Sharing: Efficient Optimization by Reducing Context Dimensionality (IMAGE)
Caption
Existing derivative-free optimization techniques are computationally expensive because they optimize a long, concatenated list of latent contexts derived from the prompts. This study instead uses a more efficient context parametrization called Latent Context Sharing (LCS). LCS assumes that each latent context consists of components unique to that context and components common to all contexts, and optimizes the two independently. This strategy is motivated by the semantic similarity between contexts, which suggests that such common components exist. Optimizing the smaller shared and unique components reduces the dimensionality of the problem, making it more tractable.
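A minimal sketch of the shared-plus-unique parametrization described in the caption. The concatenation-based composition and all sizes (num_contexts, context_dim, unique_dim) are illustrative assumptions, not values or details taken from the study.

```python
import numpy as np

# Illustrative sketch of Latent Context Sharing (LCS):
# each latent context = [shared component | its own unique component].
# Dimensions below are assumptions for demonstration only.
num_contexts = 16      # number of latent contexts derived from the prompts (assumed)
context_dim = 512      # dimensionality of each latent context (assumed)
unique_dim = 64        # assumed size of the per-context unique component
shared_dim = context_dim - unique_dim  # shared component common to all contexts

rng = np.random.default_rng(0)
shared = rng.normal(size=shared_dim)                   # optimized once, shared by all contexts
uniques = rng.normal(size=(num_contexts, unique_dim))  # optimized per context

# Compose each latent context from the shared part and its unique part.
contexts = np.stack([np.concatenate([shared, u]) for u in uniques])
assert contexts.shape == (num_contexts, context_dim)

# The derivative-free optimizer now searches a much smaller space.
naive_params = num_contexts * context_dim              # 8192 in this example
lcs_params = shared_dim + num_contexts * unique_dim    # 1472 in this example
print(f"naive: {naive_params} parameters, LCS: {lcs_params} parameters")
```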
Credit
Go Irie from Tokyo University of Science
Usage Restrictions
Credit must be given to the creator.
License
Original content