
Resolving the Mixing Time of the Langevin Algorithm to its Stationary Distribution for Log-Concave Sampling



Sampling from a high-dimensional distribution is a fundamental task in statistics, engineering, and the sciences. A canonical approach is the Langevin Algorithm, i.e., the Markov chain for the discretized Langevin Diffusion. This is the sampling analog of Gradient Descent. Despite being studied for several decades in multiple communities, tight mixing bounds for this algorithm remain unresolved even in the seemingly simple setting of log-concave distributions over a bounded domain. This paper completely characterizes the mixing time of the Langevin Algorithm to its stationary distribution in…
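As a rough sketch of the chain described above (an illustration under assumed settings, not the paper's exact parameters or its bounds), the projected Langevin Algorithm takes a Gradient Descent step on the negative log-density, adds Gaussian noise scaled by the step size, and projects back onto the bounded domain:

```python
import numpy as np

def langevin_algorithm(grad_f, project, x0, step_size, n_steps, rng=None):
    """Projected Langevin Algorithm: approximate sampling from a density
    proportional to exp(-f(x)) over a bounded convex domain.

    Each iteration is a Gradient Descent step plus Gaussian noise (the
    sampling analog of Gradient Descent), followed by a projection that
    keeps the iterate inside the domain.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_f(x) + np.sqrt(2.0 * step_size) * noise
        x = project(x)  # stay inside the bounded domain
    return x

# Illustrative example (not from the paper): a standard Gaussian, which is
# log-concave, restricted to the unit ball.
grad_f = lambda x: x                                   # f(x) = ||x||^2 / 2
project = lambda x: x / max(1.0, np.linalg.norm(x))    # Euclidean projection
sample = langevin_algorithm(grad_f, project, x0=np.zeros(5),
                            step_size=1e-2, n_steps=5_000)
```

How small the step size must be, and how many iterations are needed before the law of the iterate is close to the chain's stationary distribution, is precisely the mixing-time question the paper resolves.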




Interpreting CLIP: Insights on the Robustness to ImageNet Distribution Shifts

What distinguishes robust models from non-robust ones? While it has been shown that, for ImageNet distribution shifts, such differences in robustness can be traced back predominantly to differences in training data, it is so far not known what...

Controlling Language and Diffusion Models by Transporting Activations

The increasing capabilities of large generative models and their ever more widespread deployment have raised concerns about their reliability, safety, and potential misuse. To address these issues, recent works have proposed to control model generation by steering model...
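The excerpt mentions controlling generation by steering model activations. As a generic illustration only (a simple additive steering hook, assumed for exposition; it is not the activation-transport method the title refers to), one could shift a chosen module's hidden states along a fixed direction in PyTorch:

```python
import torch

def add_steering_hook(module, direction, strength=1.0):
    """Shift a module's output activations along a fixed unit direction.

    Generic additive activation steering, shown only to illustrate the idea
    of intervening on activations at inference time.
    """
    direction = direction / direction.norm()

    def hook(mod, inputs, output):
        hidden = output[0] if isinstance(output, tuple) else output
        steered = hidden + strength * direction.to(hidden.device, hidden.dtype)
        return (steered,) + output[1:] if isinstance(output, tuple) else steered

    return module.register_forward_hook(hook)

# Hypothetical usage: steer one transformer block of a loaded model.
# handle = add_steering_hook(model.transformer.h[10], direction_vector, strength=4.0)
# ... run generation, then remove the hook ...
# handle.remove()
```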

KG-TRICK: Unifying Textual and Relational Information Completion of Knowledge for Multilingual Knowledge Graphs

Multilingual knowledge graphs (KGs) provide high-quality relational and textual information for various NLP applications, but they are often incomplete, especially in non-English languages. Previous research has shown that combining information from KGs in different languages aids either Knowledge...