Thursday, January 23, 2025

Artificial Intelligence news


Momentum Approximation in Asynchronous Private Federated Learning



This paper was accepted for presentation at the International Workshop on Federated Foundation Models (FL@FM-NeurIPS’24), held in conjunction with NeurIPS 2024.
Asynchronous protocols have been shown to improve the scalability of federated learning (FL) with a massive number of clients. Meanwhile, momentum-based methods achieve the best model quality in synchronous FL. However, naively applying momentum in asynchronous FL algorithms leads to slower convergence and degraded model performance. It is still unclear how to effectively combine these two techniques to achieve a win-win…
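The tension the abstract describes can be illustrated with a minimal sketch of server-side momentum in FL. This is a generic FedAvg-with-server-momentum update, not the paper's momentum-approximation algorithm; the function name and hyperparameters are illustrative:

```python
import numpy as np

def server_momentum_step(weights, velocity, client_updates, lr=1.0, beta=0.9):
    """One synchronous server step: average client deltas, apply momentum.

    In asynchronous FL, client updates instead arrive one at a time and are
    computed against stale copies of the weights, so the velocity accumulates
    stale gradients -- the mismatch that degrades naive async momentum.
    """
    avg_delta = np.mean(client_updates, axis=0)   # aggregate the client deltas
    velocity = beta * velocity + avg_delta        # momentum accumulation
    return weights + lr * velocity, velocity

# toy usage: two clients each push a 3-parameter model in the same direction
w, v = np.zeros(3), np.zeros(3)
updates = [np.ones(3) * 0.1, np.ones(3) * 0.3]
w, v = server_momentum_step(w, v, updates)
```

In the synchronous setting every update in `client_updates` was computed from the current `weights`, which is the assumption that breaks under asynchrony.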




Interpreting CLIP: Insights on the Robustness to ImageNet Distribution Shifts

What distinguishes robust models from non-robust ones? While for ImageNet distribution shifts it has been shown that such differences in robustness can be traced back predominantly to differences in training data, so far it is not known what...

Controlling Language and Diffusion Models by Transporting Activations

The increasing capabilities of large generative models and their ever more widespread deployment have raised concerns about their reliability, safety, and potential misuse. To address these issues, recent works have proposed to control model generation by steering model...
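Activation steering, the general family of techniques this abstract refers to, can be sketched as adding a concept direction to a layer's hidden activations at inference time. This is a generic illustration under assumed shapes, not the paper's transport-based method:

```python
import numpy as np

def steer(hidden, direction, alpha=2.0):
    """Shift a hidden-state vector along a concept direction.

    `direction` is typically estimated as the difference of mean activations
    between examples with and without the target concept (hypothetical setup);
    `alpha` controls the steering strength.
    """
    unit = direction / np.linalg.norm(direction)
    return hidden + alpha * unit

# toy usage: nudge a 4-d activation along a made-up concept axis
h = np.array([0.5, -1.0, 0.2, 0.0])
concept = np.array([1.0, 0.0, 0.0, 0.0])
steered = steer(h, concept, alpha=2.0)
```

In practice the shift is applied inside a forward hook on a chosen transformer layer; the sketch above only shows the vector arithmetic.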

KG-TRICK: Unifying Textual and Relational Information Completion of Knowledge for Multilingual Knowledge Graphs

Multilingual knowledge graphs (KGs) provide high-quality relational and textual information for various NLP applications, but they are often incomplete, especially in non-English languages. Previous research has shown that combining information from KGs in different languages aids either Knowledge...