Federated Unlearning: Safely Withdrawing Corporate Data from Training Sets
AI Security, Privacy & Model/Prompt Risk Management
In short
- Federated unlearning, a newly developed method, lets companies securely withdraw their data from AI training without restarting the entire training process.
- This is particularly relevant for collaborations where data is shared among partners.
- The ability to efficiently remove data could not only enhance data protection but also strengthen trust between partners.
Federated unlearning lets a company withdraw the influence of its data from a jointly trained AI model without restarting the entire training process. This matters most in collaborations where partners contribute data or model updates to a shared model. Beyond strengthening data protection, the ability to remove data efficiently can build trust between partners. Implementing such methods brings its own challenges, however: the technical execution, including verifying that the data's influence is actually gone, and demonstrable compliance with data-protection regulations. The long-term effects on AI development, with their associated risks and opportunities, still need thorough assessment.
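To make the idea concrete, here is a minimal sketch, not the method from the article: if a federated server retains each client's submitted model update, one client's contribution can be removed by re-aggregating over the remaining clients, with no retraining of anyone else. The class name, the plain averaging, and the toy vectors are all illustrative assumptions; real federated-unlearning schemes handle multi-round training and verify influence removal far more rigorously.

```python
class FedAvgServer:
    """Toy aggregator (illustrative only): stores per-client updates so a
    client's contribution can later be dropped without full retraining."""

    def __init__(self, dim):
        self.dim = dim
        self.updates = {}  # client_id -> model update (list of floats)

    def submit(self, client_id, update):
        assert len(update) == self.dim
        self.updates[client_id] = update

    def aggregate(self):
        # Plain federated averaging over all currently enrolled clients.
        n = len(self.updates)
        return [sum(u[i] for u in self.updates.values()) / n
                for i in range(self.dim)]

    def unlearn(self, client_id):
        # Withdraw a client: drop its stored update and re-aggregate
        # from the remaining clients -- no retraining required.
        del self.updates[client_id]
        return self.aggregate()


server = FedAvgServer(dim=2)
server.submit("A", [1.0, 2.0])
server.submit("B", [3.0, 10.0])
server.submit("C", [5.0, 6.0])
full = server.aggregate()    # average over A, B, C -> [3.0, 6.0]
after = server.unlearn("B")  # average over A, C only -> [3.0, 4.0]
```

In this single-round toy setting the removal is exact; the hard part that research like the article's addresses is doing this for models trained over many rounds, where a client's influence is entangled with later updates.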
Source:
- Föderiertes Unlearning: Unternehmensdaten sicher aus Trainingsdaten zurückziehen — Golem.de - Wissenschaft (DE)