
Embedding Aggregation-based Heterogeneous Models Training in Vertical Federated Learning

View a PDF of the paper titled Embedding Aggregation-based Heterogeneous Models Training in Vertical Federated Learning, by Shuo Wang, Keke Gai, Jing Yu, Liehuang Zhu, Kim-Kwang Raymond Choo and Bin Xiao

View PDF HTML (experimental)

Abstract: Vertical federated learning (VFL) has attracted significant attention because it allows clients to jointly train machine learning models without sharing their local data, thereby protecting each client's private data. However, existing VFL methods face challenges when participants hold heterogeneous local models, which affects both optimization convergence and generalization. To address this challenge, this paper proposes a novel approach called Vertical Federated learning for training Multiple Heterogeneous models (VFedMH). VFedMH focuses on aggregating each participant's local embedding of knowledge during forward propagation. To protect the participants' local embedding values, we propose an embedding protection method based on lightweight blinding factors. Specifically, each participant obtains a local embedding with its local heterogeneous model. The passive party, which owns only the sample features, injects a blinding factor into its local embedding and sends it to the active party. The active party aggregates the local embeddings into a global knowledge embedding and sends it back to the passive parties. The passive parties then use the global embedding for forward propagation on their local heterogeneous networks. However, because a passive party does not own the sample labels, it cannot compute its local model gradients on its own. To overcome this limitation, the active party assists the passive parties in computing their local heterogeneous model gradients. Each participant then trains its local model with these heterogeneous model gradients, with the goal of minimizing the loss of its own local heterogeneous model. Extensive experiments demonstrate that VFedMH can simultaneously train multiple heterogeneous models with heterogeneous optimization and outperform several recent methods in model performance.
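The blinded embedding aggregation described above can be sketched as follows. This is a minimal illustration, not the paper's actual protocol: it assumes a standard pairwise-canceling additive-mask scheme as the lightweight blinding factor, so that each passive party's embedding is hidden from the active party while the aggregate still equals the true sum of embeddings. All names (`blind`, `seeds`, the party count, and the embedding dimension) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 passive parties, each holding a 4-dim local embedding
# produced by its own heterogeneous local model.
n, dim = 3, 4
embeddings = [rng.normal(size=dim) for _ in range(n)]

# Pairwise shared random seeds: party i adds seeds[(i, j)] for j > i and
# subtracts seeds[(j, i)] for j < i, so every mask cancels in the aggregate.
seeds = {(i, j): rng.normal(size=dim) for i in range(n) for j in range(i + 1, n)}

def blind(i: int, emb: np.ndarray) -> np.ndarray:
    """Inject the blinding factor into party i's local embedding."""
    mask = sum((seeds[(i, j)] for j in range(i + 1, n)), np.zeros(dim))
    mask -= sum((seeds[(j, i)] for j in range(i)), np.zeros(dim))
    return emb + mask

# Each passive party blinds its embedding before sending it to the active party.
blinded = [blind(i, e) for i, e in enumerate(embeddings)]

# The active party aggregates the blinded embeddings; the masks cancel,
# yielding the global knowledge embedding without revealing any single one.
global_embedding = sum(blinded)
true_sum = sum(embeddings)
print(np.allclose(global_embedding, true_sum))  # True
```

The global embedding is then broadcast back to the passive parties for forward propagation; gradient computation would additionally require the active party (which holds the labels) to send back per-party gradient information, which this sketch omits.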

Submission history

From: Shuo Wang [view email]
[v1]

Fri, 20 Oct 2023 09:22:51 UTC (2,049 KB)
[v2]

Thu, 8 Feb 2024 08:24:53 UTC (23,728 KB)
[v3]

Tue, 15 Jul 2025 10:01:29 UTC (14,443 KB)
