Over the past decade, research on emotion recognition from physiological signals has intensified, owing to the rich emotional information these signals carry. Existing works mainly explore the relationship between stimuli and subjects while ignoring latent correlations among different subjects, which are important for personalized emotion recognition. To address this issue, we perform emotion recognition from multimodal physiological signals using an edge-weighted hypergraph neural network, in which the complex relationships among subjects are formulated as a hypergraph for each modality. In this way, our model better represents how strongly individual samples influence classification. The major contribution of the network lies in its recognition that association strengths between samples differ and therefore affect the training result to different degrees: a hyperedge connecting strongly correlated vertices is assigned a larger weight, while a hyperedge connecting weakly correlated vertices receives a smaller one. To evaluate the proposed method, experiments were conducted on the DEAP and ASCERTAIN datasets. Experimental results and comparisons with state-of-the-art methods show that the proposed method achieves better performance.
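To make the weighting idea concrete, the following is a minimal, illustrative sketch (not the paper's exact construction) of building a k-NN hypergraph over samples and assigning each hyperedge a weight that grows with the average similarity of its member vertices; the function name, the Gaussian-kernel similarity, and the parameters `k` and `sigma` are assumptions for illustration.

```python
import numpy as np

def build_weighted_hypergraph(X, k=3, sigma=1.0):
    """Build a k-NN hypergraph incidence matrix H and hyperedge weights w.

    Each sample spawns one hyperedge containing itself and its k nearest
    neighbours; the hyperedge weight is the mean Gaussian-kernel similarity
    between the spawning sample and the hyperedge's members, so tightly
    correlated samples yield larger weights. Illustrative sketch only.
    """
    n = X.shape[0]
    # pairwise squared Euclidean distances between samples
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    H = np.zeros((n, n))  # rows: vertices, columns: hyperedges
    w = np.zeros(n)       # one weight per hyperedge
    for e in range(n):
        # hyperedge e = sample e plus its k nearest neighbours
        members = np.argsort(d2[e])[: k + 1]
        H[members, e] = 1.0
        # closer members -> larger similarity -> larger hyperedge weight
        w[e] = np.exp(-d2[e, members] / (2 * sigma ** 2)).mean()
    return H, w
```

Under this scheme, hyperedges formed inside a tight cluster of samples receive larger weights than hyperedges spanning loosely related samples, which is the behaviour the abstract describes.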