Publications

You can also find my articles on my Google Scholar profile.

Working Papers

  • ProCyon: A multimodal foundation model for protein phenotypes
    George Dasoulas, et al.

Conference Proceedings

  • [ICML 2025 GenBio] TEDDY: A Family Of Foundation Models For Understanding Single Cell Biology
    Alexis Chevalier, Soumya Ghosh, Urvi Awasthi, James Watkins, Julia Bieniewska, Nichita Mitrea, Olga Kotova, Kirill Shkura, Andrew Noble, Michael J. Steinbaugh, Vijay Sadashivaiah, George Dasoulas, Julien Delile, Christoph Meier, Leonid Zhukov, Iya Khalil, Srayanta Mukherjee, Judith Mueller

  • [ICLR 2025] E(n) Topological Neural Networks
    Claudio Battiloro, Ege Karaismailoglu, Mauricio Tec, George Dasoulas, Michelle Audirac, Francesca Dominici

  • [LLM4Code 2024] Learn to Code Sustainably: An Empirical Study on LLM-based Green Code Generation
    Tina Vartziotis, Ippolyti Dellatolas, George Dasoulas, Maximilian Schmidt, Florian Schneider, Tim Hoffmann, Sotirios Kotsopoulos, Michael Keckeisen

  • [ICLR 2023] GNNDelete: A General Unlearning Strategy for Graph Neural Networks
    Jiali Cheng, George Dasoulas, Huan He, Chirag Agarwal, Marinka Zitnik

  • [AAAI 2023] Graph Ordering Attention Networks
    Michalis Chatzianastasis, Johannes Lutzeyer, George Dasoulas, Michalis Vazirgiannis

    Building on connections with the Partial Information Decomposition framework, we introduce a novel GNN layer, the Graph Ordering Attention (GOAT) layer, which orders each node's neighborhood according to the attention coefficients.

  • [ICML 2021] Lipschitz Normalization for Self-Attention Layers with Application to Graph Neural Networks
    George Dasoulas, Kevin Scaman, Aladin Virmaux

    We present a theoretical analysis of the Lipschitz continuity of self-attention and show that enforcing Lipschitz continuity through normalization can significantly improve the performance of deep attention models.

  • [ICLR 2021] Learning Parametrised Graph Shift Operators
    George Dasoulas, Johannes Lutzeyer, Michalis Vazirgiannis

    We propose a parametrised graph shift operator (PGSO) to encode graphs, providing a unified view of common GSOs, and improve GNN performance by learning the PGSO end-to-end during training.

  • [ICASSP 2021] Ego-based Entropy Measures for Structural Representations on Graphs
    George Dasoulas, Giannis Nikolentzos, Kevin Scaman, Aladin Virmaux, Michalis Vazirgiannis

    Moving beyond local interactions, nodes can share structural similarities based on their position in the graph. We investigate feature-augmentation methods for graph neural networks using structural entropy measures.

  • [IJCAI 2020] Coloring Graph Neural Networks for Node Disambiguation
    George Dasoulas, Ludovic Dos Santos, Kevin Scaman, Aladin Virmaux

    Based on topological criteria, specifically separability, we introduce a universal approximation scheme for continuous functions on graphs, built on the disambiguation of identical node attributes.

  • [ICPR 2020] Hcore-Init: Neural Network Initialization based on Graph Degeneracy
    Stratis Limnios, George Dasoulas, Dimitrios M. Thilikos, Michalis Vazirgiannis

    We propose a graph-based initialization of neural networks that extends graph-degeneracy observations, emphasizing neurons of structural importance in the network.

Journal Publications

  • [DMLR 2025] TopoBench: A Framework for Benchmarking Topological Deep Learning
    Lev Telyatnikov, Guillermo Bernardez, Marco Montagna, Mustafa Hajij, Martin Carrasco, Pavlo Vasylenko, Mathilde Papillon, Ghada Zamzmi, Michael T. Schaub, Jonas Verhellen, Pavel Snopov, Bertran Miquel-Oliver, Manel Gil-Sorribes, Alexis Molina, Victor Guallar, Theodore Long, Julian Suk, Patryk Rygiel, Alexander Nikitin, Giordan Escalona, Michael Banf, Dominik Filipiak, Max Schattauer, Liliya Imasheva, Alvaro Martinez, Halley Fritze, Marissa Masden, Valentina Sánchez, Manuel Lecha, Andrea Cavallo, Claudio Battiloro, Matt Piekenbrock, Mauricio Tec, George Dasoulas, Nina Miolane, Simone Scardapane, Theodore Papamarkou

  • [Experimental Mathematics] Learn2Extend: Extending sequences by retaining their statistical properties with mixture models
    George Dasoulas*, Dimitris Vartziotis*, Florian Pausinger

  • [Nature Machine Intelligence] Multimodal representation learning with graphs
    George Dasoulas*, Yasha Ektefaie*, Ayush Noori, Maha Farhat, Marinka Zitnik

  • [PAMI] Permute Me Softly: Learning Soft Permutations for Graph Representations
    Giannis Nikolentzos, George Dasoulas, Michalis Vazirgiannis

    We study how to approximate graph distances by aligning adjacency matrices across a corpus of graphs. To make the optimization differentiable, we use soft permutation matrices.

  • [Neural Networks Journal] Modularity-Aware Graph Autoencoders for Joint Community Detection and Link Prediction
    Guillaume Salha-Galvan, Johannes Lutzeyer, George Dasoulas, Romain Hennequin, Michalis Vazirgiannis

    Jointly solving link prediction and community detection is important in recommender systems. We show how to extend the information that graph autoencoders process toward this joint goal.

  • [Neural Networks Journal] k-hop Graph Neural Networks
    Giannis Nikolentzos, George Dasoulas, Michalis Vazirgiannis

    Standard GNNs use a 1-hop aggregation per layer, limiting their ability to capture graph properties. We iteratively extend the aggregation operator of graph neural networks to increase their receptive field.