
AI companies and VC firms meet up and pitch at Slush 2025
Heading to Slush 2025 and thinking of AI? From the startup stage to evening warmups, here's what's happening.
Cool cocktails. Hot AI ideas. It's the AI Alternatives Session at Slush 2025.
As Slush lights up Helsinki, Literal Labs invites AI founders, investors, and enthusiasts to step away from the conference floor for an evening of ideas, introductions, and challenging conversation. The AI Alternatives Session brings together those shaping the future of AI startups — exploring new approaches to explainable, energy-efficient intelligence, and the technologies transforming AI itself.
Hosted at Kupoli Cocktail Bar in Helsinki’s city centre, the evening offers a relaxed setting to meet members of Literal Labs’ C-suite and to discuss our upcoming product launch, current AI investment trends, and opportunities for collaboration. Expect cocktails that flow freely, conversations that run deep, and a genuine space for connection rather than pitches and slides.
Whether you’re attending to explore partnerships, scout new AI startups, or simply unwind among peers, you’re warmly invited to join us. Drinks are on us; ideas are on everyone.
Interested in attending? Be sure to RSVP for the AI Alternatives Session via Luma.
This year’s AI on the Startup Stage brings together four voices: Rebekka Mikkola, Co-founder and CEO of Trismik; Sauraj Gambhir, Co-founder of Prior Labs; Noel Hurley, CEO of Literal Labs; and Ruben Bryon, Co-founder and CIO of DataCrunch.
At 12:48 PM, Literal Labs’ Noel will take to the stage to explore what comes after GPUs — discussing how logic-based AI and efficient architectures are enabling intelligence to run on the silicon we already have. Expect a sharp, pragmatic view of AI’s next chapter: one where smarter algorithms, not larger data centres, drive progress.
If you're looking to meet with the Literal Labs team outside of the session, contact us to arrange a one-on-one meeting.
Literal Labs is a UK startup pioneering logic-based AI. Its Logic-Based Networks (LBNs) deliver up to 54× faster inference with around 52× less energy use than neural networks, running efficiently and sustainably on standard MCUs and CPUs without the need for GPUs.