Q.ANT Raises €62M Series A to Revolutionize AI & HPC with Photonic Processors

  • Writer: Menlo Times
  • Jul 18
  • 1 min read

Q.ANT, a pioneer in photonic processing led by Dr. Michael Förtsch (CEO), Andreas Abt (SVP, Native Computing), and Tim Stiegler (CFO), announced a €62 million Series A financing round to accelerate the commercialization of its energy-efficient photonic processors for artificial intelligence (AI) and high-performance computing (HPC). The round was co-led by Cherry Ventures, UVC Partners, and imec.xpand, with participation from additional deep tech investors including L-Bank, Verve Ventures, Grazia Equity, EXF Alpha of Venionaire Capital, LEA Partners, Onsight Ventures, and TRUMPF. The investment ranks among Europe’s most significant deep tech funding rounds, laying the foundation for a fundamental shift in how AI is computed.

Q.ANT has developed the world’s first commercial photonic processor for AI and HPC workloads, offering a transformative alternative to traditional CMOS chips. Using light instead of electricity, its Native Processing Server—based on Thin-Film Lithium Niobate (TFLN)—delivers up to 30× energy efficiency, 50× performance gains, and 100× increased data center capacity, all without active cooling. As data center energy use approaches unsustainable levels, Q.ANT provides a scalable, sustainable, and high-performance solution ready for real-world deployment.


Q.ANT will use the new funding to scale production, develop next-generation photonic processors, grow its team, and expand into the U.S. to meet rising customer demand. Strengthened by semiconductor veterans Hermann Hauser (ARM) and Hermann Eul (Intel, Infineon), the company aims to make photonic processing a core part of global AI infrastructure by 2030. Its Photonic Native Processing Server (NPS), now available for early access, offers high-performance, energy-efficient computing in a plug-and-play format compatible with today’s AI software, eliminating heat, reducing power use, and enabling unprecedented scalability.
