
Cognee is Building Memory for AI Agents

  • Writer: Karan Bhatia
  • 14 hours ago
  • 2 min read

Cognee, which builds memory for AI agents that adapt and learn from your feedback, led by Vasilije Markovic and the team, has raised a $7.5 million seed round led by Pebblebed, with participation from 42CAP and Vermilion Ventures, plus angel investors from Google DeepMind, n8n, and Snowplow.


Cognee was founded in 2024 in Berlin to address a persistent problem: AI agents forget context between sessions, despite patched-together RAG pipelines and vector stores. Drawing on cognitive science and research from the University of California, Berkeley, and Brown University, the company built a memory engine that creates durable knowledge from raw data and continuously updates it over time.


In 2025, Cognee grew from an open-source experiment into production infrastructure, scaling pipeline runs from about 2,000 to over one million; it is now used by more than 70 companies.


By 2026, Bayer applied the system to scientific research workflows, the University of Wyoming built an evidence graph with page-level provenance, and integrations were added by Dilbloom and dltHub. The open-source project has over 12,000 GitHub stars, 80+ contributors, and graduated from the GitHub Secure Open Source Program.


Building agents in production revealed three core insights for Cognee:

Stateless agents hallucinate and forget, forcing conversations to restart from scratch.


Effective memory requires structure beyond retrieval, including temporal awareness, entity relationships, feedback loops, and self-tuning.


The agentic era requires a new primitive where agents store, recall, and reason over experience, treating documents as inputs and memory as the operational foundation.


Backers associated with OpenAI and Facebook AI Research support the view that AI memory represents a standalone category rather than a feature.


Cognee enables agents to learn continuously from interactions by turning scattered data into a self-improving memory graph rather than requiring manually wired relationships.


Its ECL pipeline (Extract, Cognify, Load) ingests data from 38+ sources, structures it into a knowledge graph with embeddings and relationships, and makes it searchable. A feedback-driven “memify” layer refines the graph as rated responses adjust edge weights, improving accuracy over time. The platform unifies relational, vector, and graph storage and integrates with tools such as OpenAI Agents SDK, Anthropic Claude Agent SDK, Google ADK, Amazon Neptune, and Neo4j.
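The feedback-weighted graph idea above can be illustrated with a toy sketch. This is not Cognee's actual API; the class and method names below are hypothetical stand-ins for the concepts described: facts stored as weighted edges between entities, rated responses nudging edge weights, and recall ranked by weight.

```python
from collections import defaultdict


class MemoryGraph:
    """Toy memory graph: entities as nodes, weighted relationship edges.

    Hypothetical illustration of a feedback-refined graph; not Cognee's API.
    """

    def __init__(self):
        # Edge weights keyed by (source, relation, target); new edges start at 1.0.
        self.edges = defaultdict(lambda: 1.0)

    def add_fact(self, source, relation, target):
        """Store a structured relationship (the 'structure' step)."""
        self.edges[(source, relation, target)]  # touch key to create at default weight

    def feedback(self, source, relation, target, helpful, step=0.2):
        """Rated responses adjust edge weights up or down (the 'refine' step)."""
        key = (source, relation, target)
        self.edges[key] = max(0.0, self.edges[key] + (step if helpful else -step))

    def recall(self, source, top_k=3):
        """Return the highest-weighted facts about an entity."""
        facts = [(k, w) for k, w in self.edges.items() if k[0] == source]
        return sorted(facts, key=lambda kv: -kv[1])[:top_k]


g = MemoryGraph()
g.add_fact("Cognee", "founded_in", "Berlin")
g.add_fact("Cognee", "raised", "$7.5M seed")
g.feedback("Cognee", "raised", "$7.5M seed", helpful=True)
print(g.recall("Cognee"))  # the upvoted fact now ranks first
```

The point of the sketch is the loop: feedback changes the graph itself, so subsequent recalls rank differently, which is what distinguishes this design from a static retrieval index.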


With the new funding, Cognee is focusing on four priorities:


-> A scalable cloud platform to make structured AI memory accessible without infrastructure overhead.


-> A Rust-based engine for edge and on-device agents where latency and privacy are critical.


-> Advancing cognitive memory research into production-ready tooling.


-> Accelerating open source with multi-database support, user isolation, new memory approaches, and 30+ additional data connectors.


The company positions itself as an open-source core memory layer designed to make agents truly intelligent rather than another traditional enterprise platform.
