
AI Chat Backend (Python)
Upwork
Remote
• 1 hour ago
• No proposals yet
About
We’re looking for a Python backend developer to build a prototype chat system where the AI’s memory evolves over time. This is a short, hackathon-style project to validate technical approaches. If it works well, it may turn into an ongoing collaboration.

Core Requirements:
- Backend service (FastAPI or similar) that connects to an external LLM (OpenAI or Anthropic).
- Chat endpoint: user sends messages → system returns the LLM response (a rough endpoint sketch is included at the end of this post).
- Memory layer:
  * Starts simple (just logging context).
  * Evolves into retrieval with RAG + graph-based memory (Zep Graphiti) — see the memory interface sketch at the end of this post.
- Configurable LLM parameters: expose system prompt, temperature, max tokens, etc.
- Logging / test controls: must provide a way to inspect inputs, outputs, and memory state to validate behavior.
- Fully Dockerized backend for easy deployment and reproducibility.

Nice to have:
- Experience with pgvector, Pinecone, or other vector DBs.
- Prior work with AI memory frameworks (Zep, Graphiti, LangChain memory).
- Clean, maintainable code and tests.

Why this project?
- Very focused scope: AI + memory only (no frontend, no auth, no extras).
- Opportunity to showcase skill in a tight, experimental build.
- If the collaboration goes well, there is potential for a long-term partnership.
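
To make the scope concrete, here is a minimal sketch of what the stage-1 chat endpoint could look like, assuming FastAPI and the OpenAI Python SDK (openai >= 1.0) with OPENAI_API_KEY set in the environment. The model name, request fields, and in-memory log are placeholders to illustrate the shape of the service, not requirements.

```python
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Stage-1 memory: an append-only log of the conversation context.
memory_log: list[dict] = []

class ChatRequest(BaseModel):
    session_id: str
    message: str
    # Configurable LLM parameters, exposed per request.
    system_prompt: str = "You are a helpful assistant."
    temperature: float = 0.7
    max_tokens: int = 512

class ChatResponse(BaseModel):
    reply: str

@app.post("/chat", response_model=ChatResponse)
def chat(req: ChatRequest) -> ChatResponse:
    # Rebuild the context for this session from the memory log.
    history = [m for m in memory_log if m["session_id"] == req.session_id]
    messages = [{"role": "system", "content": req.system_prompt}]
    messages += [{"role": m["role"], "content": m["content"]} for m in history]
    messages.append({"role": "user", "content": req.message})

    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=messages,
        temperature=req.temperature,
        max_tokens=req.max_tokens,
    )
    reply = completion.choices[0].message.content or ""

    # Log both sides of the turn so the memory state can be inspected later.
    memory_log.append({"session_id": req.session_id, "role": "user", "content": req.message})
    memory_log.append({"session_id": req.session_id, "role": "assistant", "content": reply})
    return ChatResponse(reply=reply)

@app.get("/memory/{session_id}")
def inspect_memory(session_id: str) -> list[dict]:
    # Test control: expose the raw memory state for validation.
    return [m for m in memory_log if m["session_id"] == session_id]
```

Exposing the system prompt, temperature, and max tokens on the request keeps the experiment easy to drive from test scripts, and the /memory endpoint is the kind of inspection hook the logging / test-controls requirement asks for.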
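For the memory layer, a hypothetical interface like the one below shows how the stage-1 log could later be swapped for a RAG or graph-backed store (e.g. Zep Graphiti) without touching the endpoint code. All names here (MemoryStore, LogMemory, add_turn, retrieve_context) are illustrative, not a real library API.

```python
from typing import Protocol

class MemoryStore(Protocol):
    def add_turn(self, session_id: str, role: str, content: str) -> None:
        """Persist one side of a conversation turn."""
        ...

    def retrieve_context(self, session_id: str, query: str, limit: int = 5) -> list[str]:
        """Return the context snippets most relevant to the current query."""
        ...

class LogMemory:
    """Stage 1: append-only log that returns the most recent turns verbatim."""
    def __init__(self) -> None:
        self._turns: list[dict] = []

    def add_turn(self, session_id: str, role: str, content: str) -> None:
        self._turns.append({"session_id": session_id, "role": role, "content": content})

    def retrieve_context(self, session_id: str, query: str, limit: int = 5) -> list[str]:
        turns = [t["content"] for t in self._turns if t["session_id"] == session_id]
        return turns[-limit:]

# Stage 2 would implement the same interface on top of a vector store (RAG)
# or a graph memory backend such as Zep Graphiti.
```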