We’re looking for a hands-on AI Engineer to build Generative AI solutions that automate processes and power conversational assistants (including RAG). You’ll turn requirements into production-ready features, implement retrieval pipelines, integrate with enterprise systems, and help improve reliability, safety, and performance through evaluation and iteration.
What you’ll do
* Build cloud-native GenAI components: LLM orchestration, RAG, API integrations, identity-aware access, and observability
* Develop assistants/copilots: prompt design & templating, tool calling, grounding, guardrails, memory management, and personalization
* Implement RAG pipelines: ingestion & chunking, embeddings, vector stores, quality checks, reranking, caching, and latency optimization
* Run evaluations: create test sets and datasets; track task success, groundedness, and hallucination rate; analyze results; and iterate
* Productize solutions: tests, linters, CI checks, Docker, metrics/logs/traces, and deployment support
Must-have
* Degree in Machine Learning or a related field (CS/Engineering/Statistics/Math). Master’s is a plus
* Experience in a similar role with a strong focus on Generative AI and a track record of hands-on delivery
* Strong Python skills, REST API integration, Git, code reviews, and clear documentation
* Practical LLM app development: prompt engineering, tool calling, grounding, guardrails, and memory management
* Practical RAG experience: preprocessing/chunking, embeddings, vector databases (e.g., Azure AI Search / OpenSearch / Pinecone), and basic evaluation (precision/recall)
* Familiarity with Azure and/or AWS and managed AI services (e.g., Azure OpenAI, Azure AI Search, Azure AI Foundry; AWS Bedrock, OpenSearch), plus common GenAI tooling (LangChain, CrewAI, Semantic Kernel, LlamaIndex)
Nice to have
* PyTorch/TensorFlow
* Agile delivery (Scrum/Kanban, Jira/Azure Boards)
* Cloud/AI certifications
* Consulting/professional services experience
📩 Interested? Send your CV to fabio.olyntho@recodme.es with the subject line "Generative AI Engineer", or message me here.