Job Description
Patrianna is a rapidly expanding product development company headquartered in Gibraltar, with a dynamic team spanning the globe. We are on the lookout for exceptional talent: driven, ambitious individuals who are ready to elevate business functions with expertise, adaptability, and a passion for continuous growth.
We're looking for an AI Engineer to lead the development of an intelligent, self-evolving data ecosystem powered by large language models (LLMs). You'll be at the forefront of innovation, blending data engineering, real-time intelligence, and AI productization to deliver cutting-edge solutions across the business.
Key Responsibilities
LLM-Oriented AI & Automation Systems
- Build, fine-tune, and integrate LLM pipelines using platforms such as Vertex AI, OpenAI, and Google Studio for data transformation, summarization, and retrieval-based tasks.
- Develop prompt-driven automation systems for workflows including metadata extraction, tagging, reporting, and anomaly detection.
- Design and deploy RAG (Retrieval-Augmented Generation) and embedding-based search solutions integrated with enterprise knowledge bases.
Agent Orchestration & Autonomy
- Build intelligent agents using CrewAI, LangChain or similar frameworks to support use cases like automated report generation, root cause analysis, and interactive user agents.
- Implement feedback and memory mechanisms to support context-aware, evolving agents.
AI-Native API & Platform Integrations
- Develop, manage, and scale API integrations that connect AI workflows with data warehouses, business tools, and external data sources.
- Create intelligent endpoints for model inference, decision logging, and webhook/postback triggers to support real-time interactions and data enrichment.
Data Modeling for AI Workflows
- Architect and maintain data models optimized for AI usage: structured for quick access by LLMs, retrieval engines, and embedded reasoning tools.
- Integrate semantic layers and metadata mapping to facilitate natural language querying and low-code data exploration.
Documentation, Scaling & Knowledge Ops
- Document all AI workflows, from prompt structures and agent behaviors to API interfaces and data flows, for reproducibility and team-wide collaboration.
- Build internal knowledge bases, vector stores, and wiki systems fueled by LLM summarization and indexing.
- Advocate for internal AI adoption through tooling, workshops, and scalable use case frameworks.
Preferred Qualifications
Must-Have
- Bachelor's degree in Computer Science, Data Science, AI/ML, or a related field.
- Strong experience with GCP (Vertex AI, Google Studio, and other GCP tools).
- Demonstrated success integrating LLM APIs (OpenAI, Vertex AI, Perplexity, Gemini) into production systems.
- Expertise with LangChain, CrewAI, OpenAI platform or similar frameworks.
- Expertise in building and deploying AI workflows, agents, and APIs with JavaScript, n8n, and Python.
- Solid SQL knowledge for structured data operations and AI-data interplay.
Nice to Have
- Google Cloud certifications in Data or Machine Learning.
- Experience with vector databases, embedding generation, and RAG infrastructure.
- Exposure to autonomous agent memory systems, reward tuning, or action planning loops.
- Familiarity with data engineering processes and tools.