We are looking for a Senior AI and Data Engineer to join our innovative team. The ideal candidate will design and implement data architectures, develop complex data pipelines, and optimize ETL processes to enhance our data capabilities. This role requires deep technical expertise in Python, SQL, and various database systems.
Fingermark™ is a global tech company based in New Zealand with offices in Australia, Brazil, the UAE, and the United States. For over 15 years, we have been a world leader in digital transformation, leading the way in developing Artificial Intelligence (AI) and associated hardware products for the Quick Service Restaurant (QSR), Retail, and Industrial sectors.
We design, build, and deploy leading-edge digital technology solutions that enable operational efficiency, revenue gains, and enhanced customer experiences. From AI software with real-time business analytics and consultancy through to kiosks and menu boards, we design fit-for-purpose, innovative solutions and ecosystems. Fingermark researches and develops game-changing technologies that support its vision: to revolutionize customer speed of service.
- Data Architecture: Design and implement comprehensive data architectures, ensuring scalability, performance, and reliability.
- Data Pipeline Development: Lead the development and maintenance of complex data pipelines, integrating diverse data sources and optimizing performance.
- ETL Processes: Architect and implement robust Extract, Transform, Load (ETL) processes, handling complex data integration scenarios.
- Database Management: Oversee the management and optimization of database systems (relational, NoSQL, in-memory, distributed), including PostgreSQL, Redshift, and DynamoDB.
- Data APIs: Architect, implement, manage, and optimize data APIs using a variety of tools, including API gateways, AWS Lambda functions, and technologies like Apache Iceberg.
- Data Quality and Governance: Establish and enforce data quality assurance and data governance policies and procedures.
- Cloud Infrastructure (AWS): Design, implement, and manage data infrastructure on AWS, including S3, RDS, Redshift, Glue, EMR, Lambda, Kinesis, and CloudWatch, applying strong AWS security and networking practices.
- Automation and Scripting: Develop advanced scripts (Python) to automate complex tasks and workflows. Experience with Infrastructure as Code (Terraform).
- AI/ML Integration: Integrate AI/ML services into data pipelines and workflows, with a solid understanding of AI/ML fundamentals and data requirements.
- Problem Solving: Diagnose and resolve complex data issues, lead root cause analysis, and develop troubleshooting guides.
- Documentation: Create and maintain detailed technical documentation.
- Programming: Expert proficiency in Python and SQL.
- Databases: Deep understanding of relational, NoSQL, in-memory, and distributed databases. Expert in database management systems (PostgreSQL, Redshift, DynamoDB).
- Data Modelling and Warehousing: Expert in designing data models and in creating and managing data warehouses and data lakes.
- AWS: Extensive experience with AWS data systems and services.
- Infrastructure as Code: Proficient with Terraform.
- Analytics Tools: Strong working knowledge of Jupyter, Git, AWS CLI, and Python libraries (boto3, awswrangler, Pandas, NumPy, scikit-learn, etc.).
- Data Visualization: Experience creating data visualizations using Python libraries (Matplotlib, Seaborn, Plotly) or tools like Grafana.
- Machine Learning and AI: Strong foundational understanding of ML/AI, including Generative AI and LLMs, and experience integrating these technologies into data pipelines.
- 5+ years of experience as a Data Engineer, ideally on AWS.
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- English proficiency (minimum C1).
- Strong problem-solving and analytical skills.
- Curious and able to suggest and lead new ways of doing things.
- Excellent communication and teamwork abilities.
- Ability to work independently and manage multiple tasks.
- Strong attention to detail and commitment to data quality.
- Relevant certifications (e.g., AWS Certified Data Analytics - Specialty) are a plus.
- PJ contract (Brazilian contractor agreement), fully remote
- English training
- 4 weeks paid leave
- Annual budget for professional development
- 10 paid sick days