
Data Transformation – Technical Lead – Automotive Business
SKF
About
In September 2024, SKF announced the separation of its Automotive business, with the objective of building two world-leading businesses. The role you are applying for will be part of the Automotive business, which means you will have the opportunity to help shape a new company aimed at meeting the needs of the transforming global automotive market. Would you like to join us in shaping the future of motion? We are now looking for a Data Transformation – Technical Lead, Automotive Business.
Are you enthusiastic about using innovative technology to enhance data platform capabilities?
SKF Automotive's Data, AI & Integrations department is seeking a Digital Solution Manager for our Data Ingestion & Transformation Toolkit. In this pivotal role, you will lead the architecture of our data ingestion and transformation capabilities, designing data pipelines and setting standards for reliability, scalability, and observability. You will also create reusable frameworks, share your expertise, and contribute to building a best-in-class data engineering practice.
This position requires a senior expert who can provide architectural leadership and deep technical guidance in modern data integration and transformation tools such as dbt, Informatica, Fivetran HVR, or similar. A key responsibility will be to lead our strategy for data orchestration, architecting how dbt manages its transformation workflows and how it integrates with enterprise orchestrators to ensure efficient, reliable, and scalable data delivery. You will oversee the daily management of the IT systems and applications within the scope of your role, ensuring efficiency, effectiveness, and the adoption of best-in-class solutions. Join our team to drive transformative initiatives and foster a culture of collaboration and innovation.
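To make the orchestration topic concrete, here is a minimal sketch of how an enterprise orchestrator task (Airflow, Azure Data Factory, or similar) might invoke dbt. The selector tag, target name, and error-handling policy are illustrative assumptions, not SKF's actual configuration.

```python
# Hedged sketch: wrap a dbt invocation so an orchestrator task can
# call it, capture its logs, and fail loudly for retries/alerting.
import subprocess
import sys

def run_dbt_job(select: str, target: str = "prod") -> None:
    """Run one dbt build selection; raise on failure so the
    orchestrator can retry or alert. All names are hypothetical."""
    result = subprocess.run(
        ["dbt", "build", "--select", select, "--target", target],
        capture_output=True,
        text=True,
    )
    print(result.stdout)  # surface dbt's logs to the orchestrator
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        raise RuntimeError(f"dbt build failed for selection {select!r}")

if __name__ == "__main__":
    run_dbt_job("tag:daily")  # e.g., the daily transformation workflow
```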
Key Responsibilities
- Act as a senior technical expert for key data integration and transformation tools, including Informatica, Fivetran HVR, and dbt. Design and architect robust solutions, stay current with new features and roadmaps, and provide expert guidance to technical teams.
- Collaborate effectively with platform vendors and service delivery partners to ensure the successful delivery and integration of their services. You will be responsible for defining technical requirements, overseeing partner activities, and validating that their work meets our architectural standards and business objectives.
- Actively contribute to our data lakehouse architecture, leveraging our modern tech stack to build and refine data pipelines. You will enable the platform by building capabilities to process data seamlessly across raw, mastered, and application-ready layers, delivering high-quality, layered data solutions.
- Collaborate with data & integration architects, data engineers, analytics engineers, and data platform engineers to design scalable, reliable, and robust data ingestion flows and pipelines.
- Play a key role in empowering the organization with self-service capabilities by contributing to the development of our self-service data ingestion platform.
- Stay up to date with platform developments, best practices, and vendor updates, ensuring the platform remains efficient and secure while enhancing its capabilities.
- Lead the design and implementation of robust, scalable data ingestion pipelines pulling data from our core SAP systems (e.g., SAP S/4HANA, BW, MDG, C4C) into our Snowflake data platform. This includes collaborating with SAP platform and functional experts to develop solutions both for bulk loads and for incremental change data capture / near-real-time capture (see the hedged sketch after this list).
- Enable stakeholders to effectively use the tools in the toolkit through training and proactive guidance.
- Identify and execute cost optimization opportunities, apply cost controls, and conduct capacity planning sessions for continuous improvement.
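As referenced above, here is a hedged sketch of the incremental (CDC) leg of an SAP-to-Snowflake pipeline: change records landed in a staging table (for example by Fivetran HVR) are merged into the target layer. All table, column, and connection names are hypothetical, not SKF's actual schema.

```python
# Hedged sketch: apply one staged batch of SAP change records to a
# Snowflake target with MERGE. Assumes the batch has been deduplicated
# to the latest change per key before this step runs.
import os
import snowflake.connector

# Hypothetical tables/columns; cdc_op is 'I'/'U'/'D' per change record.
MERGE_SQL = """
MERGE INTO raw.sap_material AS t
USING stage.sap_material_cdc AS s
  ON t.matnr = s.matnr
WHEN MATCHED AND s.cdc_op = 'D' THEN DELETE
WHEN MATCHED THEN UPDATE SET t.maktx = s.maktx, t.updated_at = s.extracted_at
WHEN NOT MATCHED AND s.cdc_op <> 'D' THEN
  INSERT (matnr, maktx, updated_at) VALUES (s.matnr, s.maktx, s.extracted_at)
"""

def apply_cdc_batch() -> int:
    """Merge the staged batch into the target; returns affected rows."""
    with snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",    # hypothetical warehouse
        database="AUTOMOTIVE",  # hypothetical database
    ) as conn:
        cur = conn.cursor()
        cur.execute(MERGE_SQL)
        return cur.rowcount

if __name__ == "__main__":
    print(f"merged {apply_cdc_batch()} rows")
```

Bulk (initial) loads would typically bypass this path, e.g. via COPY INTO from staged files, with the MERGE pattern reserved for ongoing deltas.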
Requirements
- Bachelor’s or Master’s degree in Computer Science, Information Systems, Cloud Computing, Software Engineering, or a related field.
- Deep, hands-on expertise in architecting and implementing solutions with modern data integration tools such as Informatica and Fivetran HVR, and data transformation tools such as dbt.
- Proven experience in a data engineering role, with a strong understanding of data modeling, data warehousing, and ETL/ELT principles.
- Proven experience developing and optimizing data pipelines with dbt.
- Experience with version control systems (Git) and with implementing CI/CD practices for dbt projects, using Git and dbt together (a slim-CI sketch follows this list).
- Proven experience working on one or more of these platforms – Snowflake, Databricks, Fabric.
- Prior experience building and maintaining data pipelines that extract and land data from SAP ERP systems into a modern data warehouse (Snowflake, Databricks or similar).
- Experience working with cloud platforms, preferably Microsoft Azure. Proficiency with Azure's native data services, e.g., Azure Data Factory (ADF), would be an advantage.
- Experience using GenAI, prompt engineering, and frameworks like MCP (Model Context Protocol) to build faster, more reliable data pipelines is highly desirable.
- Strong communication, stakeholder management, and leadership skills with a proven ability to balance technical nuances with strategic vision.
- Ability to work collaboratively in a dynamic, fast-paced environment.
- A desire to learn and grow, with an interest in emerging and new technologies.
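As referenced in the CI/CD requirement above, here is a minimal "slim CI" sketch for a dbt project under Git: build only the models modified in a pull request (plus their downstream dependents), deferring unchanged upstream models to the last production run. The artifact directory name is an assumption; in practice it would hold the production manifest.json.

```python
# Hedged sketch: a CI step that builds only changed dbt models,
# comparing the working tree against production state artifacts.
import subprocess

def slim_ci(prod_state_dir: str = "prod-artifacts") -> None:
    """Build modified models and their dependents; a non-zero dbt
    exit code fails the CI pipeline via check=True."""
    subprocess.run(
        [
            "dbt", "build",
            "--select", "state:modified+",  # changed models + children
            "--defer",                      # reuse prod for the rest
            "--state", prod_state_dir,      # where the prod manifest lives
        ],
        check=True,
    )

if __name__ == "__main__":
    slim_ci()
```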
What You’ll Enjoy Working Here
- A dynamic environment where you work closely with cross-functional teams to drive meaningful data solutions.
- Opportunities to lead transformative initiatives and contribute to operational excellence.
- A culture that values proactive problem-solving, autonomy, and ongoing professional development.
SKF is committed to creating a diverse environment, and we firmly believe that a diverse workforce is essential for our continued success. Therefore, we only focus on your experience, skills, and potential. Come as you are – just be yourself. #weareSKF
Some additional information
The preferred locations for this role are Shanghai, Bangalore, and Cajamar. You will report to the Head of Data Factory. For questions regarding the recruitment process, please contact Stina Scheller, EMEA Recruitment Expert, by email at stina.scheller@skf.com. Please note that we cannot accept applications via email.
Is this you? If the answer is yes, submit your application with your CV in English no later than September 14th, 2025. We will screen candidates continuously throughout the application period, so make sure to submit your application as soon as possible.
At SKF, we are committed to promoting fairness and inclusivity throughout our recruitment process. To achieve this, we may include assessments and verify the information in your application in compliance with country-specific laws and regulations. If you have any questions or concerns, please feel free to contact the recruiter.