Salary: Negotiable
港交所科技(深圳)有限公司
Summary:
The Data Platform Engineer / Data Analytics Engineer is a key role in developing data analytics capabilities and enhancing our enterprise data platform.
Duties:
The Data Platform Engineer / Data Analytics Engineer is responsible for the execution and delivery of data analytics engineering projects to incrementally build out our enterprise data platform. Key duties focus on applying engineering skills to end-to-end data analytics technology capabilities, spanning data ingestion, data modelling, data governance, data visualization and analytics solution implementation. The role requires a strong technical background in cloud and data domain technologies, as well as excellent problem-solving skills to ensure solutions are scalable, secure, and efficient.
Key Responsibilities:
§ Work with cross-functional teams to deliver data engineering projects, from development and deployment through to support, ensuring high availability and performance.
§ Design and implement scalable and secure data analytics solutions for data ingestion, data processing, data visualization, etc.
§ Collaborate with stakeholders to deliver data platform capabilities, prioritising requirements based on business benefits and protection against data risks.
§ Promote compliant and quality reuse of data on the enterprise platform by incrementally delivering functionality across the data lifecycle, including data access control, data lineage, data modelling, data quality and data analytics.
§ Apply cloud-native skills such as containers and object storage to enable effective data analytics, data management and data governance.
§ Manage the development pipeline through DevSecOps, CI/CD tooling and Infrastructure as Code.
§ Effectively use hybrid project management techniques, applying Agile tools within SDLC phases.
§ Coordinate with other IT functions on the efficient use of technical tools, internal resources and skills, optimising technical synergies and efficiency across initiatives.
§ Manage third-party data products and vendors for deployment onto the data platform.
Requirements:
§ Bachelor’s degree in computer science, technology, data or related disciplines
§ 5 years of working experience in systems development, preferably with a track record in large enterprises. The AVP position requires stronger technology and management experience.
§ Well versed in SDLC project disciplines and experienced in Agile methodologies within cross-functional teams.
§ Good understanding of enterprise data architecture and technology stacks, covering batch and streaming data processing for analytics use.
§ Familiar with cloud technology and operations in AWS, Azure or China clouds, and with data processing in a cloud environment.
§ Solid technical background with hands-on experience in software development and data related technologies.
§ Proficiency in programming languages such as Python, Java, C++ or Scala.
§ Hands-on experience with big data technologies and ecosystems, including Hadoop, Spark, and Kafka.
§ Knowledge of database systems, both relational and NoSQL.
§ Familiar with big data technologies, e.g., Redshift, HDFS, HBase, Hive, etc.
§ Experience with data security and data management tools, e.g., LDAP, OAuth, Keycloak, Lake Formation, etc.
§ Experienced with DevSecOps tooling, CI/CD pipelines, Terraform, etc.
§ Excellent problem-solving skills and the ability to work in a fast-paced environment.
§ Excellent influencing skills with good command of written and spoken English.