Embrace the opportunities for personal and professional growth

Working at Parexel

Parexel is a team of professionals delivering advanced solutions to clients in Japan and worldwide across every specialty area of clinical development, from upstream to downstream. In our daily work, Parexel aims to deliver new drugs and therapies as quickly as possible to the people who need them.

Senior Data Engineer

Job ID: R0000036273  Location: India  Job Description

Key Responsibilities

  • Architect and implement end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake for large-scale data ingestion, transformation, and storage.

  • Design, build, modify, and support data pipelines in a medallion architecture using Microsoft Azure data PaaS services, Databricks, and Power BI.

  • Create prototypes as needed to validate proposed ideas and solicit stakeholder input.

  • Apply test-driven development and continuous-integration practices with expertise throughout delivery.

  • Analysis and Design – Convert high-level designs into low-level designs and implement them.

  • Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.

  • Create and run unit and integration tests on all code throughout the development lifecycle.

  • Benchmark application code proactively to prevent performance and scalability concerns.

  • Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.

  • Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments.

  • Familiarity with Power BI and Reltio is advantageous but not required.

  • Collaborate with BI teams to ensure data models are optimized for reporting in Power BI, with a focus on performance and usability.

  • Establish data governance, quality, and security controls, ensuring compliance with GDPR, HIPAA, and global clinical data regulations.

  • Mentor and guide junior engineers, fostering technical excellence and knowledge sharing.

  • Drive automation and CI/CD practices within data engineering pipelines, integrating with version control and deployment workflows.

  • Work closely with Data Architects, Business Analysts, and Product Owners to translate business needs into technical solutions.
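
As a rough illustration of the medallion (bronze → silver → gold) layering named in the responsibilities above, here is a minimal plain-Python sketch. A production pipeline would use PySpark and Delta Lake on Databricks via Azure Data Factory; the record fields, values, and function names below are hypothetical, chosen only to show the layer semantics (raw ingest → cleaned/deduplicated → aggregated for reporting).

```python
# Conceptual medallion-architecture sketch (bronze -> silver -> gold).
# Plain Python stands in for PySpark/Delta Lake; all names are illustrative.
from collections import defaultdict

# Bronze layer: raw ingested records, possibly duplicated or incomplete.
bronze = [
    {"site_id": "S01", "subject": "P001", "visits": 3},
    {"site_id": "S01", "subject": "P001", "visits": 3},     # duplicate row
    {"site_id": "S02", "subject": "P002", "visits": None},  # missing value
    {"site_id": "S02", "subject": "P003", "visits": 5},
]

def to_silver(records):
    """Silver layer: drop rows with missing visits, dedupe on (site, subject)."""
    seen, silver = set(), []
    for r in records:
        key = (r["site_id"], r["subject"])
        if r["visits"] is None or key in seen:
            continue
        seen.add(key)
        silver.append(r)
    return silver

def to_gold(records):
    """Gold layer: aggregate to a reporting-ready view (total visits per site)."""
    totals = defaultdict(int)
    for r in records:
        totals[r["site_id"]] += r["visits"]
    return dict(totals)

gold = to_gold(to_silver(bronze))
print(gold)  # {'S01': 3, 'S02': 5}
```

In the real Databricks setting, each layer would be a Delta table and the dedup/aggregation steps would be Spark transformations, but the layer contract is the same.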

Skills:

  • Expert-level knowledge of Azure Data Factory, Databricks, and Snowflake.

  • Understanding of quality processes and estimation methods.

  • Understanding of design concepts and architectural basics.

  • Fundamental grasp of the project domain.

  • The ability to transform functional and nonfunctional needs into system requirements.

  • The ability to develop and code complex applications.

  • The ability to create test cases and scenarios based on specifications.

  • Solid knowledge of SDLC and agile techniques.

  • Knowledge of current technology and trends.

  • Logical thinking and problem-solving abilities, as well as the capacity to collaborate.

  • Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO.

  • Sought: SQL, Python, Power BI.

  • General Knowledge: PowerApps, Java/Spark, Reltio.

  • 5–7 years of experience in software development, with a minimum of 3 years in cloud computing.

  • Proficient in SQL, Python, and cloud-native architecture.

  • Strong grasp of data security, privacy compliance, and best practices in a regulated environment.
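
One of the skills listed above, creating test cases from specifications via test-driven development, can be sketched as follows. The function, its spec, and all identifiers are purely hypothetical; the point is the order of work: assertions encoding the spec are written first, then the implementation is made to satisfy them.

```python
# Test-driven sketch: assertions encode a (hypothetical) spec before coding.
# Spec: trim whitespace, uppercase, and zero-pad the numeric part of a
# subject ID to four digits (e.g. " p12 " -> "P0012").

def normalize_subject_id(raw: str) -> str:
    cleaned = raw.strip().upper()
    prefix, number = cleaned[0], cleaned[1:]
    return f"{prefix}{int(number):04d}"

# Test cases derived from the spec, written before the implementation.
assert normalize_subject_id(" p12 ") == "P0012"
assert normalize_subject_id("P0001") == "P0001"
print("all spec checks passed")
```

In a CI pipeline these checks would live in a pytest or unittest suite and run on every commit, matching the continuous-integration responsibilities described earlier.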

Education:

  • Bachelor's degree in a technical discipline (Math, Science, Engineering, Computing, etc.)
