
Senior Data Engineer - GCP Platform


Last Updated: 7/19/21

Job Description

Dear Job Seekers,

Salesforce is hiring for:

Job Title: Senior Data Engineer - GCP Platform

Job Description:

  • The candidate will also have a proven track record working with enterprise metrics, strong operational skills to drive efficiency and speed, expertise in building repeatable data engineering processes, strong project management skills, and a vision for how to deliver data products.
  • Experience with marketing and sales data sets is a huge plus.
  • Design and implement data pipelines, both batch and real-time, that produce reliable data for various data consumption use cases.
  • Help manage Marketing data platforms, which include the Google Cloud stack, Amazon Web Services, and traditional data warehouses such as Oracle.
  • Manage all aspects of dataset design, creation, and curation, including the frameworks to derive metrics and deliver data products for KPI, visualization, data science, analyst, and stakeholder teams.
  • Build and own the automation and monitoring frameworks that showcase reliable, accurate, easy-to-understand metrics and operational KPIs to stakeholders for data pipeline quality and performance.
  • Drive the design, building, and launching of new data models and data pipelines in production systems.
  • Be the subject matter expert for data, pipeline design, and other related big data and programming technologies.
  • Proactively identify reliability and data quality problems and drive the triaging and remediation process.
  • Partner with data producers to understand data sources, enable data contracts, and define the data model that drives analytics.
  • Partner with Analysts and Data Scientists on delivering reliable data that powers actionable insights
  • Evaluate various technologies and platforms in open source and proprietary products.
  • Execute proofs of concept on new technologies and tools to pick the best solutions.
  • Understanding of data governance practices such as metadata management, data lineage, and data glossaries a huge plus.
  • Foster strong collaboration between globally distributed team members.
  • Harness operational excellence and continuous improvement with a can-do attitude.
Qualifications and Skills:

  • You have to love data - this is what we do. We are looking for people who are excited about different and unique data sets, and all the ways they could be used to improve user experience.
  • B.S./M.S. in Computer Science or an equivalent field, with 7+ years of total experience and 3+ years of relevant experience in the big data/data warehousing domain.
  • Solid understanding of RDBMS concepts, data structures, and distributed data processing patterns. Expertise in programming pipelines in languages such as Python, Scala, and Java.
  • Expertise in big data technologies such as Hadoop, Spark, Hive, and Snowflake.
  • Experience with cloud technologies such as AWS, GCP, containers, and Kubernetes (GCP preferred).
  • Experience with version control systems (GitHub, Bitbucket, etc.) and CI/CD tools.
  • Experience with data orchestration and scheduling tools such as Airflow, Control-M, AutoSys, and Tidal.
  • Proven track record of building data products on production-grade systems.
  • Partner with Product Managers and Data Scientists to understand customer requirements, design prototypes, and bring ideas to production.
  • Passionate, curious, self-starting, with an innovative mentality.
  • Hands-on knowledge of Salesforce products and functionality a plus.
  • Effective presentation skills required to drive strategic discussions at various levels in the organization.

Experience: 7+ years

Salary: As per the Company Standards

Location: Hyderabad

Company Details

Hyderabad, Telangana, India