Data Engineer, Analytics

Meta · Big Tech · Bellevue, WA

Meta is seeking a Data Engineer, Analytics to design, model, and implement data warehousing activities. The role involves building data models, collaborating with cross-functional teams, defining SLAs, improving data logging frameworks, implementing security and governance processes, solving data integration problems, optimizing pipelines, and influencing product teams to identify data opportunities. The ideal candidate has a Bachelor's degree and five years (60 months) of experience spanning big data ecosystems, ETL, object-oriented programming, schema design, SQL, data analysis, and Python.

What you'd actually do

  1. Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making.
  2. Design, build, and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains.
  3. Collaborate with engineers, product managers and data scientists to understand data needs, representing key data insights visually in a meaningful way.
  4. Define and manage SLAs for all data sets in allocated areas of ownership.
  5. Create and contribute to frameworks that improve the efficacy of logging data, while working with data infrastructure to triage and resolve issues.

Skills

Required

  • Bachelor's degree in Computer Science, Engineering, Information Systems, Mathematics, Statistics, Data Analytics, Applied Sciences, or a related field
  • 60 months of progressive, post-baccalaureate work experience in the job offered or in a computer-related occupation
  • Features, design, and use-case scenarios across a big data ecosystem
  • Custom ETL design, implementation, and maintenance
  • Object-oriented programming languages
  • Schema design and dimensional data modeling
  • Writing SQL statements
  • Analyzing data to identify deliverables, gaps, and inconsistencies
  • Managing and communicating data warehouse plans to internal clients
  • MapReduce or MPP system
  • Python
  • Gathering and understanding business requirements for complex systems and processes
  • Analyzing and optimizing performance of complex workflows

What the JD emphasized

  • 60 months of progressive, post-baccalaureate work experience
  • 60 months of experience in the following: features, design, and use-case scenarios across a big data ecosystem; custom ETL design, implementation, and maintenance; object-oriented programming languages; schema design and dimensional data modeling; writing SQL statements; analyzing data to identify deliverables, gaps, and inconsistencies; managing and communicating data warehouse plans to internal clients; MapReduce or MPP systems; Python; gathering and understanding business requirements for complex systems and processes; and analyzing and optimizing performance of complex workflows