Data Engineer

Meta · Big Tech · Austin, TX

Data Engineer at Meta Platforms, Inc. responsible for designing, modeling, and implementing data warehousing activities; building data models and visualizations; collaborating with cross-functional teams; defining SLAs; improving data logging efficacy; implementing security models; solving data integration problems; optimizing pipelines; and influencing product teams to identify data opportunities. Requires a Master's degree in a related field and 48 months of experience with big data ecosystems, custom ETL, object-oriented programming, schema design, SQL, data analysis, and managing data warehouse plans.

What you'd actually do

  1. Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making.
  2. Design, build, and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains.
  3. Collaborate with engineers, product managers and data scientists to understand data needs, representing key data insights visually in a meaningful way.
  4. Define and manage SLAs for all data sets in allocated areas of ownership.
  5. Create and contribute to frameworks that improve the efficacy of logging data, while working with data infrastructure teams to triage and resolve issues.

Skills

Required

  • Master's degree in Computer Science, Computer Engineering, Information Systems, or related field
  • 48 months of experience in job offered or related occupation
  • Features, design, and use-case scenarios across a big data ecosystem
  • Data architecture for multiple large-scale projects
  • Custom ETL design, implementation, and maintenance
  • Object-oriented programming languages
  • Schema design and dimensional data modeling
  • SQL
  • Analyzing data
  • Managing and communicating data warehouse plans
  • MapReduce or MPP system
  • Python

What the JD emphasized

  • 48 months of experience in job offered or related occupation
  • 48 months of experience in each of the following:
    • Features, design, and use-case scenarios across a big data ecosystem
    • Data architecture for multiple large-scale projects while evaluating design and operational cost-benefit tradeoffs within systems
    • Custom ETL design, implementation, and maintenance
    • Object-oriented programming languages
    • Schema design and dimensional data modeling
    • Writing SQL statements
    • Analyzing data to identify deliverables, gaps, and inconsistencies
    • Managing and communicating data warehouse plans to internal clients
    • MapReduce or MPP system
    • Python