Data Engineer, Analytics

Meta · Big Tech · Seattle, WA

Data Engineer responsible for designing, modeling, and implementing data warehousing solutions; building data models and visualizations; collaborating with engineers and data scientists; defining SLAs; improving data logging frameworks; implementing security models; solving data integration problems; optimizing pipelines; and influencing product teams. Requires a Master's degree and one year of experience across big data ecosystems, custom ETL, object-oriented programming, schema design, SQL, data analysis, MapReduce/MPP systems, and Python.

What you'd actually do

  1. Design, model, and implement data warehousing activities to deliver the data foundation that drives impact through informed decision making.
  2. Design, build, and launch collections of sophisticated data models and visualizations that support multiple use cases across different products or domains.
  3. Collaborate with engineers, product managers and data scientists to understand data needs, representing key data insights visually in a meaningful way.
  4. Define and manage SLAs for all data sets in allocated areas of ownership.
  5. Create and contribute to frameworks that improve the efficacy of logging data, while working with data infrastructure teams to triage and resolve issues.
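To make the day-to-day concrete, here is a minimal sketch of the kind of custom ETL work items 1–5 describe: extracting raw event records, transforming them into a typed fact table, and loading them into a warehouse for SQL analysis. The record fields (`user_id`, `event`, `ts`) and the table name `fact_events` are illustrative assumptions, not part of the role description; real pipelines at this scale would run on MPP or MapReduce systems rather than SQLite.

```python
import sqlite3

# Hypothetical raw event records, as they might arrive from a logging framework.
raw_events = [
    {"user_id": "42", "event": "click", "ts": "2024-01-05"},
    {"user_id": "42", "event": "view",  "ts": "2024-01-05"},
    {"user_id": "7",  "event": "click", "ts": "2024-01-06"},
]

def transform(rows):
    """Cast types and keep only the fields the warehouse table needs."""
    return [(int(r["user_id"]), r["event"], r["ts"]) for r in rows]

def load_and_query(rows):
    """Load rows into a simple fact table and return daily click counts."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE fact_events (user_id INTEGER, event TEXT, ds TEXT)")
    con.executemany("INSERT INTO fact_events VALUES (?, ?, ?)", rows)
    return con.execute(
        "SELECT ds, COUNT(*) FROM fact_events"
        " WHERE event = 'click' GROUP BY ds ORDER BY ds"
    ).fetchall()

print(load_and_query(transform(raw_events)))
# [('2024-01-05', 1), ('2024-01-06', 1)]
```

The same extract/transform/load split underlies the schema design, SQL, and Python requirements listed below.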

Skills

Required

  • Master's degree in Computer Science, Engineering, Information Systems, Information Management, Mathematics, Statistics, Data Analytics, Applied Sciences, or a related field
  • 1 year of experience in features, design, and use-case scenarios across a big data ecosystem
  • 1 year of experience in custom ETL design, implementation, and maintenance
  • 1 year of experience in object-oriented programming languages
  • 1 year of experience in schema design and dimensional data modeling
  • 1 year of experience writing SQL statements
  • 1 year of experience analyzing data to identify deliverables, gaps, and inconsistencies
  • 1 year of experience managing and communicating data warehouse plans to internal clients
  • 1 year of experience with MapReduce or MPP systems
  • 1 year of experience with Python

Nice to have

  • data warehousing
  • data modeling
  • ETL
  • data pipelines
  • data infrastructure