Senior Data Architect

Location: West Hollywood, CA
Job Locations
US-CA-LOS ANGELES
Brand
Pluto TV
Job Type
Full-Time Staff
iCIMS ID
2019-13254

About The Brand

Pluto TV is the leading free streaming television service in America, delivering 100+ live and original channels and thousands of on-demand movies in partnership with major TV networks, movie studios, publishers, and digital media companies. Pluto TV is available on all mobile, web, and connected TV streaming devices, and millions of viewers tune in each month to watch premium news, TV shows, movies, sports, lifestyle, and trending digital series. Headquartered in Los Angeles, Pluto TV has offices in New York, Silicon Valley, Chicago, and Berlin. Pluto is a subsidiary of ViacomCBS (NASDAQ: VIAB, VIA), a global content company with premier television, film, and digital entertainment brands.

Overview and Responsibilities

Pluto TV is looking for a Senior Data Architect to define and operate our large-scale analytics processes. Pluto TV is a free, ad-supported over-the-top television service with over 100 linear channels and thousands of VOD movies and TV shows. As of August 2019 we had 18 million unique viewers, 50% growth in just six months, and we are growing ever faster as we add more premium branded content and expand into new international markets. As we mature from a scrappy startup into an industry leader, we need to develop our solutions and processes so we can grow efficiently, reliably, and globally. You have developed database schemas and services that capture and process billions of data elements every day, 24/7/365, and you know how to build high-performing data pipelines and schemas.

 

  • Collaborating with key stakeholders, executives, data engineers, and data analysts to perform data discovery and define objectives for data architecture and strategy
  • Leading data modeling and data warehouse analysis, architecture, design, and development, including setting up data ingestion pipelines, data models/star schemas, and ETL/ELT processes
  • Developing data strategies that define how data will be stored, consumed, integrated, and managed by different data entities and IT systems, as well as by any applications using or processing the data
  • Collaborating with new and existing internal/external data vendors to understand data sourcing strategy, identify and collect required data, validate processes, and recommend data ingestion patterns and solutions
  • Providing expertise on dimensional modeling and database design best practices
  • Wrangling heterogeneous data to explore and discover new insights
  • Leading and supporting mappings of data sources, transformations, and data movement
  • Implementing procedures to validate data and ensure that data integrity and quality standards are met
  • Providing mentorship to team members

Basic Qualifications

  • 5+ years of hands-on experience in large-scale data architectures
  • 5+ years writing and executing SQL queries
  • Strong analytical, problem-solving, and decision-making skills
  • Strong written and verbal communication skills
  • 3+ years with high-performing data pipelines such as Kafka, NATS, or Kinesis
  • 3+ years with distributed data processing systems such as Hadoop, Spark, or Snowflake
  • Experience serving in a data modeling and data architect role for enterprise data modeling across multi-domain, multi-subject areas using Inmon and/or Kimball methodologies
  • Hands-on experience developing on cloud platforms and migrating data to them
  • Proficiency in relational database design and development
  • Experience with at least one data modeling or data cataloging tool
  • Experience developing data models from technical and functional requirements, including:
      ○ Creating logical and physical data models for current and proposed states
      ○ Turning application designs into data model specifications
      ○ Understanding data storage requirements and estimating database growth
      ○ Designing OLAP and reporting-tool data model requirements
      ○ Creating intuitive data models in business end-user terminology
      ○ Applying database normalization, with an understanding of 1NF through 5NF
      ○ Utilizing data modeling tools to record, update, reverse-engineer, and generate DDL
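For candidates unfamiliar with the dimensional-modeling terms above, here is a minimal sketch of the kind of star schema this role involves: one fact table joined to dimension tables for analytic queries. All table and column names are hypothetical illustrations, not Pluto TV's actual schema.

```python
import sqlite3

# In-memory database for the sketch; a real warehouse would use a
# platform such as Snowflake (hypothetical example schema throughout).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_channel (
    channel_key  INTEGER PRIMARY KEY,
    channel_name TEXT NOT NULL
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20190801
    full_date TEXT NOT NULL
);
-- Fact table: grain = one viewing session.
CREATE TABLE fact_viewing (
    channel_key   INTEGER REFERENCES dim_channel(channel_key),
    date_key      INTEGER REFERENCES dim_date(date_key),
    watch_minutes REAL NOT NULL
);
""")

cur.execute("INSERT INTO dim_channel VALUES (1, 'News 24/7')")
cur.execute("INSERT INTO dim_date VALUES (20190801, '2019-08-01')")
cur.executemany("INSERT INTO fact_viewing VALUES (?, ?, ?)",
                [(1, 20190801, 12.5), (1, 20190801, 7.5)])

# Typical analytic query: total minutes watched per channel per day.
row = cur.execute("""
    SELECT c.channel_name, d.full_date, SUM(f.watch_minutes)
    FROM fact_viewing AS f
    JOIN dim_channel AS c USING (channel_key)
    JOIN dim_date AS d USING (date_key)
    GROUP BY c.channel_name, d.full_date
""").fetchone()
print(row)  # ('News 24/7', '2019-08-01', 20.0)
```

The star layout (a central fact table with foreign keys into denormalized dimensions) is what makes the GROUP BY query above simple and fast for OLAP workloads.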
