Big Data DevOps

Job Info

Category: Development
Company Description: One of the largest and most well respected studios in the world
Salary: Highly Competitive, Depending on Experience
Position Type: Permanent
Location: Burbank, CA
Job Number: 8973

Job Description

Our client has been entertaining audiences for more than 90 years through the world’s most-loved characters and franchises. The studio employs people all over the world in a wide variety of disciplines and is always on the lookout for energetic, creative people to join the team. We are looking for a Big Data DevOps Engineer for the Consumer Intelligence Platform team.


Our client brings together industry-leading technologists and disciplines to ensure global alignment with business strategy and accelerated delivery of innovative technology solutions, studio- and industry-wide. This team manages the Studio’s enterprise systems and solutions, emerging platforms, information security, consumer intelligence, content mastering and delivery, and more.

What part will you play?
The Big Data DevOps Engineer’s primary role will be to enhance, secure, and expand our consumer Big Data platform running on AWS. We are looking for someone who is not only passionate about AWS, analytics, and security, but is also excited about rolling out new technologies to support multiple studio-wide initiatives focused on learning more about our consumers and what they want. If you enjoy analyzing large amounts of data and crunching numbers, this might be the perfect opportunity for you!

What will you do?
The role will own the infrastructure and networking aspects of our AWS environment, ensuring the system is available, secure, and running like a well-oiled machine. This will include implementing deployment systems, creating and managing tools for operational processes, tuning for optimal performance and cost, and helping all applications take full advantage of the rich set of AWS tooling to be best in class.


What will you need?

  • A Bachelor’s degree in CS, Engineering, or a similar field.

  • At least 2 years of experience with some combination of the following: AWS data technologies (S3, Redshift, Data Pipeline), Big Data analytics, data warehouse operations, NoSQL, and securing cloud-based solutions.

  • Experience with JIRA, Confluence, and Git.

All qualified candidates are encouraged to apply by submitting their resume as an MS Word document, along with a cover letter summarizing their relevant qualifications and clearly highlighting any special or relevant experience.
Alec McKinley

Executive Recruiting Specialist

alec.mckinley@andiamogo.com

Andiamo
17 State Street, 8th floor
New York, New York 10004

Please Note: All inquiries will be treated with the utmost confidentiality. Your resume will not be submitted to any client company without your prior knowledge and consent.
