Senior Hadoop Administrator

IT Operations | Seattle, WA

Job ID: R-TS0617
Location: Seattle, WA
Post Date: 06/30/2017
Department: IT Operations
THE MISSION

We’re seeking a creative and passionate Hadoop Administrator to play a key role supporting our BI initiatives. This is a DevOps position with a focus on the Ops side, fully dedicated to and integrated with the BI team, driving operational excellence. You’ll work directly with our BI Analysts and BI Engineers to administer our enterprise Hadoop environment, drawing on your experience to help us build and maintain cutting-edge, scalable solutions.

You’re focused on operational excellence to stay ahead of long-term problems. But you’re also fearless: willing to dive into the unknown, you love the challenge of digging into an issue until you’ve fixed it, and you feel great when you’ve got the data flowing again.


THIS INCLUDES...

  • Working in a DevOps environment supporting our Business Intelligence systems
  • Troubleshooting application errors and implementing long-term fixes
  • Diagnosing and fixing unknown issues
  • Writing documentation for previously undocumented fixes
  • Working in collaboration with the Systems Engineering, Network Engineering, and BI Engineering teams to plan and deploy new Hadoop environments
  • Monitoring Hadoop connectivity, performance and capacity
  • Developing database design and architecture for existing and new initiatives
  • Performance tuning of Hadoop clusters and MapReduce routines
  • Development work driving practice improvements, automation, and infrastructure improvements
  • Backup and recovery
  • Ensuring security for Hadoop clusters
  • Educating and supporting business staff on using Hadoop and HDFS
  • Being primary point of contact for vendor escalations
  • Periodic on-call support
  • Participating in Agile stand-up meetings


REQUIRED ABILITIES, ACHIEVEMENTS, AND XP

  • 5+ years’ experience supporting enterprise-grade data warehouses
  • 3+ years’ experience supporting a Hadoop/HDFS infrastructure
  • Ability to write and analyze both Hive and SQL queries
  • Ability to debug and edit ETL artifacts
  • Shell scripting and/or Java programming experience
  • Knowledge of relational databases
  • Familiarity with configuration management tools such as CFEngine or SaltStack
  • Familiarity with version control tools such as git

 

Want to discover more about life at Big Fish? Check us out on TheMuse.com!

Be the next Big Fish in the Pond!