Staff Bigdata Hadoop Administrator - San Diego, United States - ServiceNow


Description

Company Description
At ServiceNow, our technology makes the world work for everyone, and our people make it possible.

We move fast because the world can't wait, and we innovate in ways no one else can for our customers and communities.

By joining ServiceNow, you are part of an ambitious team of change makers who have a restless curiosity and a drive for ingenuity.

We know that your best work happens when you live your best life and share your unique talents, so we do everything we can to make that possible.

We dream big together, supporting each other to make our individual and collective dreams come true. The future is ours, and it starts with you.


With more than 7,700 customers, we serve approximately 85% of the Fortune 500, and we're proud to be one of the FORTUNE 100 Best Companies to Work For and one of the World's Most Admired Companies.

Learn more on the Life at Now blog and hear from our employees about their experiences working at ServiceNow.


Job Description

Please Note:


  • This position requires passing a ServiceNow background screening, USFedPASS (US Federal Personnel Authorization Screening Standards). This includes a credit check, a criminal/misdemeanor check, and a drug test. Any employment is contingent upon passing the screening.
    Due to Federal requirements, only US citizens, US naturalized citizens, or US Permanent Residents holding a green card will be considered.


As a Staff DevOps Engineer-Hadoop Admin on our Big Data Federal Team, you will help deliver 24x7 support for our Private Cloud infrastructure.

The Big Data team plays a critical and strategic role in ensuring that ServiceNow can exceed the availability and performance SLAs of ServiceNow Platform-powered customer instances deployed across the ServiceNow cloud and Azure cloud.


Our mission is to:


Deliver state-of-the-art Monitoring, Analytics, and Actionable Business Insights by employing new tools, Big Data systems, an Enterprise Data Lake, AI, and Machine Learning methodologies that improve efficiencies across a variety of functions in the company (Cloud Operations, Customer Support, Product Usage Analytics, and Product Upsell Opportunities), enabling significant impact on both top-line and bottom-line growth.


The Big Data team is responsible for:

  • Collecting, storing, and providing real-time access to large amounts of data
  • Providing real-time analytics tools and reporting capabilities for various functions, including:
  • Monitoring, alerting, and troubleshooting
  • Machine Learning, anomaly detection, and prediction of P1 incidents
  • Capacity planning
  • Data analytics and deriving Actionable Business Insights

What you get to do in this role

  • Deploy, monitor, maintain, and support Big Data infrastructure and applications in ServiceNow Cloud and Azure environments.
  • Architect and drive end-to-end Big Data deployment automation, from vision through delivery, automating the Big Data foundational modules (Cloudera CDP), prerequisite components, and applications by leveraging Ansible, Puppet, Terraform, Jenkins, Docker, and Kubernetes across all ServiceNow environments.
  • Tune performance and troubleshoot the various Hadoop components and other data analytics tools in the environment (HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Kerberos, Tableau, Grafana, MariaDB, and Prometheus); a minimal monitoring sketch follows this list.
  • Enforce data governance policies in Commercial and Regulated Big Data environments.
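
As a flavor of the monitoring and troubleshooting work this role involves, here is a minimal Python sketch of a probe that reads cluster-wide HDFS capacity from a NameNode's JMX endpoint, the kind of signal an administrator typically exports to Prometheus and graphs in Grafana. The host name is a hypothetical placeholder, and port 9870 assumes Hadoop 3.x defaults (as shipped with Cloudera CDP).

```python
"""Minimal HDFS capacity probe via the NameNode JMX endpoint.

A hedged sketch: namenode.example.com is a hypothetical host, and
port 9870 assumes Hadoop 3.x defaults (as shipped with Cloudera CDP).
"""
import requests

NAMENODE_JMX = "http://namenode.example.com:9870/jmx"  # hypothetical host


def hdfs_capacity_used_pct() -> float:
    """Return cluster-wide HDFS capacity usage as a percentage."""
    # The FSNamesystem MBean exposes CapacityUsed/CapacityTotal in bytes.
    resp = requests.get(
        NAMENODE_JMX,
        params={"qry": "Hadoop:service=NameNode,name=FSNamesystem"},
        timeout=10,
    )
    resp.raise_for_status()
    fs = resp.json()["beans"][0]
    return 100.0 * fs["CapacityUsed"] / fs["CapacityTotal"]


if __name__ == "__main__":
    # In production this value would be scraped by Prometheus and
    # alerted on in Grafana rather than printed.
    print(f"HDFS capacity used: {hdfs_capacity_used_pct():.1f}%")
```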

Qualifications

To be successful in this role you have:

  • 6+ years of overall experience, with at least 4+ years of DevOps experience building and administering Hadoop clusters
  • Deep understanding of the Hadoop/Big Data ecosystem; good knowledge of querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark Streaming (see the PySpark sketch after this list), and of working with systems like HDFS, YARN, Hive, HBase, Spark, Kafka, RabbitMQ, Impala, Kudu, Redis, Hue, Tableau, Grafana, MariaDB, and Prometheus
  • Experience securing the Hadoop stack with Sentry, Ranger, LDAP, and Kerberos KDC
  • Experience supporting CI/CD pipelines on Cloudera in native cloud and Azure/AWS environments
  • Demonstrated expert-level experience delivering end-to-end deployment automation leveraging Puppet, Ansible, Terraform, Jenkins, Docker, Kubernetes, or similar technologies
  • Good knowledge of Perl, Python, Bash, Groovy, and Java
  • In-depth knowledge of Linux internals (CentOS 7.x) and shell scripting
  • Ability to learn quickly in a fast-paced, dynamic team environment
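
To illustrate the Hive/Spark querying experience called for above, here is a hedged PySpark sketch that aggregates a Hive-managed table on HDFS. It assumes a cluster where Hive support is configured for Spark (hive-site.xml available to the session); the ops.instance_metrics table and its columns are invented names used purely for illustration.

```python
"""Hedged PySpark sketch: querying a Hive-managed table on HDFS.

Assumes Hive support is configured for Spark (hive-site.xml on the
classpath); ops.instance_metrics and its columns are invented names.
"""
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-usage-report")
    .enableHiveSupport()  # resolve tables through the Hive metastore
    .getOrCreate()
)

# Daily 95th-percentile response time per instance from a hypothetical
# metrics table whose data lives on HDFS.
daily_p95 = spark.sql("""
    SELECT instance_id,
           to_date(event_ts)                    AS day,
           percentile_approx(response_ms, 0.95) AS p95_response_ms
    FROM   ops.instance_metrics
    GROUP  BY instance_id, to_date(event_ts)
""")

daily_p95.show(20, truncate=False)
spark.stop()
```

On a Kerberized CDP cluster, a job like this would typically be submitted with spark-submit after authenticating via kinit or a keytab.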
GCS-23


Additional Information
ServiceNow is an Equal Employment Opportunity Employer.

All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status or any other category protected by law.

