Join Our Talent Network

What is a Talent Network?

Joining our Talent Network will enhance your job search and application process. Whether you choose to apply or just leave your information, we look forward to staying connected with you.

Why Join?

  • Receive alerts with new job opportunities that match your interests
  • Receive relevant communications and updates from our organization
  • Share job opportunities with family and friends through Social Media or email

Join our Talent Network today!

Hadoop Architect

  • Other

Job Description

  • Provides architectural and big-picture oversight and creates architectural specifications for the development of new or enhanced products and services.
  • Researches and evaluates new technologies, design patterns, and software products to determine the feasibility and desirability of incorporating their capabilities into the company's products.
  • Identifies opportunities to implement and enforce compliance with architectural standards, including the Reference Architecture, in customer, product enhancement, and development projects.
  • Supports development and product teams by providing high-level analysis and design reviews; performance, scalability, and benchmark test guidance; and subject-matter expertise in technology and design.
  • Plans, directs, and maintains projects; reviews work requests and estimates project scope.

Requirements

  • Hands-on experience working with Hadoop (Hortonworks 2.5 or higher)
  • In-depth knowledge and understanding of Hadoop architecture and HDFS, including YARN
  • Working knowledge of MapReduce, HBase, Pig, Java, Hive, ZooKeeper, Flume, and Sqoop
  • Successful Big Data deployments and implementation strategies using the Hadoop life-cycle model
  • Ability to perform analytics on data stored in Hadoop via SQL queries
  • Experience in core Big Data development projects; hands-on development experience in HDFS, Hive, HBase, Spark, Scala, and MapReduce, as well as tool-based Hadoop ETL development, is a plus
  • Strong knowledge of programming and scripting languages such as Java, Spark, Python, Hive, and Pig
  • Experience with major Big Data technologies and frameworks, including but not limited to Hadoop, MapReduce, Pig, Hive, HBase, Oozie, Mahout, Flume, ZooKeeper, MongoDB, and Cassandra
  • Experience with production Big Data solutions and with client-driven, large-scale data lake implementation projects
  • Ability to work with data engineering groups to support the deployment of Hadoop and Spark jobs
  • Deep understanding of Hadoop and Spark cluster security, network connectivity, and I/O throughput, along with other factors that affect distributed system performance
  • Strong working knowledge of disaster recovery, incident management, and security best practices


Thanks for joining our Talent Network.

By joining our Talent Network, you have not officially applied to a position.

Please apply now to become a candidate for open positions, or continue to update your resume.