Big Data Architect

Location: US-OH-Milford
Job ID: 2017-7086
Category: Information Technology

Overview

Our client, one of the top 5 IT services and consulting companies worldwide, is looking for a Big Data Architect for a full-time position in Milford, OH. The detailed requirements are below.

 

Responsibilities

  • Design and implement Big Data solutions, including a leadership role in designing shared/reusable components.
  • Hands-on experience architecting and implementing Hadoop applications, with complete, detailed design of the Hadoop solution covering data ingestion, data storage/management, and data transformation.
  • Hands-on experience with the Hadoop stack - MapReduce, Sqoop, Pig, Hive, Flume, Spark, Kafka
  • Hands-on experience with related/complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef, Scala).
  • Hands-on experience with web application frameworks and technologies such as Spring, Ajax, AngularJS, and O-R mapping
  • Experience with the Hadoop security model, covering authentication, service-level authorization, authentication for web consoles, and data confidentiality.
  • Experience in designing solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture.
  • Using Big Data technology and the customer’s business requirements, design and document a comprehensive technical architecture.
  • Analysis and documentation of source system data from traditional sources (RDBMS) and new sources (web, machine-to-machine, geospatial, etc.)
  • Plan and execute a technology proof-of-concept (POC) using Big Data technology
  • Experience in handling structured and unstructured data using Big Data technologies
  • Hands-on technical competencies:
    • Java/J2EE, Linux, PHP, Perl, Python, C, C++, Scala
    • Hadoop, Hive, HBase, Pig, MapReduce, Spark, Kafka, and other Hadoop ecosystem components
    • NoSQL databases – Cassandra, MongoDB, MariaDB, Couchbase
    • Data warehouse, BI, and ETL tools
    • Detailed knowledge of RDBMS data modeling and SQL
    • AWS, Azure

 

Qualifications

  • Bachelor's degree in Computer Science or a related field, with a minimum of 12 years of solid IT consulting experience in data warehousing, operational data stores, and large-scale implementations.
  • Minimum 5 years of experience in Core Java, Python, or Scala.
  • Minimum 4 years of hands-on experience with Hadoop technologies.
  • Excellent communication skills, including the ability to communicate effectively with internal and external customers.
  • Ability to apply strong industry knowledge to understand customer needs and resolve customer concerns; high level of focus and attention to detail.
  • Strong work ethic and good time management, with the ability to work with diverse teams and lead meetings.

Praveen Kumar

Lorven Technologies Inc.

101 Morgan Lane, Suite 209, Plainsboro, NJ 08536

Work: 609-799-4202 Ext. 239 | Fax: 609-799-4204
