Hadoop Consultant - Big Data and Analytics technology

Job Title: Hadoop Consultant - Big Data and Analytics technology
Contract Type: Permanent
Location: Paris, Île-de-France
Salary: €80,000 - €110,000 per annum + Company Benefits
Start Date: ASAP
REF: PI180915D
Contact Name: James Gribbon
Contact Email:
Job Published: over 6 years ago

Job Description

Opportunity to join one of the world's largest data and Big Data technology companies - their product portfolio includes Data Warehousing, Business Intelligence, Analytics, ETL and Big Data solutions. They have an urgent need to hire a Hadoop Consultant to work with a number of their major customers in France on significant Big Data projects.

JOB TITLE: Hadoop Consultant - Big Data and Analytics technology
LOCATION: Good access to the Paris office
PACKAGE: Base €80-110k + Bonus + Benefits


THE COMPANY:

* $2bn turnover organisation
* Globally recognised technology vendor in the Analytics, Big Data, Data Warehousing and Business Intelligence space
* Global office network
* Multiple award-winning company and technology
* Industry-leading leadership team


REQUIREMENTS:

* Hands-on experience in the design, development or support of Hadoop implementations at a leading technology vendor or end-user organisation
* 2+ years' experience implementing ETL/ELT processes with MapReduce / YARN, Pig and Hive
* Hands-on experience with HDFS and NoSQL databases such as HBase or Cassandra on large data sets
* Experience in programming languages such as Shell, C, C++, C#, Java, Python, Perl or R
* A keen interest in, and fair understanding of, Big Data technology and the business trends driving its adoption
* Analytical and problem-solving skills, particularly those that apply to a Big Data environment
* Strong understanding of data structures, modelling and Data Warehousing
* Team-oriented individual with excellent interpersonal, planning, coordination and problem-solving skills


RESPONSIBILITIES:

* Engage with account teams and prospective customers to analyse and understand customer requirements
* Shape and influence customer requirements so that they are deployed in an optimal Hadoop architecture
* Assist in qualifying requirements and provide guidance within the Big Data team to determine whether Hadoop is a good fit for the problem the customer is trying to solve
* Design, plan and execute on-site / off-site customer proof-of-concepts
* Configure and use Hadoop distribution tools and associated products, typically Hive, Pig, HCatalog, MapReduce / YARN, Kerberos, Knox, XA Secure and procedural programming languages
* Partner with Hadoop administrators to secure and configure Hadoop clusters, optimise performance and administer the Hadoop environment
* Following proof-of-concept execution, document and disseminate the results and lessons learned to all stakeholders


To apply for the above position, please contact James Gribbon or click the apply button.