Company Confidential

We are looking for exceptional Big Data Developers to join our fast-growing Information Technology team. We are a dynamic and passionate team looking for highly talented individuals. We design, develop, and support complex algorithms that optimize our high-performance clusters. The various data assets collected and housed in our platform provide data on consumers across a multitude of dimensions, such as raw data, sub-segment-level data, and time-series data. The current MPP platform, written in C++, handles billions of records and terabytes of data for each customer.

The ideal candidate is a creative problem solver, resourceful in getting things done, and productive working independently or collaboratively. They can break large, complex problems into manageable pieces, have a strong grasp of object-oriented concepts, can dig through large volumes of data, and will apply a component-oriented microservices architecture as the position level demands.


How You’ll Contribute:

  • Convert specifications into detailed instructions and logical coding steps
  • Build, test, and deploy program code on different servers and hardware
  • Develop algorithms, perform data analysis and data validation
  • Provide user support and problem solving: research results and analyze logs to find root causes and correct problems in the identified programs
  • Enjoy being challenged and solving complex data problems on a daily basis
  • Document the programs developed, including their logic, coding, and corrections

What You’ll Bring:

  • 4+ years of experience with data ingestion, analysis, integration, and design
  • 3+ years utilizing relational concepts, RDBMS platforms, and data design techniques
  • 3+ years as an ETL developer using an enterprise tool such as Ab Initio, SSIS, or Informatica
  • Ability to handle large volumes and variations of data; analyze, investigate, and provide insight
  • Strong programming skills with 2+ years of experience in software development
  • Experience developing on Linux
  • Sound knowledge of SQL
  • Solid computer science fundamentals (algorithms, data structures and programming skills)
  • Scripting experience (e.g., Perl, Python, Shell, Ruby)
  • Strong analytical skills
  • Degree in Computer Science (or equivalent) preferred, or comparable years of experience

Nice to Have:

  • Experience providing stable and reliable big data solutions
  • Experience with Ab Initio
  • Experience with distributed data processing (e.g., HPCC, Hadoop)
  • Familiarity with test driven development, continuous integration and release management

 

To apply for this job, email your details to Info@princetonstaffingsolutions.com