We use a highly curated database to find the candidate with the skills best suited to your business's needs. Browse the talent below, find the perfect fit, and our qualified customer service agents will take care of the rest.
Seven years of overall experience in Information Technology, including Core Java, the Hadoop Big Data ecosystem, Tableau and Siebel CRM. Fast learner with 2 years of Hadoop Big Data experience, leading multiple projects from requirement gathering to successful implementation.
Over 13 years of experience in Big Data, Data Warehousing technologies, Big Data Testing and Software Testing; seeking a Senior Data Engineer/Big Data Lead position.
* Proficient IT experience in analysis, design, development, testing and implementation of business application systems for the Process, Insurance, Finance and Telecom domains
* Experience in Big Data architecture, business and data analysis, data migration, data integration, data governance, data visualization and reporting services
* Gathering functional and technical business requirements that best suit the needs of the architectural development process
* Experience deploying Hadoop clusters in public and private cloud environments such as Cloudera and Amazon Web Services (AWS)
* Development experience with Spark processing using RDDs, DataFrames and Spark SQL
* Hands-on experience with Cloudera Manager, HDFS, YARN, Hive, Hue, HBase, Oozie and ZooKeeper
* Proficient in Python and Scala
* Experience working with both structured and unstructured data
* Experience setting up Sqoop and Kafka to load source data into HDFS
* Well versed in designing ETL (Informatica/Spark) mappings and running ETL processes with Big Data tools, loading data from legacy and disparate sources into a data lake or enterprise data warehouse
* Extensive experience in data warehousing techniques, concepts and architecture, producing solutions that enable stakeholders to make informed decisions and satisfy business objectives
* Led globally distributed teams (UK, India and USA) in an onsite/offshore model
* Well versed with data visualization tools such as Tableau
* Expertise in all phases of the SDLC and STLC, with end-to-end lifecycle involvement
* Familiar with software engineering processes and life-cycle models such as Agile (Scrum and TDD) and Waterfall
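The ETL mapping work described above follows the classic extract-transform-load shape: pull records from a legacy source, clean and normalise them, and load them into a warehouse table. A minimal sketch of that shape, using only Python's standard library (csv and sqlite3 as stand-ins for the legacy source and warehouse, not Informatica or Spark) and entirely hypothetical field names:

```python
import csv
import io
import sqlite3

# Hypothetical legacy extract: raw CSV with a blank name and lowercase codes.
legacy_csv = """customer_id,name,country
1,Alice,uk
2,Bob,usa
3,,india
"""

def extract(raw):
    """Extract: parse the legacy CSV feed into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: drop incomplete rows and normalise country codes."""
    return [
        {"customer_id": int(r["customer_id"]),
         "name": r["name"],
         "country": r["country"].upper()}
        for r in rows if r["name"]
    ]

def load(rows, conn):
    """Load: write the cleaned rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS dim_customer "
                 "(customer_id INTEGER PRIMARY KEY, name TEXT, country TEXT)")
    conn.executemany(
        "INSERT INTO dim_customer VALUES (:customer_id, :name, :country)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # in-memory stand-in for the warehouse
load(transform(extract(legacy_csv)), conn)
print(conn.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0])  # → 2
```

In a real Spark or Informatica mapping each stage would be a distributed transformation rather than a Python function, but the extract → transform → load structure is the same.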
* Familiar with CI/CD (Continuous Integration and Delivery) development practices
* Expertise in data analysis, with strong analytical and problem-solving skills
* Strong experience with the defect life cycle; expertise in test management, defect management and project management
* Experience working on CRM and client-facing applications
* Quick learner and excellent team player, able to meet tight deadlines and work under pressure
* Strong leadership qualities (Team Lead), able to competently drive project activities forward independently
* Excellent communication and interpersonal skills, with the ability to quickly adapt to technical challenges
With a strong background in Business Analysis, IT test-driven development, Risk Management and Data Modelling across multiple asset classes, and with good interpersonal skills, I am seeking challenging opportunities in IT Business Analysis and Development within the Banking and Finance sector.
I recently relocated from Atlanta, GA to Edinburgh, Scotland, and am looking for a contracting or full-time position in big data.
My experience has developed around the following disciplines:
* Last 4 years: Big Data, data warehousing and data management, specifically around monetization of big data assets
* Before that: platform modernization (the installation of new integrated platforms to increase revenue or reduce cost)
* Managing transformational efforts to align technology and reengineer the business
* Building effective teams and streamlining processes
Design and implement systems that leverage Big Data ecosystem tools such as Hadoop (Cloudera/Hortonworks), Hive and Impala, together with messaging systems like RabbitMQ and Kafka and stream processing with Storm, to build data pipelines using CI tools like Jenkins/Bamboo.
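A pipeline built on a messaging system like those named above reduces, at its core, to a producer publishing events onto a topic and a consumer transforming them on the way to a sink. A rough illustration of that producer/consumer shape, with Python's standard-library queue standing in for a Kafka or RabbitMQ broker and all names hypothetical:

```python
import queue
import threading

broker = queue.Queue()  # stand-in for a Kafka/RabbitMQ topic
SENTINEL = object()     # marks the end of the stream
results = []            # stand-in for the downstream sink

def producer(events):
    """Publish raw events onto the topic."""
    for e in events:
        broker.put(e)
    broker.put(SENTINEL)

def consumer():
    """Consume events, apply a toy transformation, and write to the sink."""
    while True:
        msg = broker.get()
        if msg is SENTINEL:
            break
        results.append(msg.strip().lower())  # normalise each event

t = threading.Thread(target=consumer)
t.start()
producer(["  Click ", "VIEW", " Purchase"])
t.join()
print(results)  # → ['click', 'view', 'purchase']
```

In production the queue would be a durable, partitioned topic and the consumer a Storm topology or Spark job, but decoupling producer from consumer through a broker is the same design choice at any scale.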
A developer who writes code for applications using quantities of data too big for a normal system to support. Go big or go home!
What does a big data developer do?
They support the migration of data from a conventionally sized system to a Big Data system. Think of them as moving smaller blocks of data from a messy cupboard into a huge storage unit, making everything more organised and faster.
What should a big data developer know?
The most important skills for a Big Data Developer are Apache Hadoop, NoSQL, data mining, machine learning, programming languages such as Scala and Python, and data visualisation. Always read the full job specification to ensure you have the skills and knowledge required.
Big data developer vs data scientist
Big Data Developers work with masses of varied data to give businesses more scope and reach across all areas, whereas Data Scientists extract specific insights from within Big Data to fine-tune information for a particular audience.