We use a highly curated database to find the candidate with the skills most suited to your business’s needs. Browse the talent below, find the perfect fit and our qualified customer service agents will take care of the rest.
I have 17 years' overall experience, of which 8 years are in data architecture, analysis and design in the telco and card-payment domains. In my last role at Visa, I worked as a big data analyst. My key skills include big data analysis, data modelling, data profiling and solution design for data warehouse and big data systems. As part of my job, I work closely with the enterprise architecture team to ensure logical models are aligned to enterprise models. I am currently looking for an exciting new contracting role in Big Data technologies.
Sr. Data Engineer (Hadoop/Spark/Kafka/Cassandra/Scala/Java)
9.5 years of product development experience in various domains such as banking, e-commerce, telecom expense management, API management on the AWS cloud and IoT. 5 years of rich experience in designing and developing a Big Data processing platform, including ETL applications using the Cloudera Hadoop Distribution (CDH), real-time data processing and analysis using Apache Spark Streaming, and IoT data analysis using Storm. Experience in building data processing and analytics platforms using various open-source technologies. Experience in the agile software development process and lifecycle. Expertise in Hadoop MapReduce, Hive, Oozie, ZooKeeper, Flume, Spark, Storm, Kafka, Cassandra and Elasticsearch. 5 years of experience in various AWS services such as EC2, S3, RDS, ELB, Kinesis etc.
Experience Summary: I have over 15 years of commercial software development experience as an Architect / Senior Software Developer in Java, J2EE and Big Data technologies, and I bring extensive, proven experience of working on technically challenging, world-class projects for telecom, banking, finance and HR clients in the UK, Singapore, Malaysia and India.
5+ years of experience in Big Data technologies using the Cloudera Distribution of Apache Hadoop: HDFS, MapReduce programming, Sqoop, HBase, Flume, ZooKeeper, Hive, Pig, Oozie, Impala, Spark etc.
Extensive hands-on working experience on 3 projects and 4 POCs in Big Data using Apache Hadoop and Spark.
Cloudera Certified Apache Hadoop Developer (CCD-410).
15+ years of MS SQL Server experience comprising database design, data engineering and SQL development. Previous projects include the development of a trading system with automatic settlement, performance tuning and refactoring of legacy code, and decommissioning an SSAS data warehouse and replacing it with data models on columnstore indexes. Gathered data from databases in a multi-tenant environment to help formulate new lines of business, and maintained operational scorecards to monitor internal performance. Familiar with SQL Server 2012, 2014 and most recently 2016. Currently busy with a personal Azure SQL project. Experienced with the SDLC, software development methodologies, product deployment and technological innovation. Track record of technical, managerial and executive roles. Excellent communication skills.
Designing and creating a new Big Data and Advanced Analytics platform. Bringing digital transformation to a FTSE 100 company. Utilising cloud technologies to provide a common digital platform for IoT and analytics solutions to the broader Babcock enterprise and the external markets of Defence, Energy & Critical National Infrastructure. Delivering a comprehensive market review of cloud providers and identifying the appropriate cloud provider for the organisation. Developing the business case for creating the new Microsoft Azure-based analytics platform. Designing a hybrid analytics platform hosted on Azure, both Cloud and Stack. Producing the Big Data options for Babcock and developing both analytical and Big Data POCs. Developing the IOC on Microsoft Azure, including tools such as IoT, Azure SQL, Data Lake, Data Factory, HDInsight, Machine Learning, Artificial Intelligence (AI), TensorFlow, Python, Power BI and APIs. Responsible for delivering the platform into production and onboarding internal and external customers.
A highly motivated and enthusiastic Microsoft Certified C# big data developer and architect with a wealth of technical and business skills acquired across a wide range of demanding roles. 12 years’ experience and a proven track record in delivering successful development projects.
A big data developer codes applications that use quantities of data too massive for a normal system to support. Go Big or Go Home!
What does a big data developer do?
They support the migration of data from a normal-sized system to a Big Data system. Think of them as moving smaller blocks of data from a messy cupboard into a huge storage unit, making everything more organised and faster.
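A minimal sketch of what such a migration job might look like, assuming a small SQLite source standing in for the "normal-sized system" and date-partitioned CSV files standing in for the Big Data store (the `orders` table and its columns are hypothetical, chosen only for illustration):

```python
import csv
import sqlite3
from pathlib import Path

def migrate(db_path: str, out_dir: str) -> int:
    """Copy rows from a relational 'orders' table into
    date-partitioned CSV files, the way an ingest job might
    lay data out for a Big Data store. Returns rows written."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT order_date, order_id, amount FROM orders"
    ).fetchall()
    conn.close()

    written = 0
    for order_date, order_id, amount in rows:
        # One directory per date -- a common partitioning scheme
        # that lets downstream queries skip irrelevant data.
        part = Path(out_dir) / f"order_date={order_date}"
        part.mkdir(parents=True, exist_ok=True)
        with open(part / "part-0000.csv", "a", newline="") as f:
            csv.writer(f).writerow([order_id, amount])
        written += 1
    return written
```

In a real migration the same shape appears at much larger scale: a framework such as Sqoop or Spark reads from the source database in parallel and writes partitioned files to distributed storage.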
What should a big data developer know?
The most important skills for a Big Data Developer are: Apache Hadoop, NoSQL, data mining, machine learning, programming languages such as Scala and Python, and data visualisation. Always read the full job specification to ensure you have the skills and knowledge required.
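As a taste of the MapReduce model that Apache Hadoop popularised, here is a minimal single-process sketch of the classic word count in plain Python; the real framework runs the same map, shuffle and reduce phases distributed across a cluster:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line: str):
    """Map: emit a (word, 1) pair for every word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    """Shuffle: group all emitted counts by key (the word)."""
    groups = defaultdict(list)
    for word, count in pairs:
        groups[word].append(count)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts collected for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

def word_count(lines):
    pairs = chain.from_iterable(map_phase(line) for line in lines)
    return reduce_phase(shuffle(pairs))
```

For example, `word_count(["big data is big", "data at scale"])` counts "big" and "data" twice each and the remaining words once.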
Big data developer vs data scientist
Big Data Developers work with masses of varied data to give businesses greater scope and reach across all areas, whereas Data Scientists extract specifics from within that Big Data and use them to fine-tune information so that it reaches and interests an audience.