1. Architect, design, and develop analytical models using Big Data technologies such as Hadoop, Spark/Scala, Hive, Python, and Kafka.
2. Design robust code with attention to performance, reuse, supportability, and proper controls, consistent with best practices and accompanied by appropriate documentation.
3. Hands-on experience with the ETL tool Ab Initio and UNIX shell scripting.
Qualifications: Minimum four-year college degree.
1. Hands-on experience as a big data developer with a proven record of multiple successful deliveries.
2. Relevant experience with Hadoop, Spark/Scala, Python, and Kafka.
3. Relevant hands-on experience with UNIX and Java.