1. Minimum of 1 year of hands-on experience with big data tools and frameworks.
2. Must have expertise in the following technologies: Hadoop, Spark, Kafka, and Hive/Pig.
3. Proficient in at least one of the following programming languages: Python or Scala.
4. Must have hands-on experience with Redshift/Snowflake.
5. Experience with AWS/Azure would be an added advantage.
6. Good communication skills (written and oral) and interpersonal skills.