AWS Big Data Architect at DXC Technology India

Job Description

Full-time

Responsibilities:
• Architect the organization of directories, data schemas, and layers; devise naming conventions; and support routine activities on the Big Data platform to ensure that business solution objectives are met.
• Set up EC2, EMR, and Hadoop clusters; install, configure, and monitor large Hadoop clusters and big data cloud platforms.
• Enforce role-based authorization for data and metadata stored on a Hadoop cluster.
• Maintain data integrity and access control while using the AWS
application platform.
• Review and recommend best practices for migrating Hadoop code and other components from lower environments to higher environments (Production).
• Identify areas that fall outside established guidelines.
• Responsible for managing and supporting implementations, updates, patches, health checks, and administration of the Big Data Analytics platform.
• Ensure that network connectivity is always up and running.
• Work with development teams… around change, release, problem, and incident management.
• Plan for capacity upgrades or downsizing as and when the need arises. Manage disaster recovery processes.
• Responsible for handling Hadoop platform errors raised by stakeholders.
• Ability to quickly identify issues and debug Hadoop logs. Solid research skills with an emphasis on finding and using information quickly.
• Participate in mentoring, supporting, and guiding team members to help grow the skills and capabilities of the AWS practice.

Qualifications:
• Bachelor's degree from a reputed institution/university.
• 5 to 7 years of experience with Hadoop and Big Data technologies such as MapReduce, HDFS, S3, Sqoop, Hive, HBase, Impala, Oozie, Spark Core, etc.
• 4+ years of experience working on data integration projects on Hadoop.
• Experience with AWS sufficient to provide counsel on the capabilities and limitations of an architecture.
• Ability to architect streamlined data pipelines and deliver solutions leveraging various AWS platform services.
• Proficiency in security implementation best practices for Network Security Groups, Amazon EC2 Security Groups, AWS Key Management Service, etc.
• Experience working in a Scrum environment.
• Business requirement analysis: work with the local business unit (BU) to analyze requirements and help shape data pipelines on the Big Data platform.
• 4+ years of experience administering UNIX systems, with proficiency in shell scripting.
• 2+ years of strong technical experience architecting, developing, and deploying AWS-based solutions.
• Excellent knowledge of Hadoop configuration files (core-site.xml, hdfs-site.xml, yarn-site.xml, and mapred-site.xml).
• Thorough understanding of mainframe file formats, relational databases (SQL Server/Oracle/MySQL), SQL scripting, SQL performance optimization, etc.
• Experience with ETL, data ingestion, and migrating on-premises applications to cloud environments, as well as building data lakes.
• Knowledge of Hadoop distributed architecture and HDFS.
• Hands-on experience with Hadoop distribution platforms such as Cloudera, Hortonworks, MapR, etc.
• Familiarity with message brokers such as Kafka, and NoSQL databases such as HBase, Cassandra, and MongoDB.
• Demonstrated ability to learn new technologies quickly.

Notice period: Less than 30 days