Posted: Apr 21, 2020
Imagine what you could do here. At Apple, new ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there’s no telling what you could accomplish.
We’re looking for a Big Data and Cloud Engineer who is hardworking and motivated to make an impact by creating a robust and scalable data platform. This is a hands-on engineering role in the Operations Business Analytics team, which provides a unique opportunity to create innovations and ground-breaking change using cloud and big data technologies. You will be able to incubate and experiment with newer technologies and platforms to demonstrate the value of data and enable data consumption at scale with agility.
For this role we seek strong engineering and communication skills, as well as a belief that data-driven processes lead to great products. You will need a passion for quality and an ability to understand complex systems.
• Highly technical and analytical with 10 or more years of data engineering, analytics systems development and deployment experience
• Strong verbal and written communication skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams.
• Ability to understand sophisticated business requirements and render them as prototype systems with quick turnaround time.
• Knowledge of foundational infrastructure requirements such as networking, storage, and hardware optimization, with hands-on experience in at least one cloud infrastructure (AWS, Azure, Google Cloud).
• Track record of implementing cloud-based services in a variety of businesses, from large enterprises to start-ups.
• Deep understanding of data transformations, their underlying technologies, and related concepts such as data cataloging and curation.
• Demonstrated industry leadership in the fields of Data Warehousing, Data Science and Data processing.
As a Big Data & Cloud Engineer, you will work on a small team to develop large-scale data pipelines and analytical solutions using big data technologies.
– Understand the current data landscape and develop architectural models that operate at large scale and high performance, and advise on how to run these models on on-premises or private cloud infrastructure.
– Experience in high level programming languages such as Java, Scala, or Python.
– Proficiency with SQL and databases such as Teradata, MySQL, and Postgres is required.
– Proficiency in data processing using technologies like Spark Streaming, Spark SQL, or Map/Reduce.
– Expertise in Hadoop related technologies such as HDFS, Azkaban, Oozie, Impala, Hive, and Pig.
– Expertise in developing big data pipelines using technologies like Kafka, Flume, or Storm.
– Experience with large-scale data warehousing, mining, or analytic systems.
– Work with analysts to capture requirements and translate them into data engineering tasks.
– Aptitude to independently learn new technologies.
– Extract best-practice knowledge, reference architectures, and patterns from these engagements for sharing with the worldwide architect community and engineering teams.
Education & Experience
Bachelor’s or Master’s degree in Computer Science or Mathematics preferred