Senior Engineer - Big Data (ID: 11377)
100,000 INR ~ 150,000 INR
Bangalore
Roles and Responsibilities
• Serve as a lead on the Operations team, supporting and operating key aspects of infrastructure
services, including security, capacity planning, availability, and performance.
• Coordinate shift rotations with teams in different geographical areas.
• Create procedures/runbooks for the operational and security aspects of the platform.
• Improve infrastructure by developing and enhancing automation tools.
Provide advanced business and engineering support services to end users:
• Gather business details
• Lead other admins and platform engineers through design and implementation decisions to achieve
balance between strategic design and tactical needs
• Research and deploy new tools and frameworks to build a sustainable big data platform
• Assist with creating programs for training and onboarding for new end users
• Lead Agile/Kanban workflows and team process work
• Troubleshoot and resolve platform issues
• Provide status updates to Operations product owner and stakeholders
• Track all details in the issue tracking system (JIRA)
• Provide issue review and triage problems for new service/support requests
• Use DevOps automation tools, including Jenkins build jobs
• Fulfill ad-hoc data or report requests from different functional groups.
• On call for 2nd or 3rd shifts
Senior Engineer – (6 to 10 years of experience)
• Working experience with and a good understanding of the AWS or Azure environment; advanced
experience with IAM policy and role management.
• Security: Experience implementing role-based security, including AD integration, security policies,
and auditing in a Linux/Hadoop/AWS environment. Familiar with penetration testing and scanning
tools for remediation of security vulnerabilities.
• Infrastructure Operations: 5+ years supporting systems infrastructure operations, upgrades,
deployments, and monitoring
• Programming: 3+ years of experience in Python, Scala, or Java
• Hadoop: Experience with Hadoop (Hive, Spark, Sqoop) and / or AWS EMR
• DevOps: Experience with DevOps automation, including orchestration/configuration management and CI/CD
• Version Control: Working experience with one or more version control platforms such as GitHub
• ETL: Experience with a job scheduler such as Oozie or Airflow (Airflow preferred)
• Data Science tools (nice to have): R, RStudio, TensorFlow
• Monitoring: Hands-on experience with monitoring tools such as AWS CloudWatch and Datadog
• Networking: Working knowledge of TCP/IP networking, SMTP, HTTP, load balancers (ELB), and high availability
• Demonstrated successful experience learning new technologies quickly
Working hours: 09:00 ~ 18:00