Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.
In this role, you will:
Be expected to define and contribute at a high level to many aspects of our collaborative Agile development process:
- Big Data development and automated testing of new and existing components in an Agile, DevOps and dynamic environment
- Working with data delivery teams to set up new Hadoop users, including creating Linux users, setting up Kerberos principals, and testing HDFS, Spark and MapReduce access for the new user
- Executing review, acceptance and tuning activities, and ensuring environment availability 24x7
- General operational excellence, including strong troubleshooting skills and an understanding of the system's capacity and bottlenecks

Must Haves:
- Strong problem-solving skills and adaptability in a complex environment
- Experience providing technical support for and designing Hadoop Big Data platforms (preferably Cloudera distributions: Hive, Beeline, Spark, HDFS, Kafka, YARN, ZooKeeper, etc.), handling and identifying possible failure scenarios (Incident Management), responding to end users of the Hadoop platform on data or application issues, and reporting and monitoring daily SLAs to identify vulnerabilities and opportunities for improvement
- Hands-on experience with large-scale Big Data environment builds, capacity planning, performance tuning and monitoring, including end-to-end Cloudera cluster installation
- Experience handling Hadoop security activities using Apache Ranger, Knox, TLS, Kerberos and encryption zone management
- Expertise in software installation and configuration, orchestration, and automation with tools such as Jenkins and Ansible
- Ability to improve the current estate by incorporating centralised S3 data storage (VAST) throughout the platform processing stack
- 5 years' experience engineering solutions in a Big Data on-premises or cloud environment
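As a rough illustration of the user-onboarding responsibility described above (Linux account, Kerberos principal, then HDFS/Spark/MapReduce access checks), the sequence of operator commands can be sketched as below. This is a hypothetical sketch only: the realm, keytab paths and example-jar locations are assumptions, not details from this posting or any specific cluster.

```python
# Illustrative sketch of onboarding a new Hadoop user, expressed as the
# shell commands an operator (or an Ansible play) might run in order.
# Realm, keytab paths and example-jar paths are assumed for illustration.

def onboarding_commands(user: str, realm: str = "EXAMPLE.COM") -> list[str]:
    """Return ordered provisioning and smoke-test commands for a new user."""
    keytab = f"/etc/security/keytabs/{user}.keytab"
    hdfs_home = f"/user/{user}"
    return [
        # 1. Linux account on the edge/gateway node
        f"useradd -m {user}",
        # 2. Kerberos principal and keytab
        f"kadmin -q 'addprinc -randkey {user}@{realm}'",
        f"kadmin -q 'ktadd -k {keytab} {user}@{realm}'",
        # 3. HDFS home directory owned by the new user
        f"hdfs dfs -mkdir -p {hdfs_home}",
        f"hdfs dfs -chown {user}:{user} {hdfs_home}",
        # 4. Smoke tests: authenticate, then exercise HDFS, Spark, MapReduce
        f"kinit -kt {keytab} {user}@{realm}",
        f"hdfs dfs -ls {hdfs_home}",
        "spark-submit --class org.apache.spark.examples.SparkPi "
        "$SPARK_HOME/examples/jars/spark-examples*.jar 10",
        "yarn jar $HADOOP_HOME/share/hadoop/mapreduce/"
        "hadoop-mapreduce-examples*.jar pi 2 10",
    ]

if __name__ == "__main__":
    for cmd in onboarding_commands("analyst1"):
        print(cmd)
```

In practice these steps would typically be codified in an automation tool such as Ansible (listed in the requirements) rather than run by hand, so each onboarding is repeatable and auditable.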