Big Data Architect

Credit Suisse AG

Big Data Architect #105059 | Switzerland, Region Zurich, Zurich | Full-time | Technology [IT]

Job ID: 105059

We Offer

- A challenging and interesting position as a Big Data Architect within the Chief Compliance & Regulatory Office
domain
- The opportunity to be part of a highly motivated team that develops and deploys solutions to some of the
company's most challenging analytics and big data problems
- As a member of this team, you will work with clients and data spanning Credit Suisse's global organization,
applying emerging technologies to solve critical new challenges
- You will be responsible for large-scale distributed data analytics platforms and compute environments (Spark,
MapReduce)
- You will use distributed file systems and storage technologies
- You will have a visible platform to drive the architecture and design decisions needed to achieve cost-effective,
high-performance results
- The opportunity to provide technical guidance to a team of Big Data engineers who are building the platform and
innovating in core areas such as real-time analytics and large-scale data processing
- Agile development in a global team (Zurich, New York, London, Warsaw)

You Offer

- A Bachelor's or advanced degree in Computer Science, or an equivalent education
- Prior hands-on solution architecture experience on large projects, with the ability to translate requirements into
technical designs, to work closely with data owners, the business, and the Big Data project team to define platform
requirements, and to present architecture concepts to senior management
- Excellent people management abilities and strong communication skills, allowing you to explain technical
solutions to both technical and non-technical teams and to bridge IT and business partners
- Hands-on coding experience
- Experience with one or more major Hadoop distributions (Cloudera, HDP) and various ecosystem components (e.g. HDFS,
YARN, Sentry/Ranger, Hive, Impala, Kudu, Sqoop, Palantir)
- Strong knowledge of security- and availability-related topics for big data platforms
- Experience with data concepts (ETL, near-/real-time streaming, data structures, metadata and workflow management)
would be a strong asset
- Good programming and scripting skills (Python, Java, C/C++, Scala, Bash, Korn shell)
- Experience with DevOps tools (Chef, Docker, Puppet, Bamboo, Jenkins) is required; experience with Apache NiFi, Flume,
Kafka, Flink, Storm, Akka, Oozie, Tableau, Knox, and RecordService is beneficial
- Ability to work well as part of global and cross-cultural teams
- Experience with and an understanding of banking domain concepts is a big advantage
- Excellent verbal and written English language skills

Ms. T.Goop (HLOF42) would be happy to receive your application. Please apply via our career portal.