
Spark Scala Architect/SME

Job Description

Mandatory Skills

You need to have the following skills.

· At least 12 years of IT experience with a deep understanding of the components around Spark data integration (PySpark, scripting, variable setting, etc.), Spark SQL, and Spark explain plans.

· Spark SME – Be able to analyse Spark code failures through Spark plans and make corrective recommendations.

· Be able to walk through and explain architectures you have been a part of, and why any particular tool/technology was used.

· Spark SME – Be able to review PySpark and Spark SQL jobs and make performance improvement recommendations.

· Spark SME – Be able to understand DataFrames / Resilient Distributed Datasets (RDDs), diagnose memory-related problems, and make corrective recommendations.

· Monitoring – Monitor Spark jobs using wider tools such as Grafana to identify cluster-level failures.

· Cloudera (CDP) Spark – Understand how the runtime libraries are used by PySpark code.

· Prophecy – High-level understanding of the low-code/no-code Prophecy setup and its use to generate PySpark code.

· Be ready to work at least three days a week from the Sheffield (UK) office and to accept changes per customer policies.


Good-to-have skills

Ideally, you should be familiar with:

· Collaboration with multiple customer stakeholders

· Knowledge of working with Cloud Databases

· Excellent communication and solution presentation skills.

Spark Scala Architect/SME

Yorkshire, UK
Full time

Published on 09/25/2024
