£600 per day | Outside IR35 | Hybrid | London
Summary: The AI/Data Developer role is a contract position focused on developing advanced AI solutions for a government client. It demands expertise in AI and data development, particularly with Python and Apache Spark, along with relevant experience in data processing and machine learning. The role offers hybrid working and a rate of £500-£600 per day (Outside IR35). Active SC clearance is mandatory.
Salary (Rate): £600 daily
City: London
Country: United Kingdom
Working Arrangements: hybrid
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
AI/Data Developer - Contract - SC Cleared
- £500-£600 per day (Outside IR35)
- Hybrid working
Our client, a leading deep-tech organisation, is seeking an experienced AI/Data Developer for an urgent contract assignment.
Key Requirements:
- Proven background in AI and data development
- Strong proficiency in Python, including data-focused libraries such as Pandas, NumPy, and PySpark
- Hands-on experience with Apache Spark (PySpark preferred)
- Solid understanding of data management and processing pipelines
- Experience in algorithm development and graph data structures is advantageous
- Active SC Clearance is mandatory
Role Overview:
You will play a key role in developing and delivering advanced AI solutions for a Government client. Responsibilities include:
- Designing, building, and maintaining data processing pipelines using Apache Spark
- Implementing ETL/ELT workflows for large-scale data sets
- Developing and optimising Python-based data ingestion tools
- Collaborating on the design and deployment of machine learning models
- Ensuring data quality, integrity, and performance across distributed systems
- Contributing to data architecture and storage strategy design
- Working with cloud data platforms (AWS, Azure, or GCP) to deploy scalable solutions
- Monitoring, troubleshooting, and tuning Spark jobs for performance and cost efficiency
- Engaging regularly with customers and internal stakeholders
This is an excellent opportunity to join a high-profile organisation on a long-term contract, delivering cutting-edge work in the AI and data space.
People Source Consulting Ltd is acting as an Employment Agency in relation to this vacancy. People Source specialise in technology recruitment across niche markets including Information Technology, Digital TV, Digital Marketing, Project and Programme Management, SAP, Digital and Consumer Electronics, Air Traffic Management, Management Consultancy, Business Intelligence, Manufacturing, Telecoms, Public Sector, Healthcare, Finance and Oil & Gas.