Job Description

Are you an experienced, passionate pioneer in technology – a solutions builder, a roll-up-your-sleeves technologist who wants a daily collaborative, think-tank environment where you can share new ideas with colleagues – without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center – we are breaking the mold of a typical Delivery Center.

Our US Delivery Centers have been growing since 2014, with significant, continued growth on the horizon. Interested? Read more about our opportunity below.

Work You’ll Do/Responsibilities

This Data Engineer will:

  • Design, develop, and implement end-to-end data pipelines, utilizing ETL processes and technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks based on mission needs and objectives.
  • Work closely with cross-functional teams to understand data requirements and design optimal data models and architectures.
  • Optimize data pipelines to ensure scalability, reliability, and high-performance processing, and provide innovative solutions to troubleshoot complex data-related problems involving large datasets.

The Team

Artificial Intelligence & Data Engineering:

In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment.

The Artificial Intelligence & Data Engineering team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

Artificial Intelligence & Data Engineering will work with our clients to:

  • Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
  • Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions.
  • Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements.

Qualifications

Required

  • Active Secret Clearance required
  • Bachelor’s degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline, or equivalent experience
  • 2+ years of data engineering experience
  • 2+ years of experience with ETL processes and technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks
  • 2+ years of experience in government consulting
  • Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.
  • Must live within a commutable distance (approximately a 100-mile radius) of one of the following Delivery locations: Atlanta, GA; Charlotte, NC; Dallas, TX; Gilbert, AZ; Houston, TX; Lake Mary, FL; Mechanicsburg, PA; or Philadelphia, PA, with the ability to commute to the assigned location for the day, without the need for overnight accommodations
  • Expectation to co-locate in your designated Delivery location up to 30% of the time, based on business needs; this may include a maximum of 10% overnight client/project travel.

Preferred

  • AWS, Azure and/or Google Cloud Platform Certification.
  • Experience working with agile development methodologies such as Scrum, including sprint-based delivery.
  • Experience with government consulting/clients
  • Strong problem-solving skills and ability to solve complex data-related issues
  • Data governance experience

Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html