Data Scientist

Company Details

Solves the toughest problems in space, aeronautics, defense, and cyberspace.

At Northrop Grumman, our employees have incredible opportunities to work on revolutionary systems that impact people’s lives around the world today, and for generations to come. Our pioneering and inventive spirit has enabled us to be at the forefront of many technological advancements in our nation’s history – from the first flight across the Atlantic Ocean, to stealth bombers, to landing on the moon. We look for people who have bold new ideas, courage and a pioneering spirit to join forces to invent the future, and have fun along the way. Our culture thrives on intellectual curiosity, cognitive diversity and bringing your whole self to work — and we have an insatiable drive to do what others think is impossible. Our employees are not only part of history, they’re making history.

Northrop Grumman is looking for a Data Engineer or Data Scientist to work in the Oklahoma City, OK area. The selected candidate will be a key member of the greater Business Unit organization, driving performance across a broad range of strategic projects and activities. They will work closely with our various profit/loss and functional teams to develop, transform, and model data and data-based systems that provide insights, enable decision making, and improve performance.

Areas of responsibility will include:

  • Big data engineering and management
  • Data ingestion, analytics, data modeling, and data wrangling
  • Turning data into understanding with appealing, interactive, and responsive dashboards
  • Illuminating and explaining trends and anomalies
  • Tracking business objectives and performance
  • Optimizing for high performance
  • Communicating the potential impact of analyses and applicable data privacy policies, procedures, and regulations
  • Cloud automation engineering using an AWS or similar environment
  • Working on highly autonomous projects to engineer, analyze, integrate, test, debug, and monitor data processing workflows and end solutions to ensure accurate and efficient Extract, Transform, Load (ETL) of data (a minimal sketch follows this list)
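For context on the last item, here is a minimal sketch of an Extract, Transform, Load (ETL) flow in Python using only the standard library; the file name, column names, and SQLite target are hypothetical placeholders chosen for illustration, not details of any Northrop Grumman system.

```python
# Minimal ETL sketch. "parts.csv", the column names, and the SQLite
# target are hypothetical placeholders, not details from the posting.
import csv
import sqlite3

def extract(path):
    """Extract: read raw records from a CSV file as dictionaries."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: trim whitespace, coerce the numeric field, and skip
    malformed rows instead of failing the whole load."""
    for row in rows:
        try:
            yield (row["part_id"].strip(), float(row["unit_cost"]))
        except (KeyError, ValueError, AttributeError):
            continue

def load(records, db_path="parts.db"):
    """Load: write cleaned records into a SQLite table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS parts (part_id TEXT, unit_cost REAL)")
    con.executemany("INSERT INTO parts VALUES (?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("parts.csv")))
```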

This may require:

  • Writing, editing, and testing scripts, parsers, and programs to process, load, and analyze data
  • Constructing, optimizing, and analyzing databases and datasets to ensure proper handling of the data
  • Preparing, cleaning, integrating, and automating data from heterogeneous sources
  • Designing, developing, and maintaining data visualizations for a scalable analytics system
  • Performing end-to-end data quality assessments focused on analytical accuracy and data integrity (see the sketch after this list)
  • Coordinating and resolving data quality issues
  • Helping gather and optimize metrics and reporting status as required by the customer and management
  • Interfacing with customers electronically, by phone, and in person to ensure data requirements are met
  • Implementing data flows using developed and approved SOPs and patterns to ensure consistency and maintainability
  • Identifying areas of improvement and presenting innovative solutions to increase the efficiency and quality of data flows
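Similarly, a minimal sketch of what an end-to-end data quality assessment can look like in Python; the field names, duplicate-key rule, and range rule below are illustrative assumptions, not the SOPs or datasets referenced in this posting.

```python
# Data quality assessment sketch: scan a CSV once and tally missing
# values, duplicate keys, and non-numeric or negative numeric values.
# All field names and rules here are hypothetical examples.
import csv
from collections import Counter

def assess_quality(path, key_field="part_id", numeric_field="unit_cost"):
    """Return a tally of data quality issues found in one pass."""
    issues = Counter()
    seen = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row.get(key_field) or "").strip()
            if not key:
                issues["missing_key"] += 1
            elif key in seen:
                issues["duplicate_key"] += 1
            seen.add(key)
            value = (row.get(numeric_field) or "").strip()
            if not value:
                issues["missing_value"] += 1
            else:
                try:
                    if float(value) < 0:
                        issues["negative_value"] += 1
                except ValueError:
                    issues["non_numeric_value"] += 1
    return dict(issues)

if __name__ == "__main__":
    # Prints a tally such as {'missing_value': 3, 'duplicate_key': 1}.
    print(assess_quality("parts.csv"))
```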

Basic qualifications:

  • Bachelor’s degree (or higher) in a Science, Technology, Engineering, or Mathematics (STEM) field with 2+ years of experience
  • Understanding of at least one programming language
  • Experience with data visualization tools, preferably Tableau or Power BI
  • Familiarity with AWS cloud environments
  • Experience with data preparation/cleansing
  • Ability to obtain and maintain a Secret clearance per business requirements; US citizenship is required

Preferred qualifications:

  • Proficiency with open source software packages used to perform data engineering in Python/R, C#, SSIS, Java, Flask/Django and/or Bootstrap
  • Familiarity with open source ETL frameworks such as NiFi or similar
  • Proficiency in HTML/CSS, JavaScript, Flask, and other Front-End Technologies
  • Proficiency with unstructured and structured data analysis techniques
  • Familiarity with open source streaming and batch techniques for big data processing
  • Scripting experience (e.g. bash, Perl, Python)
  • Past experience in roles such as data modeler, data architect, software developer, or DBA to achieve scale and project objectives
  • Experience with web analytics software (e.g. Matomo)
  • Experience with load testing software (e.g. Locust)
  • Proficiency with Linux (e.g. file/permission manipulation, directory navigation, executing scripts)
  • Experience developing web applications supported across multiple browsers (e.g. IE11, Edge, Chrome, Firefox)
  • JMP proficiency (SAS statistical tool)
  • MATLAB experience
  • Experience in responsive UI/UX design

