Sep 13, 2024

Observability Platform Engineer

Full Time | Information Technology

Job Description

Overview

The Observability Platform Engineer takes on the critical role of using an ETL (extract, transform, load) platform to manage and streamline log data flows. Their responsibilities encompass ensuring data integrity, security, and compliance. They collaborate closely with IT teams to design, implement, and troubleshoot ETL platform configurations tailored to the distinct data processing needs of financial services organizations. The Observability Platform Engineer also plays a pivotal role in the design, construction, and ongoing maintenance of the data pipelines and infrastructure that underpin data-driven decision-making throughout the organization. Your expertise in observability engineering technologies and best practices will be invaluable in ensuring data accuracy, availability, and security while supporting the organization's data-driven initiatives.

Responsibilities

Job Responsibilities/Accountabilities: 

  • Design and deploy ETL solutions to optimize log data ingestion, transformation, and routing. 
  • Create custom pipelines and configurations tailored to the specific requirements of financial applications and systems. 
  • Develop and maintain log data transformation rules and functions to extract relevant information and enrich log events (a brief illustrative sketch follows this list). 
  • Configure the ETL platform to intelligently route log data to appropriate destinations, such as SIEM (Security Information and Event Management) systems, data lakes, or real-time monitoring tools. 
  • Implement filtering mechanisms to reduce noise and focus on critical events. 
  • Aid in the development of log metrics, including both KPIs (key performance indicators) and KRIs (key risk indicators). 
  • Monitor and fine-tune ETL platform configurations to improve data processing efficiency and reduce latency. 
  • Monitor and optimize data pipelines for performance and scalability, identifying and resolving bottlenecks as needed. 
  • Implement security best practices to protect sensitive data within log streams. 
  • Ensure compliance with industry regulations, such as PCI DSS, GDPR, and SEC requirements, by managing and auditing log data appropriately. 
  • Create documentation and provide training for IT and operational teams on ETL platform best practices and usage. 
  • Collaborate with cross-functional teams, including IT, DevOps, and security, to integrate the ETL platform seamlessly into the financial services infrastructure. 
  • Aid in the development of a log bill-back model and proper reporting around that model (see the second sketch after this list). 
  • Work with vendors and support teams to resolve technical issues. 
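
To make the pipeline responsibilities above concrete, here is a minimal, illustrative Python sketch of the transform-filter-route pattern described in the bullets: parse a JSON event, drop noise, enrich it, mask a sensitive field, and tag a destination. The field names, severity routing table, and masking rule are assumptions made for illustration, not any particular ETL platform's API.

    import json
    from datetime import datetime, timezone

    # Hypothetical routing table: severity level -> destination.
    # Real routing would live in the ETL platform's configuration.
    SEVERITY_ROUTES = {
        "critical": "siem",
        "error": "siem",
        "warning": "data_lake",
    }

    def transform(raw_line):
        """Parse, filter, enrich, and route a single JSON log event.

        Returns the enriched event tagged with a destination, or None
        if the event is noise that should be dropped before it incurs
        downstream storage or licensing cost.
        """
        try:
            event = json.loads(raw_line)
        except json.JSONDecodeError:
            return None  # drop unparseable lines (or dead-letter them)

        destination = SEVERITY_ROUTES.get(event.get("severity", "info").lower())
        if destination is None:
            return None  # filter out low-value info/debug noise

        # Enrichment: add processing metadata, then mask a
        # hypothetical PCI-sensitive field before it leaves the pipeline.
        event["processed_at"] = datetime.now(timezone.utc).isoformat()
        if "card_number" in event:
            event["card_number"] = "****" + str(event["card_number"])[-4:]
        event["destination"] = destination
        return event

    if __name__ == "__main__":
        sample = '{"severity": "error", "msg": "auth failure", "card_number": "4111111111111111"}'
        print(transform(sample))

In practice, logic like this would be expressed in the ETL platform's own pipeline or configuration language rather than standalone Python; the sketch only shows the shape of the work.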
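The bill-back responsibility above likewise reduces to metering ingest volume per cost center. A second minimal sketch, assuming events carry a hypothetical "team" ownership tag and using a made-up per-gigabyte rate:

    from collections import defaultdict

    RATE_PER_GB = 0.25  # hypothetical bill-back rate, USD per ingested GB

    def bill_back(events):
        """Aggregate ingested bytes per owning team and price them."""
        bytes_by_team = defaultdict(int)
        for event in events:
            team = event.get("team", "unallocated")  # hypothetical ownership tag
            bytes_by_team[team] += event.get("raw_size_bytes", 0)
        return {
            team: round(size / 1e9 * RATE_PER_GB, 2)  # cost in USD
            for team, size in bytes_by_team.items()
        }

    # Example: two teams with different ingest volumes.
    print(bill_back([
        {"team": "payments", "raw_size_bytes": 40_000_000_000},
        {"team": "fraud", "raw_size_bytes": 8_000_000_000},
    ]))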

This job might be for you if: 

  • You want to build advanced data pipelines that feed and improve industry-leading security tools.  
  • You want to transform log data and minimize the overall log footprint for security tooling.  
  • You want to work closely with a threat detection and automation team to secure a Fortune 100 financial services organization through effective data pipelines. 
  • You love taking on difficult challenges and finding creative solutions. You do not get flustered easily. If you do not know the answer, you will dig until you find it. 
  • You have an innate desire to consistently deliver high-quality products. 
  • You are motivated and driven. You volunteer for new challenges without waiting to be asked. You’re going to take ownership of the time you spend with us and truly make a difference. 

Qualifications

To land this role: 

You have a history of self-motivation and of solving problems with minimal oversight. The ability to learn quickly and retain information is key to success in this role. You have strong experience working in competitive, fast-paced, highly technical environments, ideally in the financial services industry. 

 

Job Requirements: 

  • Proven experience in designing, implementing, and managing ETL solutions, or with big data technologies (e.g., Hadoop, Spark) and data lake architectures. 
  • Strong understanding of log data processing, log management, and log analytics. 
  • Proficiency in scripting and programming languages (e.g., Python, JavaScript) for custom log data transformations. 
  • Knowledge of security industry standards (NIST, PCI, ISO), best practices (ITIL), regulations (SOX, FINRA), news, and other relevant cybersecurity information. 
  • Ability to perform routine duties with minimal supervision, solve practical problems, and deal with a variety of situations where only limited standardization exists. 
  • Ability to communicate technical issues and concepts, both verbally and in writing, to audiences of varying technical backgrounds.  
  • Ability to help gather business requirements and align them to security solutions. 
  • Willingness to work a minimum of three days per week in the office. 
  • Willingness to work on US or London time zone hours. 
  • Expert organizational skills are a must, as is the ability to work successfully within a team.  
  • 1-3+ years of overall technology experience; a diverse background spanning security, networking, compute, storage, and endpoint is preferred. 
  • 1-3+ years of experience as a data engineer or log engineer. 

 

Education/Certification requirements (at least one of, or working toward, the following): 

  • Associate or Bachelor’s degree in Information Security, Information Assurance, Information Systems, or equivalent experience. 
  • SANS-related certifications (GSEC, GCIA, GCED, GCIH, etc.). 
  • Other field appropriate certifications may be considered. 

 

Physical requirements/Working conditions: 

  • In-office three days per week. 
  • Climate controlled office environment. 
  • Minimal physical requirements other than occasional light lifting of boxed materials. 
  • Dynamic, time-sensitive environment.