
Lead Data Engineer


Pendulum is a venture-backed, seed-stage startup that empowers companies and governments to detect, deter, and counteract the impacts of harmful online narratives with machine-learning-enabled tools. We are seeking a Lead Data Engineer to serve as technical leader and subject matter expert, leading the development of scalable, robust data workflows for our data platform, which empowers our customers to define and track narratives and social media discussions relevant to their business.

We firmly believe that the best teams put diversity and inclusion first when hiring and operating. At Pendulum this is paramount. Our mission of fighting the impacts of harmful narratives, and the underlying analysis of narratives, claims, and communities in our society, demands diversity of thought at all levels of our team. As such, prioritizing diversity and inclusion is a first principle and a requirement for working at Pendulum.

Your Responsibilities

  • Partner with technical and non-technical co-workers across the company to elicit their business and data requirements
  • Architect, build, and launch large, efficient, reliable, scalable, and secure data pipelines in partnership with Product Engineering
  • Research new problem spaces, disambiguate them, define tasks, and lead all team members to high-impact delivery of projects
  • Design and develop data workflows for immediate and future use cases
  • Build and manage data workflows in the cloud using Apache Airflow
  • Optimize workflows and ETL/ELT jobs
  • Integrate existing data platforms with AWS services like Amazon S3, Amazon Redshift, Amazon EMR, AWS Batch, and Amazon SageMaker
  • Design and develop data resources to enable self-serve data consumption
  • Build tools for testing workflow DAGs and validating data
  • Define and share best practices on data modeling, workflow management, and ETL tuning
  • Maintain workflows and troubleshoot complex data, deployment, and security issues
  • Serve as a trusted technical leader and subject matter expert, providing technical guidance and consultation to team members and stakeholders across the company
  • Coach and mentor team members

Required Qualifications

  • A solid track record of driving complex projects to success by leading designs, justifying technical trade-offs, and solving hard challenges persistently
  • Experience with workflow management solutions like Airflow
  • Strong knowledge of AWS Cloud services
  • Experience in schema design and dimensional data modeling
  • Strong skills in SQL and Python coding
  • Strong skills in distributed system optimization (e.g. Spark, Presto, Hadoop, Hive)
  • Strong track record of building large-scale data sets, preferably at petabyte scale and beyond
  • Ability to perform basic statistical analysis to inform business decisions
  • Experience with Docker and with engineering development tools like Git
  • Excellent communication skills, both written and verbal

Preferred Qualifications

  • Knowledge of Terraform or AWS CloudFormation
  • Experience with AWS Lake Formation
  • Experience with notebook-based Data Science workflows
  • Familiarity with A/B testing and machine learning

Apply now

