Lookout is a cybersecurity company that makes it possible for individuals and enterprises to be both mobile and secure. With 150 million mobile sensors fueling a dataset of virtually all the mobile code in the world, the Lookout Security Cloud can identify connections that would otherwise go unseen, predicting and stopping mobile attacks before they do harm. Headquartered in San Francisco, Lookout has offices in Amsterdam, Boston, London, Sydney, Tokyo, Toronto and Washington, D.C. To learn more, visit www.lookout.com.
Backed by a massive mobile security dataset and over 100 granted patents, the Lookout Security Cloud supports a wide range of Lookout products and services and forms the backbone of Lookout's competitive advantage in the mobile security market. As an engineer on Lookout's Data Engineering team, you will work on a platform that processes hundreds of millions of security events daily. You will support real-time analysis in the Lookout Security Cloud, as well as ad hoc analysis by researchers and analysts hunting for new mobile threats. Your datasets will directly support product features such as data dashboards and threat detection. We're looking for engineers who understand both data and infrastructure, and who are enthusiastic about working on both.
- Design, build & maintain reliable systems for storing, transforming, and analyzing large amounts of data using Apache Spark, Apache Airflow, and related technologies
- Manage infrastructure for batch and streaming data processing and data storage, including maintaining CI/CD tooling, system patching and upgrades, monitoring, and performance analysis
- Work with analysts and engineers across the company to meet quality and timeliness goals for data products
- B.S. in Computer Science or equivalent practical experience
- 5+ years of experience designing, implementing, and maintaining complex software systems
- Programming experience in Python, Scala, and/or Java
- Experience with Big Data processing systems such as Spark, Flink, Kafka, Airflow, and/or Storm
- Cloud experience with AWS, GCP, or Azure
- Experience with agile software development, code reviews, git, and task management using tools such as Jira
- Experience with ETL workflow & scheduling systems such as Airflow and/or Luigi
- Knowledge of configuration management & deployment using tools such as Ansible, Chef, Puppet, or Spinnaker