At Nexmo, the Business Intelligence team works on everything related to data, from ETL and data warehouses all the way to the reporting platform where other teams build reports and get the answers they need from our data.
We're currently looking for an experienced data engineer to help us improve the data processing systems that handle hundreds of GBs of data every day.
What will be your role
As a Senior Data Engineer, you'll be a senior member of the Business Intelligence team, responsible for designing and building systems that process the ~500 GB of logs generated by the Nexmo API platform every day and load them into our data warehouse and the other systems that consume aggregated data.
As a Senior Data Engineer, you will:
* Design and implement data loading and aggregation frameworks and jobs able to handle hundreds of GBs of JSON files, using Python, Airflow and Snowflake
* Build best-practice ETLs that transform raw data into easy-to-use dimensional data for self-service reporting
* Improve our deployment and testing infrastructure within AWS, using tools like Jenkins, Puppet and Docker
* Work closely with the Product, Infrastructure and Core teams to make sure data needs are considered during product development and to guide data-related decisions
* Mentor junior team members
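To give a flavor of the kind of aggregation work involved, here's a minimal, hypothetical sketch of rolling up raw JSON API logs into per-day, per-endpoint counts. The field names and the pure-Python approach are illustrative only; in practice these jobs run as Airflow tasks and load into Snowflake:

```python
import json
from collections import defaultdict

def aggregate_api_logs(lines):
    """Aggregate newline-delimited JSON API logs into per-(date, endpoint)
    request counts -- the shape of rollup a dimensional model for
    self-service reporting might consume.

    The 'timestamp' and 'endpoint' field names are illustrative
    assumptions, not the actual Nexmo log schema.
    """
    counts = defaultdict(int)
    for line in lines:
        record = json.loads(line)
        # Truncate the ISO-8601 timestamp to its date portion (YYYY-MM-DD).
        day = record["timestamp"][:10]
        counts[(day, record["endpoint"])] += 1
    return dict(counts)

# Example: three log lines covering two endpoints on the same day.
sample = [
    '{"timestamp": "2019-03-01T10:00:00Z", "endpoint": "/sms"}',
    '{"timestamp": "2019-03-01T11:30:00Z", "endpoint": "/sms"}',
    '{"timestamp": "2019-03-01T12:15:00Z", "endpoint": "/verify"}',
]
print(aggregate_api_logs(sample))
```

At production scale the interesting problems are the ones this sketch leaves out: incremental loads, late-arriving data, and pushing aggregation down into the warehouse rather than doing it in Python.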
What we're looking for
We're looking to strengthen our team by adding an experienced ETL / data engineering professional with practical experience building performant data pipelines at scale, modeling data for self-serve use and working in a dynamic, startup-like environment.
Ideally, you'll have:
* 5+ years' working experience building data processing, ETL and DWH solutions
* Experience with software development using Python (preferred), Java or Scala
* Experience working with relational databases (MySQL or Postgres preferred) and DWH technologies (Snowflake and Redshift preferred)
* Experience working as part of a larger engineering team, within an agile framework and using version control systems
* Familiarity with cloud technologies and cloud infrastructure (AWS preferred); exposure to deployment and infrastructure automation would be highly beneficial
* Exposure to distributed data processing systems and tools like Kubernetes, Hadoop, Spark, Kafka or ELK is a large plus
Now part of Vonage, Nexmo is a global API platform for cloud communications that handles over 90 million requests every day. Customers like Airbnb, Viber, WhatsApp, Snapchat, and many others depend on our APIs and SDKs to connect with their customers all over the world.
Company built by engineers
* Nexmo is a tech company, built by engineers at the intersection of cloud and telecommunications, and this is reflected everywhere in our culture.
* Even though we've grown a lot in recent years, we've managed to maintain a startup-like working atmosphere, with very flat hierarchies and a very pragmatic approach to getting things done.
Interesting engineering challenge
* We're in the process of migrating our DWH from Redshift to Snowflake, with a lot of room for re-design and refactoring along the way.
* We appreciate initiative and ownership, so you'll be encouraged to design, implement and own parts of our data processing system, end to end.
Agile development using modern technologies
* We iterate quickly, use the Scrum framework, do frequent code reviews and have a very open and pragmatic work culture within the team.
* Our data platform is built in the cloud, using technologies like Airflow, Snowflake, Kafka, S3, GCDS, the ELK stack and Tableau.