Reaktor is a partner for forward-thinking companies and organizations. We reimagine businesses, and design and build tomorrow’s digital products. Our full range of consultancy and agency services includes expertise ranging from high-level business strategy to hands-on product design and development.
We pride ourselves on a culture that is open, respectful and human. Reaktorians are a friendly, curious bunch – in addition to being some of the sharpest minds around. Doing what others consider unrealistic is our bread and butter.
At Reaktor, we constantly challenge ourselves and our clients to achieve the best possible outcome. While working in Dubai, you will get to work on a variety of exciting projects and be a part of a unique community of passionate professionals. All while enjoying more than 3,500 hours of sunlight per year.
You are an analytical thinker and a quick learner. You have a business mindset and good communication skills, and you are a self-starter.
In this role, you will work with datasets from multiple online and offline businesses. Your responsibilities will include data ingestion, integration and cleansing, data warehousing, and data analysis.
You should have a background in Engineering, Business Intelligence, and Data Analysis. You’ll be responsible for designing, expanding and optimizing the data architecture and data flows for our selected customers. Ultimately, you will build, deploy and enhance analytical capabilities to help our clients make better decisions.
What you’ll be doing
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud “big data” technologies (mostly GCP, AWS, and Azure); see the sketch after this list.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and cloud regions.
- Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
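To give a concrete flavor of the ETL work described above, here is a minimal sketch of one extract-load-transform step using the google-cloud-bigquery Python client. The project, dataset, table, and file names are hypothetical and chosen purely for illustration; a real pipeline would add schema management, error handling, and orchestration.

```python
# Minimal ETL sketch: load a raw CSV extract into BigQuery, then run a
# SQL transform over it. All names here are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Extract + load: ingest a raw CSV export into a staging table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # let BigQuery infer the schema from the file
)
with open("orders_export.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file,
        "example-project.staging.orders_raw",
        job_config=job_config,
    )
load_job.result()  # block until the load job finishes

# Transform: aggregate the staging data into a reporting table with SQL.
client.query("""
    CREATE OR REPLACE TABLE `example-project.reporting.daily_revenue` AS
    SELECT DATE(order_ts) AS order_date, SUM(amount) AS revenue
    FROM `example-project.staging.orders_raw`
    GROUP BY order_date
""").result()
```

The same pattern carries over to the other stacks mentioned above, such as Redshift on AWS; mainly the client library and SQL dialect change.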
The essential tech
- Understanding of Cloud Data Architecture
- Professional knowledge of at least one general-purpose programming language (Java, C++, C#, Go)
- Professional knowledge of at least one scripting language (Python, Scala, etc.)
- Professional knowledge of at least one cloud architecture stack (e.g. GCP with BigQuery, or AWS with Redshift)
- Professional knowledge of at least one Big Data platform (Hadoop, Spark, etc.)
- Data manipulation with SQL
- Administration and monitoring of Big Data environments (Linux, Ambari, HDInsight, ADLS)
- Data Mart implementation in Hive & SQL (see the sketch after this list)
- Integration between Big Data and traditional BI environments (e.g. Sqoop)
- Understanding of SDLC, source control, and IT project management
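As an illustration of the Hive data mart item above, the sketch below materializes a small mart table in Hive through the PyHive client. The cluster host, database, and table names are hypothetical and for illustration only.

```python
# Sketch of a simple data-mart build in Hive via PyHive.
# Cluster host, database, and table names are hypothetical.
from pyhive import hive

conn = hive.Connection(
    host="hive.example.internal", port=10000, database="warehouse"
)
cursor = conn.cursor()

# Materialize a mart table summarizing raw orders per customer.
cursor.execute("""
    CREATE TABLE mart_customer_orders
    STORED AS ORC AS
    SELECT customer_id,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_spend
    FROM raw_orders
    GROUP BY customer_id
""")
conn.close()
```

On the integration side, a tool such as Sqoop could then export a table like this into the relational database that traditional BI tooling reads.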