
Part-time Engineer

Braintrust

Dubai, United Arab Emirates
Posted on Aug 1, 2024
Job Description

  • The client will not consider talent who apply outside of Braintrust.

This role will be 10-20 hours per week

While the starting hourly rate is lower, this role offers an opportunity for rapid growth: as you excel, your pay and equity will increase, allowing you to help shape the company's success and your own career path. We're excited about the potential for you to thrive with us!

Responsibilities:

  • Design, build, and maintain robust and efficient data pipelines to process and analyze large volumes of structured and unstructured data.
  • Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
  • Develop data integration solutions to ingest data from various sources, including databases, APIs, and external systems.
  • Optimize data workflows for performance, scalability, and reliability, ensuring high availability and data quality.
  • Implement data transformation and cleansing processes to prepare data for analytics and reporting purposes.
  • Conduct thorough testing and debugging of data pipelines to identify and resolve issues promptly.
  • Continuously monitor and optimize data infrastructure to improve efficiency and resource utilization.
  • Stay updated on emerging technologies and industry trends in data engineering and recommend best practices for adoption.
  • Document system architecture, design specifications, and operational procedures to facilitate knowledge sharing and maintainability.

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Data Engineer, Python Engineer, or in a similar role, with a strong emphasis on Python programming.
  • Solid understanding of data structures, algorithms, and software design principles.
  • Proficiency in building and optimizing complex SQL queries for data manipulation and analysis.
  • Hands-on experience with big data technologies such as Hadoop, Spark, or Kafka.
  • Familiarity with distributed computing frameworks and containerization technologies (e.g., Docker, Kubernetes).
  • Experience with cloud platforms like AWS, Azure, or Google Cloud for data storage and processing.
  • Strong analytical and problem-solving skills with a keen attention to detail.
  • Excellent communication and interpersonal skills, with the ability to collaborate effectively in a team environment.