Data Engineering Manager - Buenos Aires, Argentina - Yalo Inc.

Yalo Inc.
Verified company
Buenos Aires, Argentina

3 weeks ago

Posted by:

Sofía Rodríguez

beBee Recruiter


Description

Yalo:

Hi! This is Yalo. We are on a mission to bring conversational commerce to the world.


Remember how it used to be to interact with businesses that knew and understood you, that could recommend exactly what you needed, and that with a simple message could get you what you wanted? Yep... neither do we.

That is why at Yalo we are marrying the scale of digital commerce with the personalization and simplicity of conversations to help companies delight their users.

We know that traditional SaaS companies focus on first-world problems... we don't. Having started in Latin America, our roots are in emerging markets, and we care about bringing amazing experiences to a population that has traditionally been underserved, such as the small shop owner in Brazil who is ordering online for the first time.


If you're looking for a place to make things happen, learn fast, and impact emerging markets in a way that hasn't been done before, look no further.

Come join us in our mission of improving billions of lives through the power of conversational commerce.


Your mission


You will lead the team that designs and maintains the organization's Data & Analytics architecture, ensuring it aligns with business goals and requirements.


What are the responsibilities for this role?

  • Lead the team responsible for designing, building, and maintaining batch and real-time data pipelines in production. Guide the team towards the following:
    • Maintain and optimize the data infrastructure required for accurate extraction, transformation, and loading of data from a wide variety of data sources.
    • Build and maintain Kafka and Snowplow pipelines.
    • Develop ETL (extract, transform, load) processes to extract and manipulate data from multiple sources.
    • Help design and maintain a semantic layer.
    • Automate data workflows such as data ingestion, aggregation, and ETL processing.
    • Prepare raw data in data warehouses into consumable datasets for both technical and non-technical stakeholders.
  • Establish and enforce data governance policies and procedures to ensure data quality, integrity, and security:
    • Partner with data scientists and data analysts to deploy machine learning and data models in production.
    • Build, maintain, and deploy data products for analytics and data science teams on the GCP platform.
    • Ensure data accuracy, integrity, privacy, security, and compliance through quality control procedures.
    • Monitor data systems performance and implement optimization strategies.
    • Leverage data controls to maintain data privacy, security, compliance, and quality for allocated areas of ownership.
  • Collaboration: Work closely with cross-functional teams, product managers, and stakeholders to ensure the delivery of high-quality software.
  • Continuous Learning: Stay updated with the latest trends and technologies in data systems, ensuring that our systems remain state-of-the-art.

Job Requirements (Must have)

  • Bachelor's/Master's degree in Computer Science, Information Systems, or a related field.
  • Minimum 5 years of Data & Analytics Engineering experience, ideally in cloud environments, and proven leadership of data teams.
  • Experience in Data Governance topics.
  • Ability to analyze complex data requirements and design appropriate data solutions.
  • Excellent communication and interpersonal skills to effectively communicate with technical and non-technical stakeholders.
  • Strong problem-solving abilities to identify and resolve data architecture challenges and issues.
  • Demonstrated leadership skills to lead and mentor junior team members and drive data architecture initiatives forward.
  • Adaptability: Ability to adapt to evolving technologies, tools, and business requirements in the data architecture space.
  • Business Acumen: Understanding of business processes, objectives, and strategies to align data architecture efforts with business goals.
  • Working knowledge of Kafka pipelines or equivalent event-driven technologies.
  • Good understanding of microservices and APIs.
  • Strong experience in designing and building ETL models and data workflows (dbt & Great Expectations).
  • Working knowledge of designing and implementing a BI semantic layer.
  • Advanced SQL skills and experience with relational databases and database design.
  • Experience working with the BigQuery cloud data warehouse and other solutions such as Snowflake and Databricks.


  • Working knowledge of programming languages (e.g., Python).
  • Strong proficiency in data pipeline and workflow management tools (e.g., Airflow).
  • Strong project management and organizational skills.
  • Excellent problem-solving, communication, and organizational skills.
  • Proven ability to work independently and with a team.

Nice to have:


  • Expertise in open table formats like Hudi, Iceberg, Delta.
  • Expertise with Snowplow pipelines.
  • Expertise in databases like Druid, Pinot, and Elasticsearch.
