English: Conversational (an intermediate level can be negotiated)
Technical Requirements:
Big Data development experience.
Demonstrated up-to-date expertise in Data Engineering and complex data pipeline development.
Experience with agile methodologies.
Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built.
Experience with Java and Python for writing data pipelines and data processing layers.
Experience with Airflow and GitHub.
Experience writing MapReduce jobs.
Demonstrated expertise in writing complex, highly optimized queries across large data sets.
Proven, working expertise with Big Data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase.
Highly proficient in SQL.
Experience with cloud technologies (GCP, Azure).
Experience with relational and in-memory data stores desirable (Oracle, Cassandra, Druid).
Provide and support the implementation and operation of data pipelines and analytical solutions.
Experience tuning the performance of systems that work with large data sets.
Experience with REST API data services (data consumption).
We offer:
Gross monthly salary between $70,000 and $75,000
Statutory benefits and above (vouchers + connectivity allowance + life insurance + major medical expenses insurance)
Ongoing training
Salary 100% on payroll
Remember that no recruiter may ask you for money in exchange for an interview or a position. Likewise, avoid making payments or sharing financial information with companies.