Salesforce Data Engineer LATAM

Posted 26 January 2026
Salary $25/h
Location Meridian
Job type Contract
Discipline Enterprise Applications (SAP/Salesforce/MS Dynamics)
Reference 76012
Remote working Remote

Job description

We are hiring two Salesforce Data Engineers for one of our consulting clients in the Financial Services industry.

This is a remote contract and can be based in any nearshore/LATAM location. The customer is based in the US, and working hours will be EST.

The contract is 40 hours/week, starting at the beginning of March, for at least 2 months with potential for extension, paid at an hourly rate of $25-30/h.

Professional English proficiency is required.

This person will build and maintain the systems that power AI, analytics, and data-driven decision making. The role focuses on creating and orchestrating efficient data pipelines, organizing data for scale, and ensuring data is clean, secure, and ready for use across the business. The work supports BI, Operations, System Integrations, and AI practices by ensuring high-quality data is consistently available.

Primary Responsibilities

  • Design, build, and maintain data pipelines that support efficient collection, ingestion, storage, and processing
  • Implement modern data architectures such as data lakes, data warehouses, lakehouses, and data mesh platforms
  • Develop streaming data flows for near-real-time and low-latency use cases
  • Clean and prepare data to support analytics, reporting, and AI model readiness
  • Improve performance and reliability across data systems
  • Apply data governance and security best practices to safeguard customer information
  • Partner with technical and business teams to understand requirements and deliver effective solutions
  • Identify opportunities to streamline operations and reduce cost through smarter data design
  • Monitor and resolve issues to maintain dependable, resilient data operations

Required Qualifications

  • Experience building and maintaining data pipelines and scalable ETL/ELT frameworks
  • Experience in Salesforce projects handling data migrations and integrations within the platform
  • Strong foundation in relational and non-relational data systems
  • Strong data modeling skills
  • Working knowledge of data lake, data warehouse, and lakehouse patterns
  • Hands-on experience with both batch and streaming data pipelines
  • Proficiency in SQL, Python, and modern data engineering tools and libraries such as Pandas
  • Ability to design structured, scalable solutions for analytics and AI preparation
  • Familiarity with cloud platforms and distributed processing frameworks
  • Clear, concise communication skills
  • Experience with Databricks, Snowflake, Microsoft Synapse, Fabric, AWS Glue, DMS, or similar data platforms and technologies
  • Experience with Open Data platforms and tools, such as Apache Spark, Airflow, Delta Lake, or Iceberg
  • Background supporting data migrations, API integrations, and machine learning or AI data requirements
  • Understanding of data governance, lineage, and secure data practices
  • Exposure to a data product mindset and domain-oriented or data mesh approaches

Reach out to learn more!