Data Engineer - AI & ML Job at Theron Solutions, San Francisco, CA

  • Theron Solutions
  • San Francisco, CA

Job Description

Location: San Francisco, CA

Responsibilities:

1. Design and Build Data Pipelines:

•Develop, construct, test, and maintain data pipelines to extract, transform, and load (ETL) data from various sources into data warehouses or data lakes (a simplified sketch follows this list).

•Ensure data pipelines are efficient, scalable, and maintainable, enabling seamless data flow for downstream analysis and modeling.

•Work with stakeholders to identify data requirements and implement effective data processing solutions.
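
As an illustration only, a minimal Python sketch of the extract-transform-load pattern described above, using only the standard library; the file name, table, and columns are hypothetical, and SQLite stands in for a warehouse target.

    import csv
    import sqlite3

    def extract(path):
        # Read raw rows from a source file (a CSV standing in for any source system).
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Clean and type-cast fields before loading.
        return [(r["id"], r["email"].strip().lower(), float(r["amount"])) for r in rows]

    def load(records, db_path="warehouse.db"):
        # Write into a relational target (SQLite standing in for a warehouse table).
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, email TEXT, amount REAL)")
        con.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract("orders.csv")))

In practice the same pattern would usually run under an orchestrator rather than as a single script.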

2. Data Integration:

•Integrate data from multiple sources such as internal databases, external APIs, third-party vendors, and flat files.

•Collaborate with business teams to understand data needs and ensure data is structured properly for reporting and analytics.

•Build and optimize data ingestion systems to handle both real-time and batch data processing (a batch-ingestion sketch follows this list).
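
As an illustration only, a small Python sketch of batch ingestion from an external API and a vendor flat file, using only the standard library; the URL, response shape, and file format are hypothetical.

    import json
    import urllib.request

    def ingest_api_batch(api_url):
        # Pull one batch of records from an external JSON API (URL and payload shape are placeholders).
        with urllib.request.urlopen(api_url, timeout=30) as resp:
            payload = json.load(resp)
        return payload.get("results", [])

    def ingest_flat_file(path):
        # Read newline-delimited JSON exported by a third-party vendor.
        with open(path) as f:
            return [json.loads(line) for line in f if line.strip()]

Real-time ingestion of the same records would typically come from a stream (e.g., a Kafka topic) rather than a request/response call.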

3. Data Storage and Management:

•Design and manage data storage solutions (e.g., relational databases, NoSQL databases, data lakes, cloud storage) that support large-scale data processing.

•Implement best practices for data security, backup, and disaster recovery, ensuring that data is safe, recoverable, and complies with relevant regulations.

•Manage and optimize storage systems for scalability and cost efficiency.

4. Data Transformation:

•Develop data transformation logic to clean, enrich, and standardize raw data, ensuring it is suitable for analysis (a standardization sketch follows this list).

•Implement data transformation frameworks and tools, ensuring they work seamlessly across different data formats and sources.

•Ensure the accuracy and integrity of data as it is processed and stored.
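
As an illustration only, a small Python sketch of the kind of cleaning and standardization logic described above; the field names and date format are hypothetical.

    from datetime import datetime

    def standardize(record):
        # Trim, type-cast, and normalize one raw record into a consistent shape.
        return {
            "customer_id": record["customer_id"].strip(),
            "signup_date": datetime.strptime(record["signup_date"], "%m/%d/%Y").date().isoformat(),
            "country": (record.get("country") or "").strip().upper() or "UNKNOWN",
        }

    # Example: standardize({"customer_id": " C-42 ", "signup_date": "1/15/2024", "country": "us"})
    # returns {"customer_id": "C-42", "signup_date": "2024-01-15", "country": "US"}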

5. Automation and Optimization:

•Automate repetitive tasks such as data extraction, transformation, and loading to improve pipeline efficiency (an orchestration sketch follows this list).

•Optimize data processing workflows for performance, reducing processing time and resource consumption.

•Troubleshoot and resolve performance bottlenecks in data pipelines.
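
As an illustration only, a minimal orchestration sketch assuming Apache Airflow 2.4+ and its TaskFlow API (Airflow appears in the tooling listed under Requirements); the DAG name, schedule, and task bodies are placeholders.

    from datetime import datetime
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_orders_etl():
        @task
        def extract():
            return [{"id": "1", "amount": "10.50"}]  # placeholder for a real source pull

        @task
        def transform(rows):
            return [{**r, "amount": float(r["amount"])} for r in rows]

        @task
        def load(rows):
            print(f"loaded {len(rows)} rows")  # placeholder for a warehouse write

        load(transform(extract()))

    daily_orders_etl()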

6. Collaboration with Data Teams:

•Work closely with Data Scientists, Analysts, and business teams to understand data requirements and ensure the correct data is available and accessible.

•Assist Data Scientists with preparing datasets for model training and deployment.

•Provide technical expertise and support to ensure the integrity and consistency of data across all projects.

7. Data Quality Assurance:

•Implement data validation checks to ensure data accuracy, completeness, and consistency throughout the pipeline (a basic validation sketch follows this list).

•Develop and enforce data quality standards to detect and resolve data issues before they affect analysis or reporting.

•Monitor and improve data quality by identifying areas for improvement and implementing solutions.
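
As an illustration only, a basic Python validation check of the kind described above; the required fields are hypothetical, and a real pipeline would route the returned errors to a data quality report or alert.

    def validate(rows, required=("id", "email", "amount")):
        # Completeness and consistency checks run before rows move downstream.
        errors = []
        seen_ids = set()
        for i, row in enumerate(rows):
            missing = [field for field in required if not row.get(field)]
            if missing:
                errors.append(f"row {i}: missing {missing}")
            if row.get("id") in seen_ids:
                errors.append(f"row {i}: duplicate id {row.get('id')}")
            seen_ids.add(row.get("id"))
        return errors  # an empty list means the batch passed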

8. Monitoring and Maintenance:

•Set up monitoring and logging for data pipelines to detect and alert on issues such as failures, data mismatches, or delays (a logging sketch follows this list).

•Perform regular maintenance of data pipelines and storage systems to ensure optimal performance.

•Update and improve data systems as required, keeping up with evolving technology and business needs.
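
As an illustration only, a small Python sketch of per-step logging that an external alerting system could consume; the wrapper and logger name are hypothetical.

    import logging
    import time

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("pipeline")

    def run_step(name, fn, *args, **kwargs):
        # Time each pipeline step and log failures so monitoring/alerting can pick them up.
        start = time.monotonic()
        try:
            result = fn(*args, **kwargs)
            log.info("%s succeeded in %.1fs", name, time.monotonic() - start)
            return result
        except Exception:
            log.exception("%s failed", name)  # hook a pager or alert integration here
            raise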

9. Documentation and Reporting:

•Document data pipeline designs, ETL processes, data schemas, and transformation logic for transparency and future reference.

•Create reports on the performance and status of data pipelines, identifying areas of improvement or potential issues.

•Provide guidance to other teams regarding the usage and structure of data systems.

10. Stay Updated with Technology Trends:

•Continuously evaluate and adopt new tools, technologies, and best practices in data engineering and big data systems.

•Participate in industry conferences, webinars, and training to stay current with emerging trends in data engineering and cloud computing.

Requirements:

1. Educational Background:

Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field.

2. Technical Skills:

•Proficiency in programming languages such as Python, Java, or Scala for data processing.

•Strong knowledge of SQL and relational databases (e.g., MySQL, PostgreSQL, MS SQL Server).

•Experience with NoSQL databases (e.g., MongoDB, Cassandra, HBase).

•Familiarity with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake).

•Hands-on experience with ETL frameworks and tools (e.g., Apache NiFi, Talend, Informatica, Airflow).

•Knowledge of big data technologies (e.g., Hadoop, Apache Spark, Kafka).

•Experience with cloud platforms (AWS, Azure, Google Cloud) and related services for data storage and processing.

•Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes) for building scalable data systems.

•Knowledge of version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence).

•Understanding of data modeling concepts (e.g., star schema, snowflake schema) and how they relate to data warehousing and analytics.

•Knowledge of data lakes, data warehousing architecture, and how to design efficient and scalable storage solutions.

3. Soft Skills:

•Strong problem-solving skills with an ability to troubleshoot complex data issues.

•Excellent communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders.

•Strong attention to detail and a commitment to maintaining data accuracy and integrity.

•Ability to work effectively in a collaborative, team-based environment.

4. Experience:

•3+ years of experience in data engineering, including hands-on work building and maintaining data pipelines and systems.

•Proven track record of implementing data engineering solutions at scale, preferably in large or complex environments.

•Experience working with data governance, compliance, and security protocols.

5. Preferred Qualifications:

•Experience with machine learning and preparing data for AI/ML model training.

•Familiarity with stream processing frameworks (e.g., Apache Kafka, Apache Flink).

•Certification in cloud platforms (e.g., AWS Certified Big Data – Specialty, Google Cloud Professional Data Engineer).

•Experience with DevOps practices and CI/CD pipelines for data systems.

•Experience with automation and orchestration tools (e.g., Apache Airflow, Luigi).

•Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) to support analytics teams.

6. Work Environment:

•Collaborative and fast-paced work environment.

•Opportunity to work with state-of-the-art technologies.

•Supportive and dynamic team culture.

EOE: Our client is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind. We are committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. All employment decisions at our client are based on business needs, job requirements, and individual qualifications, without regard to race, color, religion or belief, national, social or ethnic origin, sex (including pregnancy), age, physical, mental or sensory disability, HIV status, sexual orientation, gender identity and/or expression, marital, civil union or domestic partnership status, past or present military service, family medical history or genetic information, family or parental status, or any other status protected by the laws or regulations in the locations where we operate. We will not tolerate discrimination or harassment based on any of these characteristics.
