Data Engineer at Kloeckner Metals Corporation in Roswell, Georgia

Posted in Information Technology 12 days ago.

Type: Full-Time





Job Description:

Job Summary 


Kloeckner Metals Corporation is seeking a highly skilled and motivated Data Engineer to join our growing data team. The ideal candidate will possess a strong foundation in data warehousing concepts, including ETL/ELT processes, dimensional modeling, and semantic layer development. This role requires a hands-on approach to building and maintaining robust data pipelines and infrastructure, with a focus on automation and efficiency. You will be responsible for designing, developing, and deploying data solutions that empower our organization to make data-driven decisions.


Job Responsibilities   



  1. Data Pipeline Development


    • Design, build, and maintain efficient ETL/ELT pipelines to process large-scale structured and unstructured data.

    • Leverage Spark (PySpark/SQL) and Python for data transformation and processing.

    • Integrate data from diverse sources, including databases, APIs, and cloud platforms.

    • Utilize MS Fabric for data lake, data warehouse, and data pipeline development; a minimal pipeline sketch follows this list.
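
  For illustration, a minimal sketch of the kind of PySpark ETL step this role involves, assuming a Spark environment such as a Microsoft Fabric notebook; the paths, table names, and columns are hypothetical placeholders, not actual Kloeckner systems.

      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("orders_etl").getOrCreate()

      # Extract: load raw data from a hypothetical landing zone.
      raw = spark.read.option("header", True).csv("Files/landing/orders/")

      # Transform: cast types and apply a simple data quality filter.
      orders = (
          raw.withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
             .withColumn("amount", F.col("amount").cast("double"))
             .filter(F.col("order_id").isNotNull())
      )

      # Load: persist as a Delta table in the lakehouse (hypothetical name).
      orders.write.format("delta").mode("overwrite").saveAsTable("bronze_orders")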


  2. Data Modeling & Warehousing


    • Develop and optimize dimensional models, fact tables, and views to support analytics and reporting.

    • Design semantic models for BI tools (e.g., Power BI) to ensure accurate and accessible data for stakeholders.

    • Build and optimize data warehouses and data marts using dimensional modeling techniques (star schema, snowflake schema); a brief sketch follows this list.

    • Write complex and efficient SQL queries for data extraction, transformation, and loading.

    • Develop and implement data transformations and data quality checks using Python and Spark.

    • Design and implement data integration solutions for various data sources, including Oracle databases.
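
  For illustration, a minimal star-schema transformation sketch in Spark SQL, assuming bronze and dimension tables like those above already exist; all table and column names are invented for the example.

      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("sales_mart").getOrCreate()

      # Build a fact table keyed to conformed date and customer dimensions (star schema).
      fact_sales = spark.sql("""
          SELECT d.date_key,
                 c.customer_key,
                 o.order_id,
                 o.amount AS sales_amount
          FROM   bronze_orders o
          JOIN   dim_date     d ON o.order_date  = d.calendar_date
          JOIN   dim_customer c ON o.customer_id = c.customer_id
      """)

      fact_sales.write.format("delta").mode("overwrite").saveAsTable("fact_sales")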


  3. Automation & Integration


    • Automate repetitive data workflows using Power Automate and API integrations; an API ingestion sketch follows this list.

    • Build and maintain APIs to facilitate seamless data exchange between systems.

    • Implement monitoring and alerting systems for pipeline reliability.
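
  For illustration, a minimal sketch of API-based ingestion feeding a pipeline, using Python's requests library; the endpoint, authentication token, and payload shape are hypothetical.

      import requests
      from pyspark.sql import Row, SparkSession

      spark = SparkSession.builder.appName("api_ingest").getOrCreate()

      # Call a hypothetical source-system REST API with token authentication.
      resp = requests.get(
          "https://example-erp.invalid/api/v1/shipments",
          headers={"Authorization": "Bearer <token>"},
          timeout=30,
      )
      resp.raise_for_status()

      # Land the JSON payload as a table for downstream transformation.
      shipments = spark.createDataFrame([Row(**item) for item in resp.json()["items"]])
      shipments.write.format("delta").mode("append").saveAsTable("raw_shipments")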


  4. Collaboration & Support


    • Collaborate with data analysts and business stakeholders to understand data requirements and deliver effective solutions.

    • Support Power BI report development by ensuring clean, transformed data availability.


  5. Performance Optimization


    • Tune SQL queries, Spark jobs, and database configurations for speed and efficiency; see the sketch after this list.

    • Monitor and troubleshoot data pipeline performance and data quality issues.
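
  For illustration, a minimal sketch of two common Spark tuning levers (broadcast joins and output coalescing); table names are carried over from the earlier hypothetical examples.

      from pyspark.sql import SparkSession, functions as F

      spark = SparkSession.builder.appName("tuning_example").getOrCreate()

      fact = spark.table("fact_sales")
      dim = spark.table("dim_customer")

      # Broadcast the small dimension so the join avoids shuffling the large fact table.
      joined = fact.join(F.broadcast(dim), "customer_key")

      # Coalesce output partitions to reduce small-file overhead before writing.
      joined.coalesce(8).write.format("delta").mode("overwrite") \
            .saveAsTable("sales_by_customer")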


  6. Documentation & Best Practices


    • Document pipelines, data models, and processes.

    • Advocate for data governance, security, and quality standards.

    • Implement data security and access control measures.

Qualifications:



    • Bachelor's degree in Computer Science, Data Science, or a related field.

    • A minimum of 2-3 years of proven experience as a Data Engineer or in a similar role.

    • Strong understanding of ETL/ELT processes and data warehousing concepts.


Other Skills and Expertise:



    • Expertise in dimensional modeling (star schema, snowflake schema).

    • Advanced SQL skills, including query optimization and performance tuning.

    • Proficiency in Python and Spark for data processing and transformation.

    • Understanding of APIs and how to use them for data integration.

    • Experience with automating processes using Power Automate.

    • Strong problem-solving and analytical skills.

    • Excellent communication and collaboration skills.

    • Ability to work independently and as part of a team.  

    • Experience with version control systems (e.g., Git).

    • Experience with Microsoft Fabric and Power BI, including DAX and Power Query.

    • Familiarity with Oracle databases, including PL/SQL.

    • Experience in cloud-based data warehousing solutions (e.g., Azure Synapse Analytics).

    • Knowledge of data governance and data quality principles.

    • Experience working with cloud platforms (Azure, AWS, GCP) for data engineering solutions.

    • Familiarity with Lakehouse architectures and Delta Lake.

    • Knowledge of CI/CD for data pipelines using DevOps tools.


    Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities
    The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor’s legal duty to furnish information. 41 CFR 60-1.35(c)





