Velocity Staff

  • ETL Data Architect

    Location US-KS-Overland Park
Posted Date 3 weeks ago (10/26/2018 12:55 PM)
    # Positions
    1
  • Overview

    Velocity Staff, Inc. is currently working with our client located in Overland Park, Kansas to identify an ETL Data Architect to join their team on a full-time, permanent basis. The ETL Data Architect will be responsible for assisting in the build of an enterprise data solution and maintaining a cloud-focused data platform for analytics and business intelligence.

     

    Responsibilities

• Design and maintain flexible data models along with the frameworks and processes that feed into them.
    • Design, build, and maintain forward-looking data models.
    • Design, build, and maintain framework for reliable job execution.
• Act as a liaison to the business, understanding data requirements and assisting our Business Intelligence team with technical solutions.
• Possess and apply strong ETL development capabilities, along with deep knowledge of ETL processes and data integration.

    Qualifications

    Required Qualifications:
• Degree in Computer Science, Data Science, or a related field, or equivalent practical experience.
• Expertise in extracting and manipulating data from a variety of sources, primarily cloud-based, and loading/indexing it into relational data structures - strong ETL development skills are a MUST.
• Proven experience with relational (Postgres, MS SQL, Redshift) and NoSQL (MongoDB) databases.
• Experience modeling and maintaining traditional relational database schemas for efficient data storage and retrieval.
    • Strong customer-facing communication and careful listening skills. Proven success in and genuine enthusiasm for working directly with customer technical teams.
    Desired Skills/Qualifications:
    • Experience with data processing and analytics using AWS Glue or Apache Spark.
    • Experience building data-lake style infrastructures using streaming data set technologies such as AWS Kinesis/Firehose or Apache Kafka.
    • Experience with data processing using Parquet and Avro.
• Development experience delivering analytics processes using Python or JavaScript.
• Experience with web UI development (HTML, CSS, JavaScript).
• Familiarity with Node.js, Express, Angular.
    • Experience with front-end BI tools such as Power BI using DAX and M Query.
• Knowledge of data warehouse terminology.
