Job ID: 39368 (Not able to use 3rd party agencies)
Our client is seeking a Data Engineer to join their dynamic team in Beaverton, OR.
Data Engineer Responsibilities:
Design and implement features in collaboration with team engineers, product owners, data analysts, and business partners using Agile / Scrum methodology.
Contribute to overall architecture, frameworks and patterns for processing and storing large data volumes.
Design and implement distributed data processing pipelines using Spark, Hive, Sqoop, Python, and other tools and languages prevalent in the Hadoop ecosystem.
Build utilities, user defined functions, and frameworks to better enable data flow patterns.
Research, evaluate and utilize new technologies/tools/frameworks centered around high-volume data processing.
Define and apply appropriate data acquisition and consumption strategies for given technical scenarios.
Build and incorporate automated unit tests and participate in integration testing efforts.
Work with architecture/engineering leads and other teams to ensure quality solutions are implemented, and engineering best practices are defined and adhered to.
Work across teams to resolve operational and performance issues.
Data Engineer Qualifications:
MS/BS in Computer Science, or related technical discipline.
5+ years of industry experience, including 3+ years of relevant big data/relational database experience.
5+ years in Unix systems engineering, with experience in Red Hat Linux, CentOS, or Ubuntu.
Ability to troubleshoot production issues and perform on-call duties as needed.
Ability to architect, design, and implement solutions with Amazon Virtual Private Cloud (VPC), EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, Amazon Simple Storage Service (S3), EMR, and other AWS products.
2+ years of experience with Snowflake and Airflow, and strong programming experience in Python.
Extensive experience working with Hadoop and related processing frameworks such as Spark, Hive, Sqoop, etc.
Desire to work collaboratively with your teammates to come up with the best solution to a problem.
Demonstrated experience and ability to deliver results on multiple projects in a fast-paced, agile environment.
Excellent problem-solving and interpersonal communication skills.
Strong desire to learn and share knowledge with others.
Passionate about data and striving for excellence.
Experience with RDBMS systems, SQL, and SQL analytical functions.
Experience with workflow orchestration tools like Apache Airflow.
Experience with performance and scalability tuning.
Experience in Agile/Scrum application development using JIRA.
Experience working in a public cloud environment, particularly AWS.
Familiarity with practices like Continuous Development, Continuous Integration and Automated Testing.
Experience with Scala or Java.
Familiarity with infrastructure tools such as CloudFormation and CI automation tools such as Jenkins or CircleCI.
Benefits are available to eligible VanderHouwen contractors and include coverage for medical, dental, vision, life insurance, short and long term disability, and matching 401k.
VanderHouwen is an award-winning, Women-Owned, WBENC certified professional staffing firm. Founded in 1987, VanderHouwen has been successfully placing experienced professionals throughout the Pacific Northwest and nationwide. Our recruitment teams are highly specialized in either Technology and IT, Engineering, or Accounting and Finance career markets. Our recruiters value building meaningful, professional relationships with each candidate as well as developing honed knowledge of companies' staffing needs and workplaces. Partner with us to land your next exciting career.
VanderHouwen is an Equal Opportunity Employer and participates in E-Verify. VanderHouwen does not discriminate on the basis of race, color, religion, sex, national origin, age, disability, or any other characteristic protected by applicable local, state or federal civil rights laws.