Sr. Data Engineer at C-Vision Inc
Warren, MI 48091
About the Job
Data Engineer
Work Location: Warren, MI - Hybrid Role
Contract W2 opportunity
Visa-independent consultants only
Required Skills:
Expert in multiple tools and technologies including Azure and Databricks
Scala, Spark, Python, SQL, .NET, REST API, Angular
SQL Server, PostgreSQL, SQL Server Integration Services (SSIS), Complex ETL with massive data, PowerBI
Job Description:
As a Data Engineer, you will build industrialized data assets and data pipelines in support of Business Intelligence and Advanced Analytics objectives. The Data Engineer leads and delivers new and innovative data-driven solutions that are elegant and professional. You will work closely with our forward-thinking Data Scientists, BI developers, System Architects, and Data Architects to deliver value to our vision for the future. Our team focuses on writing maintainable tests and code that meet the customer's needs and scale without rework. Our engineers and architects work in highly collaborative environments across many disciplines (user experience, database, streaming technology, custom rules engines, AI/ML, and most web technologies). We work on innovative technologies – understanding and inventing modern designs and integration patterns along the way. Our Data Engineers will:
• Assemble large, complex data sets that meet functional and non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Lead and deliver exceptional data driven solutions across many different languages, tools, and technology.
• Develop a culture which takes challenging and complex ideas and turns them into production solutions.
• Have a broad, enterprise-wide view of the business and an appreciation for strategy, process, capabilities, enablers, and governance.
• Be able to critically apply themselves to solving problems in an interconnected and integrated environment.
• Think strategically about adopting innovative technologies that are beyond the horizon and deliver on sound solution design.
• Create high-level models that can be leveraged in future analysis to extend and mature the business architecture.
• Continuously raise our standards of engineering excellence in quality and efficiency, and develop repeatable designs.
• Perform hands-on development, lead code reviews and testing, create automation tools, and build proofs of concept.
• Work alongside the operations team during any production issues related to the platform.
• Apply best practices such as agile methodologies, design thinking, and continuous deployment that allow you to innovate fast.
• Deliver solutions across Big Data applications to support business strategies and deliver business value.
• Build tools and automation to make deployment and monitoring of the production environment more repeatable.
• Build strong relationships with Business & Technology Partners and provide leadership, direction, best practices, and coaching to technology development teams.
• Own all technical aspects of development for assigned applications.
• Drive continuous improvement in applications through consistent development practices and tools, and through ongoing design and code refactoring.
• Collaborate with stakeholders through ongoing product and platform releases to solve existing needs, identify exciting opportunities, and predict future challenges.
• Work closely with product managers and architects to prioritize and manage features, technical requirements, known defects, and issues.
• Manage and appropriately escalate delivery impediments, risks, issues, and changes tied to product development initiatives.
• Data Integration – Design and develop new source system integrations from a variety of formats including files, database extracts and APIs.
• Data Pipelines – Design and develop highly scalable Data Pipelines that incorporate complex transformations and efficient code.
• Data Delivery – Design and develop solutions for delivering data that meet SLAs.
Story Behind the Need – Business Group & Key Projects
Business group
Surrounding team & key projects
Purpose of this team
Motivators for this need (i.e.-special projects/general workload support/etc.)
Business Group – responsible for extracting data from our manufacturing plant floor applications. The data is provided for business partners to use in an operations-supporting manner, such as operational and analytical uses and creating analytical reports. The team does not do the actual application development; it sources the data and makes it available in databases. They are looking to move to the cloud this year, with work centered on migrating current data pipelines to the cloud.
These positions will be very focused on the V-live area: data that supports going out and reflashing vehicles in the yards with new software, and extracting data out of the applications at the plant. New requirements are coming in from the business partners.
Reflash vehicles – updating vehicle software (the contractor will not actually be doing the reflashing; they will provide the supporting data needed to complete this task)
Motivator for this need – demand for work exceeds the current headcount. The work is critical and the team needs help completing it.
Typical Day in the Role
Typical task breakdown and rhythm
Interaction level with team
Work environment description
Chance for flex hours or remote
Chance for extension later
Confirm if group participates in GM Contract Worker Shutdown days
Daily Breakdown – every day is mostly the same: come in and start daily responsibilities. The team holds stand-ups to discuss the work at the top of the sprint, where each person is at, any blockers, and whether leadership is needed to help expedite work. The work is very independent, but other team members will be working on the same data, and there is some interaction with business partners to understand what is needed. Some ad hoc meetings may come up that must be attended. The scope of each sprint is defined before the sprint begins; there is a large backlog to fulfill, and priorities sometimes change, so an engineer may have to switch from their current work to something more critical.
Scope is defined in 3-week sprints – what is assigned should be delivered in that timeframe.
Interaction level with team – a team of 9 data engineers and 2 data engineer testers; 4 are in Michigan and the rest are in other states. Interaction is mostly with the people in Michigan.
Work environment – open-cubicle environment in the Tech building. The manager is not local to Michigan; she is in Austin.
Compelling Story & Candidate Value Proposition
What makes this role interesting?
Competitive market comparison
Unique selling points
Value added or experience gained
They are looking for someone who is passionate about data and data quality; someone with a high interest in this will be a great fit. The team handles critical data that gets cars delivered to customers and affects GM's bottom line, so a passion for data and for getting products to customers matters.
Candidate Requirements
Degrees or certifications required
Years of experience required
Technologies/depth of technologies required
Any preferred schools/companies
Degree Requirement – Bachelor's degree in the data space or a related field (computer science, information technology, and mathematics are good fields)
Years of experience – 5-7 years' experience building ETL
(ETL – extract, transform, and load)
ETL – code, written with a variety of technologies, that connects to a source database, extracts the tables and fields, performs transformation and cleaning, then connects to another database and loads the result into it
Working with Azure cloud technologies
Understanding of databases, database structures, data models, and data architecture
Expert in multiple tools and technologies including Azure and Databricks
Scala, Spark, Python, SQL, .NET, REST API, Angular
SQL Server, PostgreSQL, SQL Server Integration Services (SSIS), Complex ETL with massive data, PowerBI
Rekha
C-Vision Inc.
575E Big Beaver Rd, Ste 190 | Troy, MI 48083
Email: rekha@c-visionit.com
Web: http://www.cvision.io/